You appear to be claiming that climate change is not an existential risk?
No, quite the opposite. I'm saying that, since you've said nothing specific about AI x-risk (or whatever it is you're concerned about), the post's arguments apply just as well to climate change. That's a sign something is off: climate change is obviously a thing we should be worried about, so if your reasoning would dismiss it too, the reasoning proves too much and you should be more specific. Which brings me to my next point:
For something to be a boogeyman, it has to be an imagined danger that captures the fears of a lot of otherwise rational people.
What makes climate change a reasonable concern where AI x-risk isn't?
AI existential risk is a really unusual candidate for this generation's boogeyman. Unless you're a specific kind of nerdy techy person, the idea isn't emotionally resonant in the way that, say, your children being abused in a Satanic ritual is for most people. Perhaps a better candidate would be transgender people? A lot of people are freaking out over the idea of trans people abusing their children, in much the same way people did with Satanic cults.
Also, for all the arguments you've actually made, this post applies just as readily to global warming/climate change.
What do you mean by "not recommend"? Not having it as a cause area is very different from not thinking it would be good.