I agree that a wisdom explosion would have been a much better path for humanity. But given the competitive pressures driving AGI today, do you think there was ever a realistic scenario where that path would have been chosen?
If capitalism and geopolitics inherently reward intelligence maximization over wisdom, wouldn’t that have always pushed us toward an intelligence explosion, no matter what people hoped for?
In other words, was a wisdom-first approach ever actually viable, or was it an idealistic vision doomed from the start?
I believe you're psychologically sidestepping the argument; I discuss reactions like this in my latest essay, if you'd like to take a look.
Thanks again for your thoughts. You're right that we haven't empirically tested a wisdom-first approach. My core argument, however, is that capitalism and geopolitics inherently favor rapid intelligence gains over wisdom. Given these systemic incentives, even steady progress on wisdom would inevitably lag behind more aggressive intelligence-focused strategies.
The core of my essay is the near-inevitable extinction of humanity at the hands of AGI, an argument that literally no one has been able to engage with. Your focus on hypothetical alternatives rather than confronting this systemic reality illustrates the very psychological sidestepping I discuss in my recent essay. If you have time, I encourage you to take a look.