EA is compelling insofar as it is about genuinely making the world a better place, i.e. we care about the actual consequences. Just because there are probably no specific people or processes to blame doesn't mean we should be satisfied with how things are.
There is now decent evidence that EA might cause considerable harm in the world, so we should be strongly motivated to figure out how to change that. Maybe EA's failures are just the cost of ambition and agency, and come along with the good it does, but I think that's both untrue and worryingly defeatist.
I care about the end result of all of this, and the fact that we're okay with some serious losses happening (and not being willing to fix the root cause of those errors) is concerning.
I think that's why it's informative. If EA radically changes in response to the FTX crisis, then it could easily put itself in a worse position (leading to more negative consequences in the world).
The intrinsic problem appears to be the quality of governance, rather than any one systematic error or blind spot.
My take is:
To be clearer, I am bringing up the OpenAI drama because it is instructive for highlighting what is and is not going wrong more generally. I don't think the specifics of what went wrong with FTX point at the central thing that's of concern. I think the key factor behind EA's past and future failures comes down to poor-quality decision-making among those with the most influence, rather than the degree to which everybody is sensitive to someone's shadiness.
(I'm assuming we agree FTX and the OpenAI drama were both failures, and that failures can happen even among groups of competent, moral people who act according to the expectations set for them.)
I don't know what the cause of the poor decision-making is. Social norms preventing people from expressing disagreement, org structures, unclear responsibilities, conflicts of interest, lack of communication, low intellectual diversity: it could be one of these, a combination, or something totally different. I think it should be figured out and resolved, though, if we are trying to change the world.
So, if there is an investigation, it should be part of an effort to make sure EAs in positions of power will consistently handle difficult situations incredibly well (as opposed to just satisfying people's need for more specific explanations of what went wrong with FTX).
There are many ways in which EA can create or destroy value, and our eagerness to 'do something' in response to people being shady is a weirdly narrow metric by which to assess the movement.
EDIT: would really appreciate someone saying what they disagree with