What about a scenario where long-termism turns out to be right, but some sort of community-level value drift leads to long-term cause areas becoming neglected, perhaps because the community grows too quickly or certain intra-community interest groups become too powerful? I wouldn't say this is very likely (maybe 5%), but we should consider the base rate of this type of thing happening.
I realise that this outcome might be subsumed in the points raised above. Specifically, it might be that instead of trying to intervene directly in the long-term future, EA should have invested in sustainably growing the community with the intention of avoiding value drift (option 2). I am just wondering how granular we can get with this pre-mortem before it becomes unhelpfully complex.
From a strategic point of view, this pre-mortem is a great idea.