This is an open call for CEA to be more transparent with its finances and its allocation of resources to different projects, both historically and currently.
A quick Google search shows pretty inconsistent reporting metrics and update cadences over the past several years, as well as reporting gaps.
There is no easily available breakdown of funding or budgeting for most years.
It seems like CEA staff do share numbers when asked ad hoc - e.g. see this comment from JP Addison on the Online team's spending. But if someone wants a quick overview, it would be incredibly time-consuming to compile all the numbers.
So it's not that they are entirely opaque, but that they aren't making this information easy to access, which feels bad - like obfuscation. And I think they are succeeding: I don't think many people have the time or inclination to sift through the data.
As a key entity in the EA ecosystem (even if its scope changes), it seems good for CEA to demonstrate high transparency and data accessibility, even if its decisions are not endorsed by the average community member.
Spicier take: I think they aren't sharing it, in large part, because of optics. This feels like a bad reason not to be transparent. For example, I think right now, given FTX and reduced funding, the Events team in particular is hesitant to share how much they put into EAG(x)s in 2022. Their post updating on spending didn't mention specific numbers. My understanding is that most people who requested it were given full travel & lodging subsidies (maybe up to $1,500 per person, for thousands of individuals). This could be several million dollars in funding in 2022 alone.
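To make "several million" concrete, here is a minimal sketch of the implied scale. Both inputs are the speculative figures from the paragraph above (the $1,500 cap and a ~2,000-person headcount are assumptions, not reported numbers):

```python
# Rough scale check on the 2022 travel & lodging subsidy claim.
# Both inputs are speculative figures from the paragraph above, not official numbers.
max_subsidy_per_person = 1_500  # assumed upper bound, in USD
subsidised_attendees = 2_000    # "thousands of individuals"; assume ~2,000

total_subsidy = max_subsidy_per_person * subsidised_attendees
print(f"Implied subsidy spend: ${total_subsidy:,}")  # -> $3,000,000
```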
Ollie here from the CEA events team, thanks for this nudge. We're planning on sharing an update with respect to our costs here later this year. You can also see my recent sequence about the costs of EAGx and how we prioritise among events (this doesn't cover EA Global though).
Tickets are often around 10-20% of the cost to host each person (our funders are covering the rest).
To follow up here, Eli recently published this post outlining recent costs and what we plan on doing to bring them down.

I was excited to see this post - appreciate the events team sharing this.

https://twitter.com/Ollie_Base/status/1695084951807349095
Assuming $100 per ticket and 600 attendees, that would be $300,000 - $600,000 per EAG(x), possibly excluding travel & lodging subsidies.
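As a rough sketch, here is the arithmetic behind that range (the ticket price and attendance are the assumptions stated above; the 10-20% coverage figure comes from the comment upthread):

```python
# Back-of-the-envelope cost per event, using figures quoted in this thread
# (assumed ticket price and attendance; not official numbers).
ticket_price = 100   # assumed USD per ticket
attendees = 600      # assumed attendees per EAG(x)
low_share, high_share = 0.10, 0.20  # tickets cover ~10-20% of total cost

ticket_revenue = ticket_price * attendees                   # $60,000
cost_if_tickets_cover_20_pct = ticket_revenue / high_share  # $300,000
cost_if_tickets_cover_10_pct = ticket_revenue / low_share   # $600,000
print(f"Implied cost per event: "
      f"${cost_if_tickets_cover_20_pct:,.0f} - ${cost_if_tickets_cover_10_pct:,.0f}")
```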
But I do not understand why you think these numbers are important. What would you do differently if the cost were $60,000 per event, or $10,000,000 per event?
ETA: I was very wrong: EAGx events do typically cost $150–500k, but the CEA-run EAGs cost $2-3M.

Thanks for copying this across!

Yep, your estimate was right for EAGxNYC (~$500k), but that was much cheaper than EA Global.
Some people seem hopeful that the new CEA CEO will lead to important changes and improvements in CEA. I’m pessimistic. The CEO role is powerful, but it faces a lot of constraints from trustees, funders, past decisions, and organizational setup.
While CEA’s finances are difficult to track down, it’s safe to assume a large chunk of their funding comes from OP, and specifically the OP LT team (an estimated $30 million in general support from March 2022 to January 2023). The senior OP LT grantmaker who leads OP’s work on the EA community is one of three people on the search team. The search team has signaled openness to radical changes to CEA’s vision from CEO candidates, but it also thinks CEA’s work has historically been “highly effective” and is open to candidates who simply build on CEA’s existing work.
It’s likely the Center for Effective Altruism will continue to act as a talent recruitment arm for OP’s EA LT team’s priorities and be unenthusiastic about transparency and accountability to the broader EA community in its programs.
On the bright side, they may change their name to something more fitting, which would help publicize a vacuum that others can fill - see prior discussion. This would help alleviate some of the historical problems of CEA wanting general community authority without responsibility towards the EA community.
Note that this won’t wholly fix the problem. In the past, when CEA or 80K have tried to be clearer about the scope of their work and left clearer gaps for others, those gaps sometimes haven’t been filled. Even when someone attempts to fill them, new parties may lack the trust, skills, and networks to excel at filling the gap, because they are starting from scratch in terms of institutional knowledge.
The most likely path to fundamental improvements at CEA might be for someone very skilled, but not deeply embedded within the EA community, to become CEO. But I’m pessimistic that this will happen.
While I don’t doubt the search team’s desire to have a fair process that includes candidates from outside EA, why would skilled outsiders want to head CEA when they could have many better, less thankless (and probably more impactful) jobs?
An outside CEO will likely need to be a very impressive operator and communicator, in addition to being suitably aligned with the search committee and key funders and decision makers. Making fundamental improvements to CEA will involve a lot of cleaning up other people’s messes and making the best of bad situations, for relatively little reward in the near-to-medium term. The dependence of funding on one megafunder with a niche vision, and the complex web of relationships involved, is not an easy problem to navigate for anyone, let alone someone seeking fundamental changes. And as an outsider, it will take time and effort simply to understand the situation and what can and can’t be effectively changed. Executing on real, substantive changes may inevitably involve some amount of conflict with powerful figures in EA. What would attract people capable of executing such tough tasks to the CEA CEO role?
Moreover, why would the search committee take the risk of recommending an outsider to EA when they seem content with the status quo, and when hiring an outsider could easily be seen as a predictable mistake by key EA decision makers if it runs into problems? I think an outsider suggesting radical changes to CEA, especially changes that might (intentionally or unintentionally) pull back the curtain on previous bad decisions, would have to be significantly more skilled and impressive than the best insider to be hired.
I think the best option might be a paired leadership role, with one person more embedded in EA and the other less. But this seems unlikely: it can be difficult to construct and maintain co-leadership roles. Perhaps it could work if there were an agreement for an insider to guide the outsider for some period before transitioning to the outsider as sole leader, but this would require a level of commitment to the outsider that seems unlikely to be achieved, even if deserved.
To summarize, I expect some minor improvements and changes but fundamental problems to remain with CEA, except for a possible name (and clear scope) change.
All that being said, I still recommend people apply for the role if they’re interested. Just be careful what you’re getting yourself into.
(Note: Multiple people contributed to this note.)
Thanks for writing this! Some quick thoughts on possibilities for CEA to consider:
Moving to a Membership Model: I think Open Phil's status as the main customer of CEA (raised above) is a problem, and a move to CEA as a membership organization (with a board elected by the membership) could help with this. Membership could be open to anyone who provides evidence of giving >5% of their money to charity (maybe excluding giving to religious groups) and chooses to register as a member. (You could also create some sort of application process for people outside the 5% donors - that number just seems to be a useful commitment mechanism.)
Rotating Annual Presidents: One way to get broader buy-in and legitimacy would be to do what professional societies do and have the public face of the organization (the president) rotate each year (or on some regular basis) and then have an executive director who manages the organization's operations. This could also help organize how CEA's board should function (since often professional societies structure their board around the transition from past to future presidents, where the board is made up of next year's president, the current president, the past year's president, and a few other potential candidates for the next year's president).
Dissociate from FTX: It would probably be good for people who worked at FTX/FTX Foundation to leave the EV/CEA board prior to the Sam Bankman-Fried trial.
Another direction for CEA that would interest me would be to search for, evaluate, and highlight historical or current effective altruist projects in the world (i.e. things that are plausibly altruistic and come from outside the "effective altruist" community, but are likely to fall within 1/10th of the GiveWell bar).
I'll flag that I think EA should move towards a much more decentralized community/community-building apparatus (e.g. split up EV into separate nonprofits that may contract with the same entity for certain back-office functions). I also think EA community building should be cause-neutral and individual-centric rather than community- or cause-centric (i.e. support people who want to be effectively altruistic in their attempt to live a meaningful life, rather than drive energy towards effective causes). I think the attempt to be utilitarian all the way down and use the community-building arm to drive towards the most effective goals creates harmful epistemic and political dynamics - a more neutral and member-empowering approach would be better.
I have a lot of respect for many individuals at CEA and OP. At the same time, I'd definitely agree that things haven't been as good as many would like.
I believe that OP funds the large majority of CEA. They've been doing so for a while, and now control multiple board seats at CEA.
Some quick thoughts:
- If OP is by far the largest donor to CEA, then they're effectively the main customer. If other EA groups donated more, or would discuss donations with them, I'd expect that other interests would correspondingly be better represented.
- I think a lot of senior talent is now more excited about the AI safety space (and others) than about the "core EA infrastructure" space, especially because the latter has so few donors (almost solely OP, it feels).
- I'd guess that the people previously in charge of things here are not going to make highly radical reforms. It's roughly the same people as were doing things before, ultimately.
- I haven't yet come across many funders or founders other than CEA/OP who seem excited to spend many resources reforming EA.
- I think the people working on these things have been, and will continue to be, working fairly long hours, and are generally trying to do good things, for what that's worth.
- CEA has previously stated that there's a whole lot that they explicitly won't do. I think people have high expectations/wants of them that would have been really hard for them to provide. (It's really tough to do huge things well; being targeted is often essential.) Given the whole FTX situation, we should expect things to be harder, not easier, for a while.
- A lot of CEA is made up of specific teams (community health, online, events). I think these teams have done useful work and would naively expect them to continue such work.
- I also recommend people apply to these roles. I think there's great work to be done here; I'm just not expecting lots of growth to come quickly.
It can be difficult to construct and maintain co-leadership roles.
This might generally be true, but some of the more prominent EA organisations have successfully pulled this off, with Rethink Priorities having both Peter and Marcus as Co-CEOs, and Open Philanthropy with its temporary Co-CEO split.
Counterpoints:
- Limited data available (Co-CEOs are still in the minority; there are few successful case studies of Co-CEO partnerships that lasted decades, not just years)
- RP splits their portfolio and so does OP, so a split in executive leadership seems reasonable
- I'm unsure what such a split might look like for CEA
As an addendum, it's also non-trivial to find the exact appointment & departure dates of all the EVF UK & US board trustees over the history of the organization. The Wayback Machine is somewhat spotty, and financial records for EVF UK are hard to find.
It would be good to have a history of the leadership & trustees of both entities somewhere publicly available.
Seems right to me, +1

I'm disappointed in the mod team's recent actions regarding this post on Yudkowsky.
1. As far as I know [edited], there hasn't been a previous rule around strong language not resulting in a front-page ban, since the language was toned down in response to feedback and the comments were civil and productive. (Note that the author still hasn't changed the title, which a number of people have commented on, but this doesn't seem like it would have changed the mod decision.)
2. I also agree with Linch's point that correctness hasn't been a distinction point either.
3. Several people left responses and criticisms of the decision; all were popular with readers, but the mod team has not (yet) replied to or substantively engaged with those comments (which seem reasonable).
Edits / notes since posting:
- I edited 1) and added point 2) after Linch's clarification comment.
- Note that Lizka responded to the post a few hours after this shortform went up (thanks for flagging, Lorenzo).
It is really hard to be highly critical of a person's behavior without coming off as uncharitable or mean. If you see someone regularly make shockingly wrong statements, and then double down on them when confronted, are you allowed to describe them as "egregiously overconfident"? "Shockingly overconfident"? "Regularly overconfident to a large degree"? Which of these are uncharitable, and which are just an accurate statement of your beliefs about a person's behavior?
This is why people really need to be cut some slack when engaging in good-faith criticism of public figures. Ignoring all criticism that is not presented in a sufficiently nice tone is not far off from just ignoring all criticism.
I just think this is a "law of opposite advice" situation! You're right, but the point is that EAs are already trying so so hard to correct in this direction that it's a little silly sometimes (certainly on the forum). The hugboxing frame makes a lot of sense to me.
I think that this is an iterated game. I'd encourage people to have their own blogs, with their own RSS feeds and comment sections (as on, e.g., Substack), and then occasionally crosspost to the EA Forum. Then EA Forum censoriousness doesn't matter as much.
Is the game something like "EA online discussion norms" and the strategy that you are proposing something like "make your writings independent of the EA forum, and allow for competing discussion spaces on your posts"?
Is the game something like "EA online discussion norms"
Mmh, probably, but I was thinking about it less abstractly, e.g., CEA is offering the EA community a forum with such and such characteristics, the EA community responds by xyz, etc.
make your writings independent of the EA forum, and allow for competing discussion spaces on your posts
Yes, though not "competing", where either option would be fine, but "separate", such that the EA Forum doesn't have to play a role at all. For example, this blogpost of mine: https://nunosempere.com/blog/2023/07/19/better-harder-faster-stronger/ has a rich comment section, and it didn't need the EA Forum to have it.
FWIW I think it would likely be hard for most people (especially those without a strong internet presence or who don't write regularly) to have a rich comments section on other platforms, but I could be underestimating the difficulty of getting blog readers.
Yep. At the same time, it becomes easier over the course of a career. And the easier option is to post something on Substack and crosspost it to the forum, which gets a bunch of the benefits.
Lizka just replied here.

To be clear, that was not my exact claim. My claim was that correctness has not historically been a frontpage/personal post distinction.

Thanks for clarifying, Linch; removing the reference to your comment since it's making a different claim. (Out of curiosity, do you agree with that statement as it stands?)

Tbh I'm confused about the double negative, and I'm not entirely sure what the exact statement I might be agreeing with is.
The point is that it seemed like the post was banned from the front page because of strong language, and this doesn't seem to have been a rule that was enforced in the past.
The EA Forum needs a broad "obvious bad actor" ban, since EA is the kind of group where an unusually large proportion of people are sufficiently capable of (and motivated for) thinking outside the box to work around or bend rules in order to achieve a goal.
The EA Forum currently has guidelines on this, but no clear policy:
It's possible that strategic ambiguity might be necessary, since such a large number of people are willing and able to spend a ton of time thinking of ways to work around the rules in pursuit of an objective. If the rules aren't clear, then people can't form complex plots to find loopholes or historically unprecedented approaches.
Nice rendering! It's very pretty.
One problem with scandals is that no real solutions or steps are taken in response to them. When stuff happens, there is a lot of discussion, but little changes structurally or institutionally.
I think the community was not meant to scale. OP and EVF (an affiliated organization) promoted EA growth in the past so they could get more people working on their priority problems, but they never put in place mechanisms to govern a larger community.
Essentially, I think to some extent these organizations see the EA community as a means to an end, and don't want to take responsibility for the messiness that comes with building a community.
I would like to see the EA community set up ways to govern itself, rather than depend on a few actors (the OP/EVF ecosystem) whose incentives don't always line up with what is best for the community.
You might enjoy this: https://forum.effectivealtruism.org/posts/vaqoGFRdi6ftvwGkn/what-is-effective-altruism-how-could-it-be-improved