
Charity Entrepreneurship (CE) is researching EA meta as one of four cause areas in which we plan to launch charities in 2021. EA meta has always been an area we are excited about and think holds promise (after all, CE is itself a meta charity). Historically we have not focused on meta areas for a few reasons, one of the most important being that we wanted to confirm the impact of the CE model in more measurable areas such as global health and animal advocacy. After two formal programs, we are now sufficiently confident in the model to expand to more meta charities. We were also impressed by the progress and traction that Animal Advocacy Careers made in its first year. Founded based on our research into animal interventions, this organization works at a more meta level than the other charities we have incubated.

In this document, I summarize the results of 40 interviews conducted with EA experts. These interviews constitute part of CE’s research into potentially effective new charities that could be founded in 2021 to improve the EA movement as a whole.

Methodology

In discussing meta charities, we use a fairly broad definition. We include both charities that are one step removed from impact but focused on a single cause area (such as Animal Advocacy Careers), and more cross-cutting meta charities within the effective altruist movement.

Generally our first step when approaching an area would involve broad research and literature reviews. However, given the more limited resources focused on meta EA and our stronger baseline understanding of the area, we wanted to supplement this information with a broad range of interviews. We ended up speaking to about 40 effective altruists (EAs) across 16 different organizations and 8 current or former chapter leaders. We tried to pick people who had spent considerable time thinking about meta issues and could be considered EA experts, and overall aimed for a diverse range of perspectives. 

The duration of the interviews ranged from 30 minutes to 2 hours, running about an hour on average. Not everyone answered every question, but the majority of questions were answered by the majority of people: the average question got ~35 responses and none got fewer than 30. Interviewees were informed that an EA Forum post would be written containing the aggregated data but not individual responses. The background notes ended up being ~100 pages.

We broke down the questions asked into three sections:

  1. Open questions
    • What meta ideas might be uniquely impactful?
    • What ideas might be uniquely unimpactful?
  2. Crucial considerations
    • Expand vs improve
    • Time vs money vs information
    • Broad vs narrow
    • What do you think of the current EA community trajectory?
    • What do you think are the biggest flaws of the EA movement?
  3. Specific sub areas
    • We took ideas that people had historically suggested on the EA Forum and organized them into around a dozen categories, providing examples for each.
    • For each category, we were interested in whether it was seen as above or below average, as well as if any specific ideas stood out as promising.

The descriptions below aim to reflect the aggregate responses I got, not what CE thinks or my impression after speaking to everyone (that will be a different post). The results constitute one (but not the only) piece of data CE will use when coming to recommendations for a new organization. 

Results

1. Open questions

This was the hardest area to synthesize. It was surprising how much divergence there was between different people in terms of which ideas and concepts were seen as the most important.

Lots of ideas that came up in the open questions were covered in the category areas, but the open questions also surfaced original ideas that I subsequently added to the categories. One of the most interesting results was a set of concepts that came up a lot but fell outside of our categories and crucial considerations. Concepts that are not described more deeply in other sections were:

  • Support for chapters
  • Consistency and clarity in the scope of meta orgs
  • Ideas that other orgs plan to pursue in the future

Chapter support: Quite a few chapters mentioned that they could share resources and coordinate much better. It seemed like chapters can lose momentum when their leadership turns over or when the structure of support on offer changes dramatically. Quite a few chapter leaders also felt uncertain about what the chapter landscape would look like long term, or what career paths would follow from working in chapters.

Consistency and clarity in the scope of meta orgs: A lot of people mentioned that others overestimate the scope of what various organizations cover, and that posts explaining current and future scoping plans are super helpful. Many people said that new career orgs were very positively affected by 80,000 Hours clarifying and describing its scope on the EA Forum, and that more posts like this could further help meta charities get started.

Covered ideas: Lots of people and organizations also mentioned ideas about ground they themselves plan to cover. In some cases, then, ideas are not mentioned in the sections below because it seems they could be covered well by existing actors in the near future and are thus less relevant to the potential of new organizations.

2. Crucial considerations

Expand vs improve: The first crucial consideration I asked most people about was expanding the EA movement vs improving the people already involved. I also sometimes described this as an internal focus vs an external focus. 35% of interviewees thought it was better to generally focus on expanding, 42% to focus on improving, and 23% were unsure whether one was better than the other. 

  • A lot of the expansion-focused people highlighted that EA is still very small, and that relatively low-cost outreach investments have succeeded in getting lots of people highly involved and doing impactful things.
  • Improvement-focused individuals generally highlighted major weaknesses of the EA community that ideally would be improved before a large number of new people enter, or the risk of the EA community becoming more diluted with lots of new people entering it. A particular concern that came up was that there is already a lot of frustration from job seekers and individuals not having a sense of how they can have an impact outside of a pretty small number of jobs and opportunities, with expansion likely to aggravate this problem.
  • The unsure people generally thought there are some good opportunities in both or that they are more connected (e.g. improving the community would naturally lead to expansion and vice versa).

Time vs money vs information: This is a common way of breaking up meta charities; the question asked which of the three is the most important focus for new charities. Time (or talent) covered careers and volunteer hours; money covered funding and fundraising; ideas/information covered research output and the creation of concepts. Overall 34% of people thought money is the most important focus, with 26% finding ideas most important and 23%, talent. 17% were unsure or thought all areas are equally well covered.

  • People focusing on money pointed out the historical success the EA movement has had in finding new funders and the large room for funding in certain cause areas (particularly global poverty); they also generally thought that money could be converted into talent or ideas.
  • Ideas-focused individuals often flagged that although challenging to work on, ideas and concepts are one of the key outputs of the EA movement and could lead to very large improvements compared to just increasing the talent or money moved in directions currently seen as the most impactful.
  • Talent-focused people often highlighted the limited pathways and opportunities for people who fall outside the top jobs in EA organizations or fellowship and incubation programs. Many people noted that talent historically was far more neglected before the founding of new organizations such as AAC, WANBAM and Probably Good, and many felt these organizations had made large improvements in the area.

Broad vs narrow: The next question was focused on broader meta organizations (for example, general mentorship or TED Talk-style outreach) vs more narrowly focused approaches (for example, training a small number of fellows or outreach for a specific idea such as effective giving). 41% leaned towards broad, 32% leaned towards narrow and 26% were unsure. In general, people seemed more tentative when considering these options than the other crucial considerations I asked about. 

  • Those leaning broad generally brought up that broad outreach is a unique advantage EA has over other movements, or over nonprofits simply creating support networks for themselves.
  • Those leaning narrow tended to think that more focused interventions generally lead to nonprofits concentrating on the most impactful thing.

One approach that seemed to fit a lot of views was that organizations should generally start narrow, get really good at what they are doing, then expand, scale, and broaden. 

EA community trajectory: The next question asked for general thoughts on the EA community and about any positive trends that could be supported or negative trends that could be mitigated. This ended up tying quite closely to the next question asked, about the biggest current flaws in the EA movement. As such, I will talk about the results from both questions here. 

A few concerns with the EA movement that seemed promising for new organizations to tackle came up a number of times. The 7 listed below each came up 10 times or more, and so are worth diving into in a bit more depth.  

  1. Reinventing the wheel
  2. Overconfidence and misplaced deference
  3. Limited opportunities for engagement and involvement
  4. Abstraction in work
  5. Lack of transparency/info hazards
  6. Closed-mindedness/stagnation
  7. In-group helping/trust culture

The biggest concern, and a trend that came up again and again, was that EAs tend to reinvent the wheel a lot. Things like financial advice, management training, or even just organizational best practices are often re-derived or arrived at by trial and error, as opposed to by talking to experts outside of the EA community. People generally thought this was more of an issue for areas that are well established outside of EA than for areas more unique to EA (e.g. approaches to prioritization, charity reviewing, etc.).

A related issue that came up was overconfidence in EA and misplaced deference. Many people expressed major concerns that EAs will often defer to other EAs who have thought even briefly about an issue, as opposed to outside experts who have put considerably more time into the area. This was often described as EAs being far too confident in the movement as a whole and its abilities relative to those outside it. It was also mentioned that people often assume more sophisticated processes are used than is the case in practice. For example, there is an assumption that grantmakers in EA put GiveWell-like levels of time and rigor into their grantmaking, when in practice far less rigor tends to be used. Another example was cause selection: many people expressed concerns about others assuming much stronger methodologies for initial EA cause selection than were actually used.

Another common concern with the EA movement was the limited opportunities for engagement and involvement. Past posts that have been strongly upvoted on the EA Forum have talked about related issues, so it was not a surprise that this was on many people's minds. A related concern was that the wrong advice is often given out, encouraging people to upskill and apply for the few highly competitive EA jobs instead of pursuing other paths to impact. In general, lots of people suggested that having a greater variety of avenues that are socially respected and seen as impactful would be a really important way for the EA community to improve.

Abstraction in work was another weakness of the community that was brought up. This concern was often directed at philosophical research: many expressed concerns that this is high-status and fun work to do but has pretty questionable impact on the world. Many people expressed a similar sentiment that EA needs more “do-ers” relative to thinkers.

Lack of transparency, and groups being overly concerned with info hazards (i.e. the potentially harmful consequences of sharing true information; read more here), was another recurring theme. A few people suggested that certain groups are extremely risk averse about info hazards in a way that is disproportionate to what experts in the field find necessary and that ultimately harms the community. Some also questioned the intent behind the risk aversion (e.g. that it is used to avoid scrutiny of work). Many commented that transparency and open discussion used to be quite valued in EA and now feel far more discouraged.

The above concern tied into another worry, around EA closed-mindedness and intellectual stagnation. In particular, concerns came up about closed-mindedness towards new ideas expressed by those who did not signal EA-ness clearly enough, e.g. by using different vocabulary. More generally, many expressed the view that if certain cause areas had come up earlier, they would likely now be seen as top EA cause areas (biorisk and mental health were both mentioned specifically). Although many had concerns around general intellectual stagnation, people were unsure if this was due to low-hanging fruit being picked, more conversations happening behind closed doors, or the EA movement getting more closed-minded in general.

The last concern that came up many times was a worry about EAs helping each other rather than people outside of their group. This was described as both an issue with funding (e.g. a lot of funding coming through personal relationships and trust) and with project directions (e.g. if a lot of EAs were dissatisfied with an issue, a lot of work would be done on it even if the area did not have much impact on the wider world).

3. Sub areas

We took ideas that people had historically suggested on the EA Forum (for example here, here, and here). We then organized them into nine categories, providing examples for each. For each category we were interested in whether it was seen as above or below average, as well as whether any specific ideas seemed to stand out as promising. The summary results, ordered by average score, can be seen in table 1. Each category is described in more detail below.

Table 1

| Category | Examples of ideas | Average score |
| --- | --- | --- |
| EA exploration | Cause X, new career paths, steelmanning ideas, bringing in information from other causes and areas, new metrics and methodologies (e.g. see research done by HLI) | 3.58 |
| Targeted EA outreach | Inheritance, certain careers, policy, location, fundraising, skeptics, humanists | 3.38 |
| EA content improvement | Summarization, consolidation, preservation, sharing from internal to external, yearly handbook, creation of country-specific content | 2.86 |
| EA community member improvement | Training, loans, increasing involvement, increasing dedication, near-termism advocacy, tiers of involvement, book clubs, subcommunities in EA (e.g. religious groups), advanced EA content | 2.75 |
| General EA outreach | Media, documentaries, books, common search terms, intro talks, chapters, content translation, EA courses, quizzes | 2.74 |
| Improving EA orgs | Conducting studies, reducing burnout, productivity improvement, researching flaws of the EA movement, improving coordination, process improvement, finding talent | 2.74 |
| Intercommunity coordination and connection | Coworking, hubs, group meetings, meta operations, donor-advised fund services, research coordination | 2.70 |
| Evaluation and prioritization | Project ideas in established cause areas, meta projects, policy areas, career options, evaluating EA meta orgs, prediction improvement projects, improved talent processing, evaluations on demand | 2.67 |
| EA funding improvement | Reduced consolidation, improved methodology, increased vetting, reduced nepotism | 2.61 |
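For readers who want a concrete sense of how the average scores above were produced, here is a minimal illustrative sketch, not CE's actual analysis code: each interviewee rated a category relative to average on a scale out of 5 (with 3 meaning "about average"), and the reported score is the mean across respondents. The example ratings, the dictionary, and the function name below are made up for illustration.

```python
# Minimal illustrative sketch of the scoring described in this post; the data,
# function name, and exact scale handling are assumptions, not CE's actual code.
from statistics import mean

# Hypothetical ratings for one category from ten interviewees (1-5, 3 = "average").
ratings = {
    "EA exploration": [4, 4, 3, 5, 3, 4, 2, 4, 3, 4],
}

def average_score(scores: list[int]) -> float:
    """Return the mean rating rounded to two decimal places."""
    return round(mean(scores), 2)

for category, scores in ratings.items():
    print(f"{category}: {average_score(scores)}")  # e.g. "EA exploration: 3.6"
```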

Table 2

| Category | Score / 5 | Overall views | More promising ideas in this category | Less promising ideas in this category |
| --- | --- | --- | --- | --- |
| EA exploration | 3.58 | A lot of people commented that this is core to EA as a whole and to the framing of EA as a question. Many also felt this was pretty neglected and currently done on a more ad hoc basis. | People really liked the idea of bringing in best practices and ideas from non-EA sources. People also felt there was more room for systematic cause X research. | People generally thought the career space is more populated and that although new perspectives would be really impactful, it would be very hard to find them. |
| Targeted EA outreach | 3.38 | Many felt as though outreach had really slowed down in recent years and there was still lots of room for improved outreach to specific communities. Many thought this is a higher-risk option that would have to be done carefully. | A lot of people felt that outreach for key ideas within EA (e.g. effective giving outreach) would be really good, as well as lower risk than promoting EA as a whole. Others felt that given REG's success, more such pathways aimed at specific career-based demographics could be impactful. | There were some concerns that targeting a particular group too successfully could change the culture of EA and make it less general. There were also concerns about doing outreach to policy makers or high-net-worth individuals unless a very strong plan were made. |
| EA content improvement | 2.86 | This question was focused on content improvement and organization as opposed to the creation of new content. Most thought there is plenty of content but that it remains inaccessible and disorganized for newcomers. | People particularly liked the idea of content being made in formats not currently used much by EA, such as graphics and video content. Summarization was also seen as helpful by many. | Many people who ranked this lower thought a lot of it was already being done, and pointed to some failed projects in the space that required more upkeep than expected. |
| EA community member improvement | 2.75 | Training was quite hit and miss in people's minds. This was one of the categories with high variance between ideas. | There was a mix of skills that people would like to see more of in the movement, including increased research and communications skills. | There were worries here about EAs helping other EAs rather than outsiders (e.g. in the case of offering loans), or reinventing the wheel, such as for professional services training. |
| General EA outreach | 2.74 | People thought this was a neglected area but also one of the highest-risk areas for the EA movement. People's feelings towards the risk differed greatly. A common sentiment was that we can always grow later. | Some historical successes were pointed out in this field, such as handing out free books or the TED Talks. However, successes have been hard to predict. | Concerns were expressed that many things people expected to be very powerful outreach tools (e.g. media engagements, most podcasts) seemed to bring in far fewer new EAs than expected. There were also concerns about what to do with people given the challenges of the EA job space. |
| Improving EA orgs | 2.74 | Many people felt EA orgs have lots of room to improve but that the tractability of this for an outside non-funder org would be really low. | No idea consistently stood out as better for people. The closest would be headhunting, but even that was seen as hit and miss. | The major concern here was tractability, with many EA orgs lacking pathways to coordinate with an external org. |
| Intercommunity coordination and connection | 2.70 | Views on this category were polarized. Some people thought this was the most promising area and others the least. | Those who thought coordination could be improved often saw gaps between what organizations knew about each other. | Many thought that community coordination already happens internally at as high a level as could be expected. |
| Evaluation and prioritization | 2.67 | Although many felt like this was a strong area for EA, most people felt it was already pretty well covered by organizations such as GiveWell. | Some people did see room for prioritization in areas that did not have a lot of work in them historically, such as policy or volunteer tasks. | The more skeptical thought it was hard to do prioritization well and that lots of orgs naturally do this even if it is not their stated goal. |
| EA funding improvement | 2.61 | Problems were recognized, but most people had major concerns with tractability as an external org. Most instead suggested more diverse outreach as a better solution. | People thought the best way to improve funding was generally having a wider range of funders, particularly in areas where one or two funders dominate the space. | A few people thought increased consolidation was good and leads to less double vetting. Everyone agreed it would be really hard to change funders' actions. |



Lots of other ideas were also mentioned. People were generally pretty positive towards other ideas but not many of them came up across a wide range of people; I chose to focus on the more common views as more data were available. 

Overall we found this information very helpful and identified both more new ideas and a greater range of views than we expected across EAs from different spaces. 

If you might be interested in founding ideas like these in the EA meta space, we recommend applying to the CE 2021 Incubation Program, which runs June 28 to August 27, 2021. First round applications are open now and close on January 15. 

Comments

Thanks for sharing this Joey - really interesting!

> A particular concern that came up was that there is already a lot of frustration from job seekers and individuals not having a sense of how they can have an impact outside of a pretty small number of jobs and opportunities, with expansion likely to aggravate this problem.

This sentiment strikes me as incorrect, and a bit worrying. My perception is that EA could do with spreading out a bunch from its current distribution. For example, it would be great to see more policy work along the lines of HIPE for different governments (e.g. I know of very few EAs thinking about EU policy), more specialists in different types of potentially important technology like recommender systems, and a greater breadth of understanding of different potentially important organisations, whether intergovernmental like the WHO and UN, or companies. I'm also excited about people starting more effective non-profits, as CE is facilitating. My perception is that what's hard for people seeking career paths is figuring out what path would best suit them, rather than an actual lack of roles which help others in expectation. It's not at all clear that expansion would make this problem worse rather than better, and it strikes me that it would depend a lot on how EA grows. For example, if we grew by attracting people from a broader range of organisations, then they might bring more ideas about where people would be best suited to help. If more people with an entrepreneurial mindset joined, perhaps it would cause new charities to spring up and so increase the number of roles available. That's not to say I think all things considered that EA should be expanding rather than improving - I'm not sure about that. But at the extreme this reasoning looks like avoiding the expansion of EA to prevent people currently outside EA from being hired to specific roles, which seems likely to harm our impact to me.

[anonymous]

Thanks for sharing. Would you be able to share more information on the top-ranked option, "exploration"? My thinking on this is limited (as it is in general regarding a cause X). Would you be able to share concrete ideas people talked about, or concrete proposed plans for such an organisation (a cause X organisation, or an organisation focused on one particular underexplored cause area)?

 

And on a related note, will you publish the report about meta charities you describe here before the incubation programme application deadline (as it might be decision-relevant for some people)?

Our current plan is to publish a short description (but not a full report) of the top ideas we plan to recommend in the first week of Jan, so possible applicants can get a sense before the deadline (Jan 15th).

[anonymous]

Great, thank you :)

Very interesting, looking forward to seeing what else comes up from the research and the new organization(s) that will hopefully result from this! I have some questions if you are willing to clarify:

  1. Is the potential issue raised about infohazards discouraging open communication only relevant to discussions about dangers from new technologies (AI risk, GCBRs, ...)?
  2. I don't think that I understand "EAs helping each other vs people outside of their group". Is it that people in EA are focused more on helping other EAs that they know than EAs they don't, or work on projects that seem important in their close circles but not important globally?
  3. Is the polarization in "Intercommunity coordination and connection" obviously related to some other factor? Say interviewees with a focus on research vs community? (Similarly about "EA community member improvement")
  4. The people skeptical about "Evaluation and Prioritization" thought that "it was hard to do prioritization well and lots of orgs naturally do this even if it was not their stated goal". These two statements seem to be somewhat contradictory, unless the first statement claims something like "our best efforts to prioritize well are not that much better (don't produce better results or aren't more cost-effective) than what orgs are usually doing in their internal prioritization using common best practice". Does this capture the sentiment you have heard?

Glad you found it interesting!

  1. It tended to come from people focused on that area but the concerns were not exclusive to technologies (or even xrisk more broadly). 
  2. To put it another way, people were concerned that “EAs tended to help their friends and others in their adjacent peer group disproportionately to the impact it would cause.”
  3. Regarding polarization in "Intercommunity coordination and connection," my sense is this came from different perceptions about how past projects had gone. No clear trend emerged as to why for “community member improvement.”
  4. I think the way this is reconciled is the view that “current organizations are highly capable and do prioritization research themselves when determining where they should work.” But prioritization including that type is hard to do right and others would struggle to do an equally good job.

P.S. A reminder that these are not my or CE’s views; I am just describing what some interviewees thought.

Thanks for this write up! I'm excited about CE looking into this area. I was wondering whether you were able to share information about the breakdown of which organisations  the 40 EAs you surveyed came from and/or which chapters were interviewed, or whether that data is anonymous? 

Sadly I'm not able to share that data, but I can say it tended to be bigger organizations and bigger chapters.

No worries, thanks!

Thanks! A lot of these tradeoffs are closer to a 50/50 split than I would have expected.

I'd be interested in any info you're able to share about who was actually asked -- but I'm guessing you've already shared as much info as you feel you are able to about that.

I'd also be interested in what the numerical scores are based on?

I agree, I was expecting a much stronger consensus as well. Sorry to say, I told the folks I interviewed that the data would remain at this level of anonymity; many were fine with sharing their results, but some preferred it to be pretty anonymous.

Numerical scores are based on people ranking each option above or below average on a scale out of 5, with 3 being average.

What progress is being made on any of these areas for improvement? Would be excited to help out!

If there's not much in the way of projects currently, it would be awesome to select just one of the concerns to start with and see if there's some systemic way within the EA community to tackle it. 'Reinventing the wheel', as an example, would be a fascinating challenge to try and tackle. Solving it could mean more trust, investment, cohesion and effectiveness in the EA community.

I'd be keen to lend my statistical abilities to track and test some changes in the EA community. Perhaps a few EA chapters could test one potential solution such as gamification (adapted from 'How to Change' by Katy Milkman), and then the progress tracked over a period of time.
