
Basically what it says on the tin. I have this psychological need to find a really intense structured organization to help me accomplish what I want in life (most importantly, saving the world), and EA organizations are natural candidates for this. However, most of the large ones I've found display too much "performative normalcy" and aren't really willing to be as hardcore as I want and need.

Any recommendations on where to find a hardcore totalizing community that can inject more structure into my life so I'm better equipped to save the world? I'm living in Boston for the next two years or so, so anything that requires moving somewhere else won't work, but other than that, all kinds of ideas are welcome.


3 Answers

  • EA group house?
  • Tech startup incubator?
  • Research bootcamp, e.g. MATS?

Thanks for the advice. I was more wondering if there was some specific organization that was known to give that sort of environment and was fairly universally recognized as e.g. “the Navy SEALs of EA” in terms of intensity, but this broader advice sounds good too.

I think this is a joke, but for those who have less-explicit feelings in this direction:

I strongly encourage you to not join a totalizing community. Totalizing communities are often quite harmful to members and being in one makes it hard to reason well. Insofar as an EA org is a hardcore totalizing community, it is doing something wrong.

This was semi-serious, and maybe “totalizing” was the wrong word for what I was trying to say. Maybe the word I more meant was “intense” or “serious.”

CLARIFICATION: My broader sentiment was serious, but my phrasing was somewhat exaggerated to get my point across.

What you're asking for sounds risky; see here for a reflection from a former "hardcore" EA. I also imagine there aren't many really hardcore segments after the fall of Leverage Research, but I have no particular insight into that.

Thanks for the reflection.

I’ve read about Leverage, and it seems like people are unfairly hard on it. They’re the ones who basically started EA Global, and people don’t give them enough credit for that. And honestly, even after what I’ve read about them, their work environment still sounds better to me than a supposedly “normal” one.

RyanCarey
Yes, they were involved in the first, small iteration of EAG, but their contributions were small compared to the human capital they consumed. More importantly, they were a high-demand group that caused a lot of people serious psychological damage; for many, it has taken years to recover a sense of normality. They staged a partial takeover of some major EA institutions, and they gaslit the EA community about what they were doing, which confused and distracted sizable subsections of the community for years. I watched The Master a couple of months ago and found it to be a simultaneously compelling and moving depiction of the experience of cult membership, which I would recommend.
Habryka
I agree with the broad gist of this comment, but I think the first sentence heavily undersells Leverage's involvement. They ran the first two EA Summits, and were also heavily involved with the first two full EA Globals (which I was officially in charge of, so I would know).
14 Comments

Sorry, I know you said you're stuck in Boston, but tbh you're most likely to find like-minded people in the Bay Area[1]. Even if you're stuck in Boston for now, perhaps you could visit at some point?

Just to echo other commenters: this is something to be very careful with. Even if you're certain that you want an intense environment, many people who say they want the same thing turn out not to be the kind of person who thrives in such an environment.

  1. ^

    I've heard that EAs in the Bay Area are more intense than EAs elsewhere. I suspect this is because people who are serious about AI safety move to the Bay Area, which probably affects the culture in general.

Thanks for the advice. To be clear, I'm not certain that a hardcore environment would be the best environment for me either, but it seems worth a shot. And judging by how people tend to change in their involvement in EA as they get older, I'll probably only be as hardcore as this for like ten years.

Additionally, I wonder why there hasn't been an effort to start a more "intense" EA hub somewhere outside the Bay to save on rent and office costs. We've been writing about coordination problems for quite some time; let's go and solve one.

There is an "EA Hotel", which is decently-sized, very intensely EA, and very cheap.

Occasionally it makes sense for people to accept very low cost-of-living situations. But a person's impact is usually a lot higher than their salary. Suppose that a person's salary is x, their impact 10x, and their impact is 1.1 times higher when they live in SF, due to proximity to funders and AI companies. Moving away then costs 10x − 10x/1.1 ≈ 0.9x of impact, so you would have to cut your costs by roughly 90% of your salary to make it worthwhile to live elsewhere. Otherwise, you would essentially be stepping over dollars to pick up dimes.
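The arithmetic above can be checked with a quick sketch (normalizing the salary to 1; the 10x impact and 1.1 multiplier are the comment's hypothetical numbers, not measured figures):

```python
# Back-of-the-envelope check of the relocation trade-off described above.
# Assumptions (from the comment): salary x, total impact 10x, and impact
# is 1.1 times higher in SF than elsewhere.
salary = 1.0                         # normalize salary to x = 1
impact_sf = 10 * salary              # impact while living in SF
impact_elsewhere = impact_sf / 1.1   # roughly 9.09x outside SF

impact_lost = impact_sf - impact_elsewhere   # roughly 0.91x
# Cost savings needed, as a fraction of salary, to break even:
required_savings = impact_lost / salary

print(f"impact lost by leaving SF: {impact_lost:.2f}x")
print(f"break-even cost cut: {required_savings:.0%} of salary")
```

Under these toy numbers the break-even cost reduction comes out to about 90% of salary, which is where the "stepping over dollars to pick up dimes" framing comes from.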

One advantage of the EA Hotel, compared to a grant, for example, is that its selection effects are surprisingly strong: mostly people who are genuinely dedicated are willing to live there, which can help resolve some of the challenges of evaluation.

There have been attempts:

Coordination of Rationality/EA/SSC Housing Projects
New EA Hub Search and Planning

Unfortunately, they haven't gotten anywhere. If you think you can solve the problem, then go for it! But keep in mind that people have tried this in the past and failed.

How many FTEs are working on this problem?

Seems like the kind of thing that should have at least one FTE on it. Is there a reason no one has really put a lot of time into it (e.g. a specific compelling argument that this isn't the right call), or is it just that no one has gotten to it?

Funding would be hard to come by.

Some folks in EA are pretty nervous about projects where a bunch of people live together. Part of this is due to what happened at Leverage. Part of it is that when people live together, there is often drama, along with potential PR risks.

And what I'm describing isn't an individual project full of people who live together; it's coordinating a bunch of people who work on many different projects to move to the same general area. And even if I were describing an individual project full of people who live together, every single failure of such a project within EA is a rounding error compared to the Manhattan Project, for better or worse.

And one more thing: if some people are nervous, wouldn't it be possible to get funded from people who are enthusiastic?

Well, if you think you can pull it off, feel free to go for it and see if you can find interested funders.

I thought the whole point of EA was that we based our grantmaking decisions on rigorous analyses rather than hunches and anecdotes.
