I'm going to be leaving 80,000 Hours and joining Charity Entrepreneurship's incubator programme this summer!
The summer 2023 incubator round is focused on biosecurity and scalable global health charities, and I'm really excited to see what's the best fit for me and hopefully launch a new charity. The ideas the research team have written up look really exciting, and I'm trepidatious about the challenge of being a founder but psyched to get started. Watch this space! <3
I've been at 80,000 Hours for the last 3 years. I'm very proud of the 800+ advising calls I did, and I feel very privileged that I got to talk to so many people and try to help them along in their careers!
I've learned so much during my time at 80k. And the team at 80k has been wonderful to work with - so thoughtful, committed to working out what is the right thing to do, kind, and fun - I'll for sure be sad to leave them.
There are a few main reasons why I'm leaving now:
1. New career challenge - I want to try out something that stretches my skills beyond what I've done before. I think I could be a good fit for being a founder and running something big and complicated and valuable that wouldn't exist without me - I'd like to give it a try sooner rather than later.
2. Stepping back a bit from EA community building after the recent EA crises - Events in EA over the last few months made me re-evaluate how valuable I think the EA community and EA community building are, as well as re-evaluate my personal relationship with EA. I haven't gone to the last few EAGs, and I've switched my work away from advising calls for the last few months while processing all this. I have been somewhat sad that there hasn't been more discussion and change by now, though I have been glad to see more EA leaders share things more recently (e.g. this from Ben Todd). I do still believe there are some really important ideas that EA prioritises, but I'm more circumspect about some of the things I think we're not doing as well as we could (
Application forms for EA jobs often give an estimate for how long you should expect it to take; often these estimates are *wildly* too low ime. (And others I know have said this too). This is bad because it makes the estimates unhelpful for planning, and because it probably makes people feel bad about themselves, or worry that they're unusually slow, when they take longer than the estimate.
Imo, if something involves any sort of writing from scratch, you should expect applicants to take at least an hour, and possibly more. (For context, I've seen application forms which say 'this application should take 10 minutes' and more commonly ones estimating 20 minutes or 30 minutes).
It doesn’t take long to type 300 words if you already know what you’re going to say and don’t particularly care about polish (I probably wrote this post in less than an hour). But job application questions, even ‘basic’ ones like ‘why do you want this job?’ and ‘why would you be a good fit?’, take more time. You may feel intuitively that you’d be a good fit for the job, but take a while to articulate why. You have to think about how your skills might help with the job, perhaps cross-referencing with the job description. And you have to express everything in appropriately formal and clear language.
Job applications are also very high-stakes, and many people find them difficult or ‘ugh-y’, which means applicants are likely to take longer to do them than they “should”, due to being stuck or procrastinating.
Maybe hirers give these time estimates because they don’t want applicants to spend too long on the first-stage form (for most of them, it won’t pay off, after all!). This respect for people’s time is laudable. But if someone really wants the job, they *will* feel motivated to put effort into the application form.
There’s a kind of coordination problem here too. Let's imagine there's an application for a job that I really want, and on the form it says 'this application should take you appr
Bad Things Are Bad: A Short List of Common Views Among EAs
1. No, we should not sterilize people against their will.
2. No, we should not murder AI researchers. Murder is generally bad. Martyrs are generally effective. Executing complicated plans is generally more difficult than you think, particularly if failure means getting arrested and massive amounts of bad publicity.
3. Sex and power are very complicated. If you have a power relationship, consider if you should also have a sexual one. Consider very carefully if you have a power relationship: many forms of power relationship are invisible, or at least transparent, to the person with power. Common forms of power include age, money, social connections, professional connections, and almost anything that correlates with money (race, gender, etc.). Some of these will be more important than others. If you're concerned about something, talk to a friend who's on the other side of that from you. If you don't have any, maybe just don't.
4. And yes, also, don't assault people.
5. Sometimes deregulation is harmful. "More capitalism" is not the solution to every problem.
6. Very few people who work on wild animal suffering think that we should go and deliberately destroy the biosphere today.
7. Racism continues to be an incredibly negative force in the world. Anti-black racism seems pretty clearly the most harmful form of racism for the minority of the world that lives outside Asia.[1]
8. Much of the world is inadequate and in need of fixing. That EAs have not prioritized something does not mean that it is fine: it means we're busy.
9. The enumeration in the list, of certain bad things, being construed to deny or disparage other things also being bad, would be bad.
Hope that clears everything up. I expect with 90% confidence that over 90% of EAs would agree with every item on this list.
1. ^
Inside, I don't know enough to say with confidence. Could be caste discrimination, could be ongoing oppression of non-Ha
Not all "EA" things are good
just saying what everyone knows out loud (copied over with some edits from a twitter thread)
Maybe it's worth saying aloud the thing people probably know but which isn't always salient: orgs (and people) who describe themselves as "EA" vary a lot in effectiveness, competence, and values, and relying on the branding alone will probably lead you astray.
Especially for newer or less connected people, I think it's important to make salient that there are a lot of takes (positive and negative) on the quality of thought and output of different people and orgs, which from afar might blur into "they have the EA stamp of approval".
Probably a lot of thoughtful people think that whatever seems shiny in an "everyone supports this" kind of way is bad in a bunch of ways (though possibly net good!), and that granularity is valuable.
I think you should feel very free to ask around to get these takes and see what you find - it's been a learning experience for me, for sure. Lots of this is "common knowledge" to people who spend a lot of their time around professional EAs, so it doesn't even occur to them to say it, and it's sensitive to talk about publicly. But I think "some smart people in EA think this is totally wrongheaded" is a good prior for basically anything going on in EA.
Maybe at some point we should move to more explicit and legible conversations about each others' strengths and weaknesses, but I haven't thought through all the costs there, and there are many. Curious for thoughts on whether this would be good! (e.g. Oli Habryka talking about people with integrity here)
Features that contribute to heated discussion on the forum
From my observations. I recognize many of these in myself. Definitely not a complete list, and possibly some of these things are not very relevant; please feel free to comment to add your own.
Interpersonal and Emotional
* Fear, on all sides (according to me, lots of debates are bravery debates: people on "both sides" feel they're in the minority, fighting against a more powerful majority (and often both are true, just in different ways), and this is really important for understanding the dynamics)
  * Political backlash
  * What other EAs will think of you
  * Just sometimes the experience of being on the forum
* Trying to protect colleagues or friends
* Speed as a reaction to having strong opinions, or worrying that others will jump on you
* Frustration at having to rehash arguments / protect things that should go without saying
* Desire to gain approval / goodwill from people you’d like to have hire/fund/etc. you in the future
* Desire to sound smart
* Desire to gain approval / goodwill from your friends, or people you respect
* Pattern matching (correctly or not) to conversations you’ve had before and porting over the emotional baggage from them
* Sometimes it helps to assume the people you’re talking to are still trying to win their last argument with someone else
Low trust environment
* Surprise that something is even a question
* I think there's a nasty feedback loop in tense situations with low trust. (This section by Ozzie Gooen)
* People don't communicate openly their takes on things.
* This leads to significant misunderstanding.
* This leads to distrust of each other and assumptions of poor intent.
* This leads to parties doing more zero-sum or adversarial actions to each other.
* When any communication does happen, it's inspected with a magnifying glass (because of how rare it is). It's misunderstood (because of how little communication there has been).
* The commun
Some post-EAG thoughts on journalists
For context, CEA accepted to EAG Bay Area 2023 a journalist who has at times written critically of EA and of individual EAs, and who is very much not a community member. I am deliberately not naming the journalist, because they haven't done anything wrong and I'm still trying to work out my own thoughts.
On one hand, "journalists who write nice things get to go to the events, journalists who write mean things get excluded" is at best ethically problematic. It's also very, very, very normal: political campaigns do it, industry events do it, individuals do it. "Access journalism" is the norm more than it is the exception. But that doesn't mean we should do it. One solution is to be very, very careful about maintaining the distinction between "community member or not" and "critical or not". Dylan Matthews is straightforwardly an EA and has reported critically on a past EAG: if he were excluded for this I would be deeply concerned.
On the other hand, I think that, when hosting an EA event, an EA organization has certain obligations to the people at that event. One of them is protecting their safety and privacy. EAs who are journalists can, I think, generally be relied upon to be fair and to respect the privacy of individuals. That is not a trust I extend to journalists who are not community members: the linked example is particularly egregious, but tabloid reporting happens.
EAG is a gathering of community members. People go to advance their goals: see friends, network, be networked at, give advice, get advice, learn interesting things, and more. In a healthy movement, I think that EAGs should be a professional obligation, good for the individual, or fun for the individual. It doesn't have to be all of them, but it shouldn't harm them on any axis.
Someone might be out about being bi at an after-party with friends, but not want to see that detail being confirmed by a fact-checker for a national paper. This doesn't seem particularly unusual. The
On Socioeconomic Diversity:
I want to describe how the discourse on sexual misconduct may be reducing the specific type of socioeconomic diversity I am personally familiar with.
I’m a white female American who worked as an HVAC technician, with co-workers mostly from racial minorities, before going to college. Most of the sexual misconduct incidents discussed in the Time article have likely differed from standard workplace discussions in my former career only in that the higher-status person expressed romantic/sexual attraction, making their statement much more vulnerable than the trash-talk I’m familiar with. In the places most of my workplace experience comes from, people of all genders and statuses make sexual jokes about coworkers of all genders and statuses, not only in their field but while on the clock. I had tremendous fun participating in these conversations. It didn’t feel sexist to me because I gave as good as I got.

My experience generalizes well: even when Donald Trump made a joke about sexual assault that many upper-class Americans believed disqualified him, immediately before the election he won, Republican women were no more likely than Republican voters in general to think he should drop out of the race. Donald Trump has been able to maintain much of his popularity despite denying the legitimacy of a legitimate election in part because he identified the gatekeeping elements of upper-class American norms as classist. I am strongly against Trump, but I believe we should note that many female Americans from poorer backgrounds enjoy these conversations, and many more oppose the kind of punishments popular in upper-class American communities. This means strongly disliking these conversations is not an intrinsic virtue, but a decision EA culture has made that is about more than simple morality.
When I post about EA on social media, many of my co-workers from my blue-collar days think it sounds really cool. If any of them decided to engage further and mad
Proposing a change to how Karma is accrued:
I recently passed 1,000 Karma, meaning my upvotes now give 2 Karma and my strong upvotes give 6. I'm most proud of my contributions to the forum about economics, but almost all of my increased ability to influence discourse now comes from participating in the discussions on sexual misconduct. An upvote from me on a Global Health & Development post (my primary cause area) now counts twice as much as an upvote from 12 of the 19 authors of posts with 200-300 Karma under the Global Health & Development tag. They are generally experts in their field working at major EA organizations, whereas I am an electrical engineering undergraduate.
I think these kinds of people should have far more ability to influence the discussion via the power of their upvotes than me. They will notice things about the merits of the cases people are making that I won't until I'm a lot smarter and wiser and farther along in my career. I don't think the ability to say something popular about culture wars translates well into having insights about the object level content. It is very easy to get Karma by participating in community discussions, so a lot of people are now probably in my position after the increased activity in that area around the scandals. I really want the people with more expertise in their field to be the ones influencing how visible posts and comments about their field are.
I propose that Karma earned from comments on posts with the community tag accrues at a slower rate.
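To make the proposal concrete, here is a minimal sketch, in TypeScript, of one way a slower accrual rate could work. This is not Forum code: the `Vote` shape, `karmaFromVote`, and the 0.5 rate are all illustrative assumptions, since the proposal doesn't specify an exact discount.

```typescript
// Hypothetical sketch of "karma from community-tagged posts accrues more slowly".
// None of these names come from the Forum codebase; the 0.5 rate is an arbitrary example.

interface Vote {
  power: number;            // e.g. 2 for an upvote or 6 for a strong upvote at 1,000+ karma
  onCommunityPost: boolean; // whether the post (or the comment's parent post) has the community tag
}

const COMMUNITY_ACCRUAL_RATE = 0.5; // assumed discount; the proposal doesn't name a number

function karmaFromVote(vote: Vote): number {
  // Assumption for this sketch: the vote's displayed strength is unchanged,
  // and only the karma credited to the author is discounted.
  return vote.onCommunityPost ? vote.power * COMMUNITY_ACCRUAL_RATE : vote.power;
}

// A strong upvote on a community-tagged comment would credit 3 karma instead of 6,
// while the same vote on a Global Health & Development post still credits 6.
console.log(karmaFromVote({ power: 6, onCommunityPost: true }));  // 3
console.log(karmaFromVote({ power: 6, onCommunityPost: false })); // 6
```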
Edit: I just noticed a post by moderators that does a better job of explaining why karma is so easy to accumulate in community posts:
https://forum.effectivealtruism.org/posts/dDudLPHv7AgPLrzef/karma-overrates-some-topics-resulting-issues-and-potential