hbesceli

A lot of what I have seen regarding "EA Community teams" seems to be about managing conflicts between different individuals.

Not sure I understand this part - curious if you could say more. 

It would be interesting to see an organization or individual that was explicitly an expert in knowing different individuals and organizations and the projects that they are working on and could potentially connect people who might be able to add value to each other's projects.

I like this idea. A related idea/ framing comes to mind:

  • There's often a lot of value for people in having a strong professional network. Eg. for finding collaborators, getting feedback or input etc.
  • People's skills/ inclination for network building will vary a lot. And I suspect there's a significant fraction of people working on EA projects that have lower network building inclination/ skills, and would benefit from support in building their network.
  • eg. If I could sign up for a service that substantially increased my professional network/ helped me build more valuable professional relationships, I would, and would be willing to pay for such a service. 

Thanks for the suggestion - I read the proposal a while ago, and hadn't thought about it recently, so it's good to be reminded of it again. 

The fact that it has not already been funded, and that talk around it has died down, makes me wonder whether you have already ruled out funding such a project.

We haven't decided against funding projects like this. (EAIF's grantmaking historically has been very passive - eg. the projects that we end up considering for funding have been determined by the applications we received. And we haven't received any strong applications in the 'FHI of the West' ballpark - at least as far as I'm aware.)

as you get better models of the world/ability to get better models of the world, you start noticing things that are inconvenient for others. Some of those inconvenient truths can break coordination games people are playing, and leave them with worse alternatives.

I haven't thought about this particular framing before, and it's interesting to me to think about - I don't quite have an opinion on it at the moment. Here are some of the things on my mind at the moment which feel related to this. 

Perhaps relatedly or perhaps as a non-sequitur, I'm also curious about what changed since your post a year ago talking about how EA doesn't bring out the best in you.


This seems related to me, and I don't have a full answer here, but some things that come to mind:

  • For me personally, I feel a lot happier engaging with EA than I did previously. I don't quite know why this is, I think some combination of: being more selective in terms of what I engage with and how, having a more realistic view of what EA is and what to expect from it, being part of other social environments which I get value from and make me feel less 'attached to EA' and my mental health improving. And also perhaps having a stronger view of what I want to be different with EA, and feeling more willing to stand behind that. 
  • I still feel pretty wary of things which I feel that EA 'brings out of me' (envy, dismissiveness, self-centredness etc.) which I don't like, and it still can feel like a struggle to avoid the pull of those things. 

Have you messaged people on the EA and epistemics slack?

... there is an EA and epistemics slack?? (cool!) if it's free for anyone to join, would you be able to send me an access link or somesuch? 

Thanks! (I don't have an immediate response to this, and found a bunch of the points you're raising here pretty interesting)

EA Jobs, Scarcity and Performance

It seems like:

  1. For many people, having an EA job is pretty important. 
  2. It’s pretty competitive and many people who want EA jobs will not in fact get them. 

There’s been some discussion related to this on the EA Forum, focusing in particular on jobseekers. I’m also interested in exploring this dynamic with people who are working in EA jobs. 

I expect EA job scarcity to have an impact not only on EA jobseekers, but also on people who are working in EA jobs. 

Given 1 and 2, it seems like for people working in EA jobs it will be pretty important for them to keep their jobs. If the job market is competitive it may not be obvious that they can get another one. (For people who have got one EA job, it will presumably be easier to get another, but maybe not guaranteed.) 

For someone who’s in a position of scarcity about their EA job, I can imagine this meaning they focus primarily on performing well/ being seen to perform well. 

This becomes a problem if what counts as performing well and what is actually good to do come into conflict. Eg. this might involve things like:

  • Agreeing with the organisational strategy or one’s manager more than one endorses
  • Focusing on ensuring that they have achieved certain outputs independent of whether that output seems good 

In general I expect that under conditions of scarcity people will be less able to do valuable work (and I mean valuable here as 'actually good' as opposed to 'work that is perceived to be valuable'). 

(If I’m right about this, then one potential answer to ‘what is it for EA to thrive?’ is: EAs aren’t in a position of scarcity.) 

Things I’d be interested to ask people who are working at EA jobs to understand whether this is in fact a thing:

  • How concerned are you about your perceived performance?
  • If your employer/ manager/ funder/ relevant people said something like: ‘We have full confidence in you, your job is guaranteed and we want you to focus on whatever you think is best’ - would that change what you focus on? How much? 

Some EA psychological phenomena

Some things that people report in EA: impostor syndrome, burnout, impact obsession and EA disillusionment.

Are these EA phenomena? Also, are they psychological phenomena? 

These things (I guess excluding EA disillusionment) don’t just exist within EA; they exist within society in general, so it’s plausibly unfair to call them EA phenomena. Though it also seems to me that for each of these things, there’s a somewhat strong fit with EA and EA culture.

Taking impostor syndrome as an example: EA often particularly values ambitious and talented people. Also, it seems to me there’s something of a culture of assessing and prioritising people on this basis. Insofar as it’s important for people to be successful within EA, it’s also important for people to be seen in a certain way by others (talented, ambitious etc.). In general, the stronger the pressure there is for people to be perceived in a certain way, the more prominent I expect impostor syndrome to be. 

(I’m a bit wary of ‘just so’ stories here, but my best guess is that this is in fact explanatory.) 

I think impostor syndrome and other things in this ballpark are often discussed as individual/ psychological phenomena. I think such framings are pretty useful. And there’s another framing which sees them instead as ~sociological phenomena - things which happen in a social context, as a result of different social pressures and incentives within the environment.

I don’t know quite what to conclude here, in a large part because I don’t know how common these things are within EA, and how this compares to other places (or even what the relevant comparison class is). Though tentatively, if I’m asking ‘What does it look like for EA to thrive?’, then part of my answer is ‘being an environment where impostor syndrome, burnout, impact obsession and EA disillusionment are less common’.

What’s going on with ‘EA Adjacents’? 

There’s a thing where lots of people will say that they are EA Adjacent rather than EA (funny post related to this). In particular, it seems to me that the closer to the core people are, the less inclined they are to identify themselves with EA. What’s going on here? I don’t know, but it’s an interesting trailhead to me. 

Plausibly there are some aspects of EA, the culture, norms, worldview, individuals, organisations etc. that people disagree with or don’t endorse, and so prefer to not identify as EAs. 

I’m unsure how much to treat this as reflective of a substantive issue vs. a quirk, or reflective of things being actually fine. At least in terms of EA being a ‘beacon for thoughtful, sincere, and selfless’, it seems a little bit worrying to me that some of the core members of the community aren’t willing to describe themselves as EA. 

Perhaps a way of getting to the heart of this is asking people something like: Imagine you’re talking to someone who is thoughtful, sincere and selfless. Would you recommend EA to them? Which parts? How strongly? Would you express any reservations? 

Looping back to the question of ‘What is it for EA to thrive?’, one answer is: It’s the kind of community that EAs would strongly recommend to a thoughtful, sincere and selfless friend. 

(Maybe this is too strong - people will probably reasonably have disagreements about which aspects of EA are good and which aren’t, and if everyone is very positive on EA in this way, this plausibly means that there’s not enough disagreement in the community.)

Incentives within EA

Here’s a story you could tell about academia. Academia is in some sense supposed to be about generating knowledge. But it ends up being ineffective at doing this because of something something incentives. Eg. 

  • Academic jobs are highly competitive
  • In order to get an academic job, it’s more important to have done things like original research than things like replications. 
  • Things like replications are undersupplied, and the replication crisis happens. 

What are the incentives within EA? How does this affect how well EA ends up ‘doing the most good’? I don’t have a full theory here, though I also suspect that there are ways in which incentives in EA can push against doing the most good. Professional EA group funding is one example:

  • Professional EA group organisers are often in a bit of a precarious position. Their job depends on their ability to get funding from organisations like CEA or EAIF. 
  • One of the main ways that EA group organisers are assessed is on the basis of things like how well they produce highly engaged EAs, or career plan changes or other such things (I think this is broadly true, though I don’t have a great insight into how CEA assesses groups).
  • Professional EA group organisers are incentivised to produce these kinds of things. Some potential problems here: It’s hard to assess what counts as a good eg. career, which pushes in the direction of non-standard career options being discounted, often it may make sense for someone to focus on building career capital over working at an EA organisation, but these kinds of things are less obviously/ legibly impactful…