
Guy Raveh

Software Engineer @ GE Healthcare
4416 karma · Joined June 2020 · Haifa, Israel

Bio


Working in healthcare technology and doing some independent AI alignment research on the side.

MSc in applied mathematics/theoretical ML.

Interested in increasing diversity, transparency and democracy in the EA movement. Would like to know how algorithm developers can help "neartermist" causes.

Comments (916)

Treating each new person as a separate investment and trying to optimize for their marginal utility to EA, instead of looking at the aggregate effect of all the community-building efforts on the movement.

Specifically, in your comment: justifying diversified investment in groups by saying "high-quality group members" are the goal but top universities have bottlenecks that can't easily be solved by just pouring more money into them - instead of arguing that it's better to have a new group in Chile than a new group at Harvard, even if, hypothetically, people there were less qualified for existing EA jobs.

On the one hand, I'm happy to hear that the groups team isn't as elite-focused as I had thought; on the other hand, I'm still troubled by the margin-based reasoning.

Thanks, I've never used shortform, but I'll try it tomorrow.

"top universities are the places with the highest concentrations of people who ultimately have a very large influence on the world"

I think this, as a piece of reasoning, represents a major problem in EA's perception of the world. While it might be factually true, there are two problems with relying on it:

  1. It means surrendering ourselves to the existing state of affairs, as opposed to trying to change it and create a more equal world.
  2. It means the goal of EA community building is regarded as a funnel for getting individuals into existing positions determined by the system already in place. There is an alternative: building not a pool of individuals, each of whom is separately regarded as a marginal talent contribution, but rather a diverse community that could think more robustly about how to change the world for the better, and not be mostly confined to rich, white, technological, Western perspectives. IMO this alternative is much more important than the funnel.

"I feel like I am one of the most engaged EAs in my local community, but the beliefs Torres ascribes to EA are so far removed from my own"

This might have to do with "how local" your local community is. It seems to me that the weirder sides of EA (which I usually consider bad, but others here might not) are common in the EA hubs (Bay Area, Oxbridge, London, and the cluster of large groups in Europe) but not as common in other places (like here in Israel).

You're describing a religious belief that, for some unknown reason, many EAs seem to share: a belief in a mystical state of being that has never been scientifically documented. And you ask why there's no activity around this, in a community supposedly organized around following evidence to find good ways to improve the world. That's your answer: a shared belief is not evidence - just as a shared belief in God, even one held by billions of people, is not evidence.

It's good that nobody's talking about this. It would be no saner than, e.g., trying to make everyone religious because then God would eliminate suffering.

I definitely agree. But I think we're far from it being practically useful for dedicated EAs to do this themselves.

Do most charitable organizations have in-house people to vet donors? I'm not saying we shouldn't check, but rather that there shouldn't be people in EA organizations whose job is to do this - organizations should just hire auditors or whomever to do it for them.

I'd argue that "checking whether businesses are run responsibly" is out of scope for EA in general.
