
Pancakes4all

21 karma · Joined

Comments (6)

Yeah, I guess I'm saying that the rest is probably not relevant or important for EA, and that's why I think little-r rationality can be scrapped in favor of the important bits I highlighted. I realize I left out epistemology, so: just study epistemology and cognitive psych, and that's the relevant bit for EA (admittedly oversimplified to make a point).

I don't find the concept of small-"r" rationalism helpful, because what you describe actually sounds to me like "understand most of Kahneman and Tversky's work", and I wouldn't call that rationalism but cognitive psychology. In general, even small-r rationalism tends to repackage concepts in ways that are only new or interesting to people who haven't studied psychology, and in my opinion it mostly does so in ways that carry unstated underlying philosophical assumptions like objectivism and Kantian ideals. But cognitive psych doesn't (shouldn't?) have to be applied in those ways. Probably just read Joshua Greene's Moral Tribes and get on with your day? That's how I got into EA, and it does whatever you're describing as small-r rationalism better than small-r rationalism does (if that makes sense?), without all the unstated assumptions that come with small-r rationality and the ties to the big-R community.

On 2): I haven't heard of phage vaccines being particularly neglected compared to the rest of society, and I haven't heard of them being prioritized in EA spaces. One thing you could do, though, is get into policy and work with the Africa CDC on biosecurity and biosafety practices. If you really want to do lab work, I think talking to the folks formerly at Alvea about potential vaccine directions might be helpful.

--We should do fewer EAGs and prioritize funding more local groups. I think the limited data we have points in the direction that EAGs are significantly less cost-effective than local groups. There was a survey of people doing longtermist work run by Open Phil in 2020 that asked respondents to rate various EA organizations/content/groups on a scale from "hindered my positive impact a lot" to "helped me a lot to have a positive impact". They assigned point values and averaged responses, then calculated "net impact points" as a standard metric for comparing across items, displaying a "% of net impact points" both weighted[9] and unweighted, which should give a sense of what proportion of impact each item was responsible for. As you can see in the chart here, CEA makes up 6% of these, and elsewhere it's said that EAGs are responsible for roughly half of CEA's impact, so assume EAGs account for roughly 3% of net impact points (NIP). This means that EAGs are less valuable than everything else in the chart, and that local groups are responsible for over 2x EAGs' share of NIP. We only have rough data right now, but one estimate puts total expenditure on EA groups at ~$1 million in 2019[10]. In that same year, there were 3 EAGxs and 2 EAGs, which, taking estimates from the earlier post on costs,[11] likely cost around $5.6 million, over 5x the cost.

This might deserve its own short commentary. It gets buried a bit in the piece, but it's an important point/argument, and I haven't seen others doing this sort of analysis.
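To make the quoted comparison concrete, here's a minimal back-of-the-envelope sketch using only the figures cited above. It assumes "over 2x" means exactly 2x (so local groups at ~6% of NIP), which understates the gap if anything; all numbers are rough 2019/2020 estimates from the quoted survey and cost posts, not precise data.

```python
# Back-of-the-envelope: EAGs vs. local groups, per the figures quoted above.
# "NIP" = net impact points from the Open Phil longtermist survey (2020).

eag_nip_share = 0.03      # ~half of CEA's ~6% share of net impact points
groups_nip_share = 0.06   # assumed 2x EAGs' share ("over 2x" in the quote)

eag_cost_musd = 5.6       # 2 EAGs + 3 EAGxs in 2019, estimated cost in $M
groups_cost_musd = 1.0    # ~$1M total spending on EA groups in 2019

# Cost-effectiveness as NIP share per $1M spent
eag_eff = eag_nip_share / eag_cost_musd
groups_eff = groups_nip_share / groups_cost_musd

print(f"EAGs:         {eag_eff:.4f} NIP share per $1M")
print(f"Local groups: {groups_eff:.4f} NIP share per $1M")
print(f"Implied ratio: local groups ~{groups_eff / eag_eff:.0f}x more cost-effective")
```

On these assumptions the sketch puts local groups at roughly 11x the cost-effectiveness of EAGs, which is the implicit conclusion of the quoted analysis.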