Hi — I'm Alex! I run the 80k headhunting service, which provides organisations with lists of promising candidates for their open roles.
You can give me (anonymous) feedback here: admonymous.co/alex_ht
I really appreciate you writing this. Getting clear on one's own reasoning about AI seems really valuable, but for many people, myself included, it's too daunting to actually do.
If you think it's relevant to your overall point, I would suggest moving the first two footnotes (clarifying what you mean by short timelines and high risk) into the main text. Short timelines sometimes means <10 years and high risk sometimes means >95%.
I think you're expressing your attitude to the general cluster of EA/rationalist views around AI risk typified by, e.g., Holden's and Ajeya's views (and maybe Paul Christiano's, I don't know), rather than the subset of those views typified by, e.g., Eliezer's (and maybe other MIRI people's and Daniel Kokotajlo's, I don't know). To me, the main text implies you're thinking about the second kind of view, but the footnotes are about the first.
And different arguments in the post apply more strongly to different views. E.g.:
This seems like a good place to look for studies:
The research I’ve reviewed broadly supports this impression. For example:
- Rieber (2004) lists “training for calibration feedback” as his first recommendation for improving calibration, and summarizes a number of studies indicating both short- and long-term improvements in calibration. In particular, decades ago, Royal Dutch Shell began to provide calibration training for their geologists, who are now (reportedly) quite well-calibrated when forecasting which sites will produce oil.
- Since 2001, Hubbard Decision Research has trained over 1,000 people across a variety of industries. Analyzing the data from these participants, Doug Hubbard reports that 80% of people achieve perfect calibration (on trivia questions) after just a few hours of training. He also claims that, according to his data and at least one controlled (but not randomized) trial, this training predicts subsequent real-world forecasting success.
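For readers unsure what "calibration" means concretely here: a minimal, hypothetical sketch of how one might check calibration on confidence-tagged trivia answers (this is illustrative only, not Hubbard's or Rieber's actual method, and the data below are made up):

```python
# Illustrative sketch: measuring calibration on confidence-tagged trivia answers.
# A forecaster is "well-calibrated" if, among answers given with ~70% confidence,
# about 70% turn out to be correct, and so on for each confidence level.

from collections import defaultdict

# Hypothetical data: (stated confidence, whether the answer was correct)
responses = [
    (0.6, True), (0.6, False), (0.6, True), (0.6, True), (0.6, False),
    (0.8, True), (0.8, True), (0.8, True), (0.8, False), (0.8, True),
    (0.95, True), (0.95, True), (0.95, True), (0.95, True), (0.95, False),
]

# Group answers by stated confidence and compare confidence to observed accuracy.
buckets = defaultdict(list)
for confidence, correct in responses:
    buckets[confidence].append(correct)

for confidence in sorted(buckets):
    outcomes = buckets[confidence]
    accuracy = sum(outcomes) / len(outcomes)
    print(f"stated {confidence:.0%} -> observed {accuracy:.0%} over {len(outcomes)} answers")

# One summary number: the answer-weighted mean absolute gap between stated
# confidence and observed accuracy (0 would be "perfect calibration" in this sense).
gap = sum(
    abs(conf - sum(outs) / len(outs)) * len(outs) for conf, outs in buckets.items()
) / len(responses)
print(f"weighted calibration gap: {gap:.3f}")
```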
[A quick babble based on your premise]
What are the best bets to take to fill the galaxies with meaningful value?
How can I personally contribute to the project of filling the universe with value, given other actors’ expected work and funding on the project?
What are the best expected-value strategies for influencing highly pivotal (e.g. galaxy-affecting) lock-in events?
What are the tractable ways of affecting the long-term trajectory of civilisation? Of those, which are the most labour-efficient?
How can we use our life’s work to guide the galaxies to better trajectories?
Themes I notice
We think most of them could reduce catastrophic biorisk by more than 1% or so on the current margin (in relative[1] terms).
Imagine all six of these projects were implemented to a high standard. How robust do you think the world would be to catastrophic biorisk? I.e., how sufficient do you think this list of projects is?
The job application for the Campus Specialist programme has been published. Apologies for the delay.
Hi Elliot, thanks for your questions.
Is this indicative of your wider plans?/ Is CEA planning on keeping a narrow focus re: universities?
I’m on the Campus Specialist Manager team at CEA, which is a sub-team of the CEA Groups team, so this post does give a good overview of my plans, but it’s not necessarily indicative of CEA’s wider plans.
As well as the Campus Specialist programme, the Groups team runs a Broad University Group programme staffed by Jessica McCurdy with support from Jesse Rothman. This team provides support for all university groups regardless of ranking through general group funding and the EA Groups Resource Centre. The team is also launching UGAP (University Groups Accelerator Program) where they will be offering extra support to ~20 universities this semester. They plan to continue scaling the programme each semester.
Outside of university groups, Rob Gledhill joined the Groups team last year to work specifically on the city and national Community Building Grants programme, which was funding 10 total full-time equivalent staff (FTE) as of September (I think the number now is slightly higher).
Additionally, both university groups and city/national groups can apply to the EA Infrastructure Fund.
Besides the Groups team, CEA also has:
Basically, I see two options: 1) a tiered approach whereby "Focus" universities get the majority of attention; 2) "Focus" universities get all of CEA's attention to the exclusion of all other universities.
Across the Groups team, Focus universities currently get around half of the team's attention, and less than half of funding from grants. We’re planning to scale up most areas of the Groups team, so it’s hard to say exactly how the balance will change. Our guiding star is figuring out how to create the most “highly-engaged EAs” per FTE of staff capacity. However, we don’t anticipate Focus universities getting all of the Groups team’s attention at the exclusion of all other universities, and it’s not the status quo trajectory.
Do you plan on head hunting for these roles?
Off the top of my head, there are a few incredibly successful university groups that have flourished of their own volition (e.g. NTNU, PISE). There are likely people in these groups who would be exceptionally good at community growth if given the resources you've described above, but I suspect they may not think to apply for these roles.
Some quick notes here:
Do you plan on comparing the success of the project against similar organisations?
There are many organisations that aim to facilitate and build communities on university campuses. There are even EA-adjacent organisations, e.g. GFI. It makes sense to me to measure the success of your project against these (especially GFI), as they essentially provide a free counterfactual for a change of tactics.
I ask this because I strongly suspect GFI will show stronger community-building growth metrics than CEA. They provide comprehensive and beautifully designed resources for students. They are public and personable (e.g. they have dedicated speakers who will speak for any audience size, at least as it appears to me). And they seem to have a broader global perspective (so perhaps I am a bit biased). But in general they seem to have "the full package", which CEA is currently missing.
I agree having clear benchmarks to compare our work to is important. I’m not familiar with GFI’s community building activities. It seems fairly likely to me that the Campus Specialist team at CEA has moderately different goals to GFI, such that our community growth metrics might be hard to compare directly.
To track the impact of our programmes, the Campus Specialist team looks at how many people at our Focus universities are becoming “highly-engaged EAs” - individuals who have a good understanding of EA principles, show high-quality reasoning, and are taking significant actions, such as making career plans, based on these principles. As mentioned in the post, our current benchmark is that Campus Specialists can help at least eight people per year to become highly engaged.
One interesting component to point out is that while I think our end goal is clear - creating highly-engaged EAs - we believe we’re still pretty strongly in the ‘exploration mode’ of finding the most effective tactics to achieve this. As a result, we want to spend less of our time in the Campus Specialist Programme standardising resources, and more time encouraging innovation and comparing these innovations against the core model.
By contrast, our University Group Accelerator Programme is a bit more like GFI’s programme as it has more structured tactics and resources for group leaders to implement. Jessica, who is running the programme, has been in touch with GFI to exchange lessons learned and additional resources.
Can you expand on how much money you plan on spending on each campus?
I noticed you say "managing a multi-million dollar budget within three years of starting". Can you explain what exactly this money is going to be spent on? Currently this appears to me (perhaps naively) to be an order of magnitude larger than the budget of the largest national organisations. How confident are you that you will follow through on this? And how confident are you that spending millions of dollars on one campus is more efficient than community building across 10 countries?
How confident are you that you will follow through on this?
Can you explain what exactly this money is going to be spent on?
I can’t tell you exactly what this money will be spent on, as this depends on what projects Campus Specialists identify as high priority. Some possible examples:
The ultimate goal across all of these programs is to find effective ways to create “highly-engaged EAs.”
And how confident are you that spending millions of dollars on one campus is more efficient than community building across 10 countries?
I’m not sure this is the right hypothetical to be comparing - CEA is supporting community building across 10 countries*. We are also looking to support 200+ universities. I think both of those things are great.
I think the relevant comparison is something like ‘how confident are you that spending millions of dollars on one campus is more efficient than the EA community’s last (interest-weighted) dollar?’
My answer depends exactly on what the millions of dollars would be spent on, but I feel pretty confident that some Campus Specialists will find ways of spending millions of dollars on one campus per year which are more efficient (in expectation) than the EA community’s last (interest-weighted) dollar.
*I listed out the first ten countries that came to mind where I know CEA supports groups: USA, Canada, Germany, Switzerland, UK, Malaysia, Hong Kong (via partnership), Netherlands, Israel, Czech Republic. (This is not an exhaustive list.)
Thanks for this comment and the discussion it’s generated! I’m afraid I don’t have time to give as detailed a response as I would like, but here are some key considerations:
However, we don’t think this is particularly decision-relevant for us in the short term. This is because:
Thanks Vaidehi!
One set of caveats is that you might not be a good fit for this type of work (see what might make you a good fit above). For instance:
Some other things people considering this path might want to take into consideration:
How do you think people should do this?