Alex HT

Headhunting Lead @ 80,000 Hours
1088 karma


Hi — I'm Alex! I run the 80k headhunting service, which provides organisations with lists of promising candidates for their open roles.

You can give me (anonymous) feedback here.


Topic contributions

I claim that you can get near the frontier of alignment knowledge in ~6 months to a year. 

How do you think people should do this?

I really appreciate you writing this. Getting clear on one's own reasoning about AI seems really valuable, but for many people, myself included, it's too daunting to actually do. 

If you think it's relevant to your overall point, I would suggest moving the first two footnotes (clarifying what you mean by short timelines and high risk) into the main text. 'Short timelines' sometimes means <10 years, and 'high risk' sometimes means >95%.

I think you're expressing your attitude to the general cluster of EA/rationalist views around AI risk typified by eg. Holden and Ajeya's views (and maybe Paul Christiano's, I don't know) rather than a subset of those views typified by eg. Eliezer (and maybe other MIRI people and Daniel Kokotajlo, I don't know).  To me, the main text implies you're thinking about the second kind of view, but the footnotes are about the first. 

And different arguments in the post apply more strongly to different views. Eg

  • Fewer 'smart people disagree' about the numbers in your footnote than about the more extreme view. 
  • I'm not sure that Eliezer having occasionally been overconfident, while getting the general shape of things right, is any evidence at all against >50% AGI in 30 years or a >15% chance of catastrophe this century (though it could be evidence against Eliezer's very high risk view). 
  • The Carlsmith post you say you roughly endorse seems to put 65% on AGI in 50 years, with a 10% chance of existential catastrophe overall. So I'm not sure whether that means your conclusion is: 
    • 'I agree with this view I've been critically examining'  
    • 'I'm still skeptical of 30 year timelines with >15% risk, but I roughly endorse 50 year timelines with 10% risk'
    • 'I'm skeptical of 10 year timelines with >50% risk, but I roughly endorse 30-50 year timelines with 5-20% risk'
    • Or something else 
Answer by Alex HT

This seems like a good place to look for studies:

The research I’ve reviewed broadly supports this impression. For example:

  • Rieber (2004) lists “training for calibration feedback” as his first recommendation for improving calibration, and summarizes a number of studies indicating both short- and long-term improvements in calibration. In particular, decades ago, Royal Dutch Shell began to provide calibration training for their geologists, who are now (reportedly) quite well-calibrated when forecasting which sites will produce oil.
  • Since 2001, Hubbard Decision Research has trained over 1,000 people across a variety of industries. Analyzing the data from these participants, Doug Hubbard reports that 80% of people achieve perfect calibration (on trivia questions) after just a few hours of training. He also claims that, according to his data and at least one controlled (but not randomized) trial, this training predicts subsequent real-world forecasting success.
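
The calibration being trained in these studies is usually measured by bucketing forecasts by stated confidence and checking whether each bucket's hit rate matches its average confidence. This is an illustrative sketch, not code from the cited studies; the function name and bucketing scheme are my own assumptions.

```python
# Illustrative sketch of measuring calibration: group (probability, outcome)
# pairs into confidence buckets and compare stated confidence to the
# observed frequency of correct answers in each bucket.
from collections import defaultdict

def calibration_table(forecasts, num_buckets=10):
    """forecasts: list of (stated_probability, was_correct) pairs."""
    buckets = defaultdict(list)
    for prob, correct in forecasts:
        # Assign each forecast to a confidence bucket, e.g. 0.65 -> bucket 6.
        idx = min(int(prob * num_buckets), num_buckets - 1)
        buckets[idx].append((prob, correct))
    table = []
    for idx in sorted(buckets):
        pairs = buckets[idx]
        avg_conf = sum(p for p, _ in pairs) / len(pairs)
        hit_rate = sum(c for _, c in pairs) / len(pairs)
        table.append((avg_conf, hit_rate, len(pairs)))
    return table

# A perfectly calibrated forecaster's hit rate matches their stated
# confidence in every bucket.
example = ([(0.9, True)] * 9 + [(0.9, False)]
           + [(0.6, True)] * 6 + [(0.6, False)] * 4)
for conf, hit, n in calibration_table(example):
    print(f"confidence {conf:.2f}: correct {hit:.2f} (n={n})")
```

“Perfect calibration” in Hubbard’s sense corresponds to the hit rate matching the average confidence in every bucket, within sampling noise.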

Are these roles visa eligible, or do candidates need a right to work in the US already? (Or can you pay contractors outside of the US?)

Answer by Alex HT

[A quick babble based on your premise]

What are the best bets to take to fill the galaxies with meaningful value?

How can I personally contribute to the project of filling the universe with value, given other actors’ expected work and funding on the project?

What are the best expected-value strategies for influencing highly pivotal (eg galaxy-affecting) lock-in events?

What are the tractable ways of affecting the longterm trajectory of civilisation? Of those, which are the most labour-efficient?

How can we use our life’s work to guide the galaxies to better trajectories?

Themes I notice

  • Thinking in bets feels helpful epistemically, though the lack of feedback loops is annoying
  • The object of attention is something like ‘civilisation’, ‘our lightcone’, or ‘our local galaxies’
  • The key constraint isn’t money, but it’s less obvious what it is (just ‘labour’ or ‘careers’ doesn’t feel quite right)

We think most of them could reduce catastrophic biorisk by more than 1% or so on the current margin (in relative[1] terms).

Imagine all six of these projects were implemented to a high standard. How robust do you think the world would be to catastrophic biorisk? Ie. how sufficient do you think this list of projects is?

The job application for the Campus Specialist programme has been published. Apologies for the delay.

Hi Elliot, thanks for your questions.

Is this indicative of your wider plans? / Is CEA planning on keeping a narrow focus re: universities?

I’m on the Campus Specialist Manager team at CEA, which is a sub-team of the CEA Groups team, so this post does give a good overview of my plans, but it’s not necessarily indicative of CEA’s wider plans. 

As well as the Campus Specialist programme, the Groups team runs a Broad University Group programme staffed by Jessica McCurdy with support from Jesse Rothman. This team provides support for all university groups regardless of ranking through general group funding and the EA Groups Resource Centre. The team is also launching UGAP (University Groups Accelerator Program) where they will be offering extra support to ~20 universities this semester. They plan to continue scaling the programme each semester.

Outside of university groups, Rob Gledhill joined the Groups team last year to work specifically on the city and national Community Building Grants programme, which was funding 10 total full-time equivalent staff (FTE) as of September (I think the number now is slightly higher). 

Additionally, both university groups and city/national groups can apply to the EA Infrastructure Fund.

Besides the Groups team, CEA also has:

  • The Events team, which runs EAG(x)
  • The Online team, which runs this forum and EA virtual programmes
  • The Operations team, which enables the whole of CEA (and other organisations under the legal entity) to run smoothly
  • The Community Health team, which aims to reduce risks that could cause the EA community to lose out on a lot of value, and to preserve the community’s ability to grow and produce value in the future

Basically, I see two options: 1) A tiered approach whereby "Focus" universities get the majority of attention 2) "Focus" universities get all of CEA's attention to the exclusion of all other universities. 

Across the Groups team, Focus universities currently get around half of the team's attention, and less than half of funding from grants. We’re planning to scale up most areas of the Groups team, so it’s hard to say exactly how the balance will change. Our guiding star is figuring out how to create the most “highly-engaged EAs” per FTE of staff capacity. However, we don’t anticipate Focus universities getting all of the Groups team’s attention at the exclusion of all other universities, and it’s not the status quo trajectory. 

Do you plan on head hunting for these roles? 

Off the top of my head, there are a few incredibly successful university groups that have flourished under their own volition (e.g. NTNU, PISE). There are likely people in these groups who would be exceptionally good at community growth if given the resources you've described above, but I suspect that they may not think to apply for these roles. 

Some quick notes here:

  • We are planning to do active outreach for these roles.
  • I agree that someone who has independently done excellent university group organising could be a great fit for this role.
  • CEA supports EA NTNU via a Community Building Grant (CBG) to EA Norway.
  • Also, quite a few group organisers have reached out to me since posting this, which makes me think people in this category might be quite likely to apply anyway.
  • But I think it’s still worth encouraging people to apply, and clarifying that you don’t need to have attended a Focus university to be a Campus Specialist.

Do you plan on comparing the success of the project, against similar organisations?

There are many organisations that aim to facilitate and build communities on university campuses. There are even EA-adjacent organisations, e.g. GFI. It makes sense to me to measure the success of your project against these (especially GFI), as they essentially provide a free counterfactual regarding a change of tactics. 

I ask this because I strongly suspect GFI will show stronger community building growth metrics than CEA. They provide comprehensive and beautifully designed resources for students. They're public and personable (i.e. they have dedicated speakers who speak for any audience size, at least as far as I can tell). And they seem to have a broader global perspective (so perhaps I am a bit biased). But in general they seem to have "the full package", which CEA is currently missing.

I agree having clear benchmarks to compare our work to is important. I’m not familiar with GFI’s community building activities. It seems fairly likely to me that the Campus Specialist team at CEA has moderately different goals to GFI, such that our community growth metrics might be hard to compare directly. 

To track the impact of our programmes, the Campus Specialist team looks at how many people at our Focus universities are becoming “highly-engaged EAs” - individuals who have a good understanding of EA principles, show high quality reasoning, and are taking significant actions, like career plans, based on these principles. As mentioned in the post, our current benchmark is that Campus Specialists can help at least eight people per year to become highly engaged.

One interesting component to point out is that while I think our end goal is clear - creating highly-engaged EAs - we believe we’re still pretty strongly in the ‘exploration mode’ of finding the most effective tactics to achieve this. As a result, we want to spend less of our time in the Campus Specialist Programme standardising resources, and more time encouraging innovation and comparing these innovations against the core model. 

By contrast, our University Group Accelerator Programme is a bit more like GFI’s programme as it has more structured tactics and resources for group leaders to implement. Jessica, who is running the programme, has been in touch with GFI to exchange lessons learned and additional resources.

Can you expand on how much money you plan on spending on each campus? 

I noticed you say "managing a multi-million dollar budget within three years of starting" - can you explain what exactly this money is going to be spent on? Currently this appears to me (perhaps naively) to be an order of magnitude larger than the budget of the largest national organisations. How confident are you that you will follow through on this? And how confident are you that spending millions of dollars on one campus is more efficient than community building across 10 countries? 

How confident are you that you will follow through on this?

  • This depends on what Campus Specialists do. It’s an entrepreneurial role and we’re looking for people to initiate ambitious projects. CEA would enthusiastically support a Campus Specialist in this scaling if it seemed like a good use of resources.
  • I’m pretty confident that if a Campus Specialist had a good use of $3mil/year in 2025 CEA would fund it.
  • Will a Campus Specialist have a good use of $3mil/year in 2025? Probably. One group is looking to spend about $1m/year already (with programmes that benefit both their campus and the global community, via online options).

Can you explain what exactly this money is going to be spent on? 

I can’t tell you exactly what this money will be spent on, as this depends on what projects Campus Specialists identify as high priority. Some possible examples:

  • Prestigious fellowships or scholarships
  • Lots of large, high-quality retreats e.g. using an external events company to save organiser time
  • Renting a space for students to co-work
  • Running a mini-conference every week (one group has done this already - they have coworking, seminar programmes, a talk, and a social every week of term, and it seems to have been very good for engagement, with attendance regularly around 70 people). I could imagine this being even bigger if there were even more concurrent ‘tracks’
  • Seed funding for students to start projects
  • Salaries for a team of ten
  • Travel expenses for speakers
  • Bootcamps for in-demand skills
  • Running an EAGx at the university
  • Research fellowships over the summer for students (like SERI or CERI, though they need not be in the -ERI format)

The ultimate goal across all of these programs is to find effective ways to create “highly-engaged EAs.” 

And how confident are you that spending millions of dollars on one campus is more efficient than community building across 10 countries? 

I’m not sure this is the right hypothetical to be comparing - CEA is supporting community building across 10 countries*. We are also looking to support 200+ universities. I think both of those things are great. 

I think the relevant comparison is something like ‘how confident are you that spending millions of dollars on one campus is more efficient than the EA community’s last (interest-weighted) dollar?’

My answer depends exactly on what the millions of dollars would be spent on, but I feel pretty confident that some Campus Specialists will find ways of spending millions of dollars on one campus per year which are more efficient (in expectation) than the EA community’s last (interest-weighted) dollar. 

*I listed out the first ten countries that came to mind where I know CEA supports groups: USA, Canada, Germany, Switzerland, UK, Malaysia, Hong Kong (via partnership), Netherlands, Israel, Czech Republic. (This is not an exhaustive list.)


Thanks for this comment and the discussion it’s generated! I’m afraid I don’t have time to give as detailed a response as I would like, but here are some key considerations:

  • In terms of selecting focus universities, we mentioned our methodology here (which includes more than just university rankings, such as looking at alumni outcomes like number of politicians, high net worth individuals, and prize winners).
  • We are supporting other university groups - see my response to Elliot below for more detail on CEA’s work outside Focus universities.
  • You can view our two programmes as a ‘high touch’ programme and a ‘medium touch’ programme. We’re currently analysing which programme creates the most highly-engaged EAs per full-time equivalent staff member (FTE) (our org-wide metric).
  • In the medium term, this is the main model that will likely inform strategic decisions, such as whether to expand the focus university list.

However, we don’t think this is particularly decision-relevant for us in the short term. This is because:  

  • At the moment, most of our Focus universities don’t have Campus Specialists.
  • You don’t need to have gone to a Focus university to be a Campus Specialist.
  • So we think qualified Campus Specialists won’t be limited by the number of opportunities available.

Thanks Vaidehi!

One set of caveats is that you might not be a good fit for this type of work (see what might make you a good fit above). For instance: 

  • This is a role with a lot of autonomy, so if you prefer more externally set structure, this role probably isn’t a good fit for you
  • If you find talking to people about EA ideas difficult or uncomfortable, this may be a bad fit
  • You might be a good fit for doing field building, but prefer doing so with another age range (e.g. mid-career, high school)

Some other things people considering this path might want to take into consideration:

  • If you would like to enter a non-EA career that looks for traditional markers of prestige and is extremely competitive, and you have a current opportunity that won’t come around later, then being a Campus Specialist might be less good than directly entering that career or doing more signalling (although we think that the career capital from this route is better than most people think). This might be true for some specific post-undergrad awards in policy, or for unusual entrepreneurial opportunities - like having a co-founder with seed funding.
  • If you think it’s likely we’re in a particularly pivotal moment in the next 5-10 years (for example, if you have extremely short AI timelines, with a median of <5-10 years), then you might think that the benefits of doing outreach to talented individuals might not come to fruition. (But we think that this option can be good even for people with relatively short timelines, e.g. 15-20 years.)
  • You might not feel compelled by the data in multiplier arguments, or you might think you’ll crowd out someone who would be better at generating multipliers compared to you.

