
This is an anonymous account (Ávila is not a real person). I am posting on this account to avoid potentially negative effects on my future job prospects.

SUMMARY:

  • I've been rejected from 18 jobs or internships, 12 of which are "in EA."
  • I briefly spell out my background and show all my rejections.
  • Then, I list some recommendations to EA orgs on how they can (hopefully) improve the hiring process.

This post probably falls under the category of "it's hard, even for high-achievers, to get an EA job." But there's still the (probably bigger) problem of what there is for "mediocre" EAs to do in a movement that prizes extremely high-achieving individuals.

If this post improves hiring a little bit at a few EA orgs, I will be a happy person. 

EDIT: I want to make *very clear* that I take full responsibility for how my job search is going. Putting my experience and "recommendations to EA orgs" in the same post could be seen as me thinking that bad hiring practices are to blame. This is not at all the case. I hope this was clear before I added this paragraph in!

EDIT 2 (June 2024): Since writing this, I received ~9 more rejections and 3 acceptances (!). I'm very happy with the 3 opportunities. Thank you all for your encouragement -- and hopefully others in a similar position found this post or the replies helpful.

BACKGROUND

Entry-level EA jobs and internships have been getting very competitive. It is common for current applicants to hear things like "out of 600, we can only take 20" (CHAI), "only 3% of applicants made it this far" (IAPS), or "It's so competitive it's probably not even worth applying" (GovAI representative). So far, I haven't been accepted to any early-career AI safety opportunities, and I've mostly been rejected in the first round.

ABOUT ME

I'll keep this section somewhat vague to protect my anonymity. I'm mostly applying to AI safety-related jobs and internships.

[EDIT: deleted this section after realizing it may have made me too identifiable.]

JOBS/INTERNSHIPS/FUNDING I'VE APPLIED TO

Rejections

Horizon Junior Fellowship - Rejected on round 3/4
GovAI summer fellowship - Rejected first round 
ERA<>Krueger Lab - Rejected first round
fp21 internship - Never heard back
BERI (full-time job) - Rejected first round
MIT FutureTech (part-time) - Job filled before interview
PIBBSS Fellowship - Rejected first round
Berkeley Risk and Security Lab - Never heard back
CLR Fellowship - Rejected first round
ERA Fellowship - Rejected first round
CHAI Internship - Rejected first round
UChicago XLab - Rejected first round

EA LTFF research grant - Rejected
Open Phil research grant - Rejected

Acceptances

None yet!

Note: I've also applied to jobs that align with my principles but are not at EA orgs. I'm also still applying to jobs, so this is not (yet) a pity party.

MY EXPECTATIONS

Although I expected these to be quite competitive, I was surprised to be eliminated in the first round of so many of them. That's because most of these are specifically meant for early-career people, and I'd say I have a great resume, credentials, and demonstrated skills for an early-career person.

RECOMMENDATIONS TO EA ORGS

As someone who's spent a lot of time doing EA org applications, below are some tentative thoughts on how to (probably) improve them. Please let me know what you think in the comments.

Again, I do not mean to imply at all that my failures were caused by bad hiring practices. These are just thoughts that came to mind after applying to many similar programs.

  • Increase the required time-commitment as the application progresses.

By this I mean: start out with shorter applications and then increase the time commitment in successive application stages. If you plan to only progress 10% of applicants to the second round, wasting 90% of applicants' time on an hours-long questionnaire or work test seems like a bad policy.

  • Start admissions earlier. 

For whatever reason, many of these programs don't expect to finish admissions until May, yet at least in the US, many colleges end the academic year in May. This makes the search very stressful and hard to wrap up early. Also, given that these programs are so competitive, it would be good to know early whether one will get in, so that one can apply to more non-EA jobs otherwise. The lack of financial security, not knowing where one will end up, and having to find housing on very short notice can be awful.

  • Be clear on what you are looking for. 

Sometimes EA Orgs will say something like “we have no degree requirements” or "when in doubt apply" but in reality will mostly hire people with PhDs. I appreciate your open-mindedness regarding degree and experience requirements, but saying something like "we expect most successful applicants to have X" or "we may consider Y in exceptional circumstances" helps applicants assess whether the opportunity is worth spending time on.

  • Relatedly, show statistics for past application rounds.

A few orgs published some useful statistics on their past cohorts. I found this very helpful. E.g., "historically, around 50% of our cohort were PhD students, 20% X, ..."

  • Adopt evidence-based hiring practices.

This is already done by many (great!). I personally am no authority on the matter, so please add a comment below if you know more about this. I would presume that blind scoring and having >1 person score each anonymous application reduce bias and noise.

  • Have paid work tests.

This is often already done, and greatly appreciated. Paying for work tests may allow many people who couldn't otherwise afford the time commitment to apply.

  • Have clear citizenship requirements.

This goes for EA orgs based anywhere. Don't just ask people where they are or are not legally allowed to work and then move on with the application. TELL applicants what you need. Is X citizenship required? Can you hire people with no citizenship but permanent residence / work permits? Can you hire people with temporary work permits (e.g., OPT, STEM OPT in the US)? Can you sponsor work visas? Do you allow remote work for people not allowed to work in your country?

  • Send rejection emails, don’t just not respond to an application.

Great, most already do this!

  • Increase information sharing transparency. 

Most EA orgs have a box you can check at the end of an application that says something like “would you like us to share your information with similar orgs that may be looking for talent?”

This sounds like a great idea, but I would like to know more details. What may you share? For example, I would say "yes" to my resume, but "no" to the results of my work tests or your evaluations. That's because I don't want one failure or mistake to cascade to every application. I would also say "yes" to sharing for up to a year, but "no" beyond that (because my experience and skills may change a lot in a year). Just one sentence to increase transparency would be great.

  • (When possible) allow opt-in for short feedback on the application.

This may be the hardest one to bring about (it's time-consuming and very out-there), so I'm very uncertain about it. I think I would have personally benefitted a lot from feedback on my applications. Why was I rejected? Was it something I could change?

Especially at later stages of the application process (when there are fewer applicants), I would love to be able to opt-in to something like ~1 sentence about the biggest reason I was rejected. Even if it's hurtful.

DISCUSSION I'D LIKE TO SEE IN THE COMMENTS

> Are you also going through something similar? Feel free to share your experience.

> Do you have recommendations of your own? Please add them.

> Are you someone in charge of hiring at an EA org? I'd love to hear general advice on what you see most applicants getting wrong, or how most could improve. I'd also love to see some discussion on why some of these recommendations may be infeasible. 

Final note: I decided to write this while still waiting for other applications to get back to me. That's because I thought that if I got a job I might lose the motivation to write this, and it seemed valuable. Just to be extra careful on the anonymity front, if I get one of these jobs, I will not mention it here.

COMMENTS

> Sometimes EA Orgs will say something like “we have no degree requirements”... but in reality will mostly hire people with PhDs. I appreciate your open-mindedness regarding degree and experience requirements, but saying something like "we expect most successful applicants to have X"... helps applicants assess whether the opportunity is worth spending time on.

I would advise against this specific policy, since I think it risks being misleading. My impression is that many EA employers genuinely assign ~0 weight to an applicant having a PhD, but nevertheless find themselves hiring disproportionately PhDs, simply because the pool of people who have the traits they are interested in (e.g. interested in autonomously completing very high-level research) disproportionately choose to complete PhDs.[1] If an org simply reports ">50% of successful applicants have PhDs", this will likely mislead applicants into thinking that having a PhD is important to success, even though it's assigned no weight (one could imagine similar dynamics by reporting other characteristics which play no role in selection, but which are heavily over-represented in successful applicants). 

To be clear, if employers actually do assign significant weight to a characteristic, but are open to considering exceptions, then I think it's good to be transparent about both sides of that.

  1. ^

     The EA community itself is very disproportionately skewed towards people with graduate degrees, which probably contributes to this. Last time we checked, in 2019, >45% of respondents had a graduate degree, and I would expect this number to have only increased since then, since many of the remainder were students, who will themselves likely go on to complete further degrees at quite high rates.

I think employers could just disclose that: "We do not assign any weight to having a PhD, but in previous application rounds, >50% of successful applicants had a PhD".

Right -- there's still a correlation between the legible external factor and outcome, even if there is no causal relationship.

Hypothetical example: Prestigious University does not consider test scores in determining admissions at all. However, test scores happen to be strongly correlated to academic ability, and it so happens that most admitted students have scores in the 99th percentile. This would still be useful information for someone with a test score in the 80th percentile even though there is zero direct causal relationship between test scores and admission.

Whether this information is good to include depends crucially on the causal relationships though. 

In the simple test-score case, academic ability causes both test scores and admission success, test scores serve as a strong proxy for academic ability, and we assume no other causal relationships complicate matters. Here test scores are a useful proxy for academic ability and a relatively innocent indicator of the likelihood of admission (i.e., they are a pretty good signal of whether one is likely to succeed).

But telling people about something that is strongly associated with success, yet not connected in the right way to the factors that determine success, would be misleading.

In a more complex (but perhaps more realistic) case, where completing a PhD is causally related to a bunch of other factors, saying that most successful applicants have PhDs risks misleading applicants both about their chances of success / suitability for the role and about the practical utility of getting a PhD.

David, I think that you've hit the nail on the head. I imagine how I would react if a job posting said "the majority of successful candidates grew up in families with either wealth or income in the top quartile of their home country": even though that is predictive of success (and might be a useful data point for estimating how likely my own application would be to succeed), I wouldn't want to see it as a candidate. We could substitute in something less controversial, such as height, and I think my preference not to see it would remain the same.

Right, but there is definitely a way you can communicate this information without being misleading. You could say, "in previous rounds, >50% of successful applicants had a PhD, but we do not assign weight to PhDs and do not believe there is a direct causal relationship between having a PhD and receiving an offer".

I broadly agree that such a statement, taken completely literally, would not be misleading in itself. But it raises the questions:

  •  What useful information is being conveyed by such a statement?
    • If interpreted correctly, I think the applicant shouldn't take anything actionable away from the statement. But what I suspect many will do is conclude that they probably shouldn't apply if they don't have a PhD, even if they meet all the requirements.
  • What is pragmatically implied by such a statement?
    • People don't typically go out of their way to state things which they don't think are relevant (without some reason). So if employers go out of their way to state ">50% of successful applicants had a PhD...", even with the caveat, people are reasonably going to wonder "Why are they telling me this?" and a natural interpretation is "They want to communicate that if I don't have a PhD, I'm probably not suited to the role, even if I meet all the requirements", which is exactly what employers don't want to communicate (and is not true) in the cases I'm describing.[1] 
  1. ^

    I think there are roles where unless you have a PhD, you are unlikely to meet the requirements of the role. In such cases, communicating that would be useful. But the cases I'm describing are not like that: in these cases, PhDs are really not relevant to the roles, but applicants will have very commonly undertaken PhDs. I imagine that part of the motivation for wanting to see the information is that people think things are really like the former case, not the latter.

Even if your current best guess is that it's not causal, if having a PhD meaningfully increases your chances of getting hired conditional on having applied, that information would help candidates get a better sense of their probability of getting hired.

[edited to specify that I meant conditional on applying]

A relevant reframing here is whether having a PhD provides a high Bayes factor update on being hired. E.g., if people with and without PhDs each have a 2% chance of being hired, but ">50% of successful applicants had a PhD" because most applicants have a PhD, then you should probably not include this; but if 1 in 50 applicants are hired overall, rising to 1 in 10 for those with a PhD and falling to 1 in 100 for those without, then the PhD is a massive evidential update even if there is no causal effect.
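To make this concrete, here is a minimal sketch in Python using only the hypothetical numbers from the paragraph above (not data from any real hiring round), showing how large the evidential update is even with zero causal effect:

```python
# Hypothetical rates from the comment above (not real hiring data):
# overall hire rate 1/50; 1/10 with a PhD; 1/100 without.
p_hired = 1 / 50
p_hired_phd = 1 / 10
p_hired_no_phd = 1 / 100

# Share of applicants with a PhD implied by these rates, solving
# p_hired = f * p_hired_phd + (1 - f) * p_hired_no_phd for f:
f_phd = (p_hired - p_hired_no_phd) / (p_hired_phd - p_hired_no_phd)

# Bayes' rule gives the PhD share among hired and non-hired applicants:
p_phd_given_hired = f_phd * p_hired_phd / p_hired
p_phd_given_not_hired = f_phd * (1 - p_hired_phd) / (1 - p_hired)

print(f"Applicants with a PhD: {f_phd:.1%}")          # ~11.1%
print(f"Hires with a PhD: {p_phd_given_hired:.1%}")   # ~55.6%, the '>50%' headline
print(f"Bayes factor: {p_phd_given_hired / p_phd_given_not_hired:.2f}")  # ~5.44
```

So the same ">50% of successful applicants had a PhD" headline corresponds to a roughly 5x evidential update in the second scenario, whereas in the first scenario (2% hire rate with or without a PhD) the Bayes factor is 1 and the headline conveys nothing about an individual's chances.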

To further elaborate on what I think might be a crux here: 

I think that where the job requirements are clearly specified, predictive proxies like having a PhD may have no additional predictive power above what is transparent in the job requirements and transparent to the applicants themselves in terms of whether they have them or not.

For example:

  • Knowing programming language X may be necessary for a job and may be most common among people who studied computer science. But if 'knowing X' is listed in the job ad and the applicant knows they know X, then knowing they have a computer science degree and knowing the % successful applicants with such a degree adds no additional predictive power.
  • Having a degree in Y may be necessary for a job and because more men than women have degrees in Y, being a man may thereby be predictive of success. But if you are a woman and know you have a degree in Y, then you don't gain any additional predictive power from knowing the % successful female applicants.

My supposition is that possession of a PhD is mostly just a case like the above for many EA roles (though I'm sure it varies by org, role and proxy). But I imagine those who want the information about PhDs to be revealed think they are likely to be proxies for latent qualities of the applicant, which the applicants themselves don't know and which aren't transparent in the job ad.
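A small simulation may make the distinction vivid. This is a hypothetical sketch (the numbers and the `meets_req` variable, standing in for a transparent stated requirement like passing a coding test, are invented for illustration):

```python
import random

random.seed(0)

# Hypothetical setup: hiring depends only on a transparent, stated
# requirement; PhDs are merely over-represented among those who meet it.
applicants = []
for _ in range(200_000):
    meets_req = random.random() < 0.3            # 30% meet the stated requirement
    has_phd = random.random() < (0.5 if meets_req else 0.1)
    hired = meets_req and random.random() < 0.2  # selection ignores the PhD entirely
    applicants.append((meets_req, has_phd, hired))

def hire_rate(rows):
    rows = list(rows)
    return sum(hired for _, _, hired in rows) / len(rows)

# Marginally, the PhD looks highly predictive...
print(hire_rate(a for a in applicants if a[1]))      # ~13.6% hired with a PhD
print(hire_rate(a for a in applicants if not a[1]))  # ~3.8% hired without one
# ...but conditional on meeting the stated requirement, the gap vanishes:
print(hire_rate(a for a in applicants if a[0] and a[1]))      # ~20%
print(hire_rate(a for a in applicants if a[0] and not a[1]))  # ~20%
```

In this toy world, an applicant who already knows they meet the stated requirement learns nothing further from the marginal PhD statistic, which is why publishing it can mislead exactly those applicants.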

I think this is one piece of information you would need to include to stop such a statement being misleading, but as I argue here, there are potentially lots of other pieces of information which would need to be included to make it non-misleading (i.e. information about any and all other confounders which explain the association).

Otherwise, applicants will not know that, conditional on X, they are no less likely to be successful if they do not have a PhD (even though disproportionately many people with X have a PhD).

Edit: TLDR, if you do not also condition on satisfying the role requirements, but only on applying, then this information will still be misleading (e.g. causing people who meet the requirements but lack the confounded proxy to underestimate their chances).

Exactly

As I suggested in my first comment, you could do the same "by reporting other characteristics which play no role in selection, but which are heavily over-represented in successful applicants": for example, you could report that >50% of successful applicants are male,[1] white, live in certain countries, >90% have liberal political beliefs, and probably a very disproportionately large number have read Harry Potter fan fic.[2] Presumably one could identify other traits which are associated with success via their association with these other traits e.g. if most successful applicants have PhDs and PhDs disproportionately tend to [drink red wine, ski etc.], then successful applicants may also disproportionately have these traits.

Of course, different people can disagree about whether or not each of these are causal. But even if they are predictive, I imagine that we would agree that at least one of these would likely mislead people. For example, having read Harry Potter fan fic is associated with being involved with communities interested in EA-related jobs for largely arbitrary historical reasons.[3] 

This concern is particularly acute when we take into account the pragmatics of employers highlighting some specific fact.[4] People typically don't offer irrelevant information for no reason. So if orgs go out of their way to say ">50% of successful applicants have PhDs", even with the caveat about this not being causal, applicants will still reasonably wonder "Why are they telling me this?" and many will reasonably infer "What they want to convey is that this is a very competitive position and I should not apply."

As I mentioned in the footnote of my comment above, there are jobs where this would be a reasonable inference. But I think most EA jobs are not like this.

If one wanted to provide applicants with full, non-misleading information, I think you would need to distinguish which of the cases applies, and provide a full account of the association which explains why successful applicants might often have PhDs, but that this is not the case when you control for x, y, z. That way (in theory), applicants would be able to know that conditional on them being a person who meets the requirements specified in the application (e.g. they can complete the coding test task), the fact that they don't have a PhD does or does not imply anything about their chances of success. But I think that in practice, providing such an account for any given trait is either very difficult or impossible.[5] 

  1. ^

    Though in EA Survey data, there is no significant gender difference in likelihood of having an EA job. In fact, a slightly larger proportion of women tend to have EA jobs.

  2. ^

    None of these reflect real numbers from any actual hiring rounds, though they do reflect general disparities observed in the wider community.

  3. ^

    Of course, you could describe a situation where having read Harry Potter fan fic actually serves as a useful indicator of some relevant trait like involvement in the EA community. But, again, I'm not referring to cases like this. Even in cases where involvement in the EA community is of no relevance to the role at all (e.g. all you need to do to be hired is to perform some technical, testable skill, like coding very well), applicants are likely to be disproportionately interested in EA, and successful applicants may be yet further disproportionately interested in EA, even if it has nothing to do with selection.

    This can happen if, for example, 50% of the applications are basically spam (e.g. from a large job site, from people who have barely read the job advert, don't have any relevant skills, and are applying to everything they can click on). In such cases, the subset of applicants who are actually vaguely relevant will be disproportionately people with an interest in EA, people with degrees, etc.

  4. ^

    In some countries there may be a norm of releasing information about certain characteristics, in which case this consideration doesn't apply for those characteristics, but would for others.

  5. ^

    And that is not taking into account the important question of whether all applicants would actually update on such information completely rationally, or whether many would be irrationally inclined to be negative about their chances, and just conclude that they aren't good enough to apply if they don't have a PhD from a fancy institution.

You all raise valid points. I agree that if they assign a low value to a PhD they shouldn't disclose that information. That said, people with PhDs also had the opportunity to show that they can do independent research well, and if you care about people having years of independent research under their belt, then you would expect to hire mostly PhDs.

Still, it doesn't have to be a PhD, but orgs could say that they expect the most competitive applicants to... "have shown the ability to do independent research (>1 year)" or "successfully manage a team (>6 months)" or whatever. 

Whatever markers you're actually looking for (not the ones that merely correlate with them): I'd love to know more!

Your first job out of college is the hardest to get. Later on you'll be able to apply for jobs while working, which is less stressful, and you'll have a portfolio of successful projects you can point to. So hopefully it's some small comfort that applying for jobs will probably never suck as much as it does for you right now. I know how hard it can be though, and I'm sorry. A few years ago after graduating from my Master's, I submitted almost 30 applications before getting an offer and accepting one.

I do notice that the things you're applying to all seem very competitive. Since they're attractive positions at prestigious orgs, the applicant pool is probably unbelievably strong. When there are hundreds of very strong applicants applying for a handful of places, many good candidates simply have to get rejected. Hopefully that's some more small comfort. 

It may also be worth suggesting, though, for anyone in a similar position who may be reading this, that it's also fine to look for less competitive opportunities (particularly early on in your career). Our lives will be very long and adventurous (hopefully), and you may find it easier to get jobs at the MITs and Horizons and GovAIs of the world after getting some experience at organisations which may seem somewhat less prestigious.

To speak on my own experience, among those ~30 places that rejected me were some of the same orgs you mention (e.g. GovAI, OpenPhil, etc.). The offer I ended up accepting was from Founders Pledge. I was proud to get that offer and the FP research team there was and is very strong, but I do think it's probably the case that it was a somewhat less competitive application process. But ultimately I loved working at FP. I got to do some cool and rigorous research, and I've had very interesting work opportunities since. It's probably even the case that, at that point in my career, FP was a better place for me to end up than some of the other places I applied.

Thank you very much Stephen, this was a nice comment to receive, and it does provide some much-needed reassurance and good advice. I'm going to widen my search now.

I also hope my post provided some reassurance to others in my situation.

Having been rejected by a few orgs, most recently by Giving What We Can, I'd say their feedback process was a country mile ahead of any other org I've applied to, and other organisations should look to them as the gold standard for such a process. I hope they'll write it up on the forum soon.

Thanks for the kind feedback about our hiring process! I'll encourage the team to write up how we approached hiring for some roles where we think we ran a good process!

[Edit: Actually Michael Townsend wrote this in the past about our hiring process, which is worth reading]

Thanks for pointing this out! It'd be great for EA orgs to converge on some best practices (to some degree, I think they already do).

If you haven't, you should talk to the 80k advising team with regard to feedback. We obviously aren't the hiring orgs ourselves, but I think we have reasonable priors on how they read certain experiences and proposals. We've also been through a bunch of EA hiring rounds ourselves and spoken to many, many people on both sides of them. 

Thank you! I'll try this.

Hey, Zack from XLab here. I'd be happy to provide a couple of sentences of feedback on your application if you send me an email.

The most common reasons for rejection before an interview were things like no indication of having US citizenship or a student visa, ChatGPT-seeming responses, responses to the exercise that didn't clearly and compellingly indicate how it was relevant to global catastrophic risk mitigation, or lack of clarity on how mission-aligned the applicant was.

We appreciate the feedback, though.      

This is even more puzzling to me now, because I think I clearly satisfied all of these (I looked back over my responses, which I saved in a GDoc).

But thank you for the offer! If I get over my strong desire for anonymity I'll be sure to reach out.

Edit: I say "clearly" not to add emphasis to my response (I didn't mean for it to sound contrarian), but because these particular criteria seem easy to judge: they're mostly not "how good are you at X" but rather "have you done X."

I've offered to review resumes previously, and if you'd like to send me a message I'd be happy to offer feedback. Heck, if you want to schedule a call to do a mock interview, or very informal advice/coaching/discussion relating to career and job applications (or even just to vent and express your feelings) I'd be happy to lend a hand. I can't offer any specific insider information for hiring rounds, but I can share plenty of broad/general information about how hiring rounds function and different factors or scenarios that might cause your application to be rejected.

Thank you so much Joseph! I really appreciate that.

I'll think about it, and if I get over my strong desire for anonymity I'll reach out.

> Increase the required time-commitment as the application progresses.

I am a big fan of this concept. I'd estimate that between 25% and 50% of the hiring rounds from EA organizations that I've been involved in (as a candidate) have requested that I spend 2-3 hours of time as the second stage of the hiring round (the first stage being submitting an application). That strikes me as too big an ask that early in the process. I'd suggest proportional reciprocity (a concept I made up, which maybe someday I'll share a post on): a gradual escalation of commitment, rather than a sudden/steep one.

I have several different Google Docs with rough drafts of posts relating to hiring rounds that maybe I'll someday get around to finishing: actionable feedback, inviting candidates to apply, avoiding bad questions, validity, good and bad phrasing for rejecting candidates, letting people know what criteria they will be evaluated on, etc. None of them are fully fleshed out, but if anyone wants to see and comment on the rough drafts and the brainstorm versions, please let me know.

I'm happy to chat informally with anyone planning or designing a hiring round that wants to get input, ask for suggestions, or generally bounce around ideas.

Personally I think this is much less a concern if a high time commitment involves a decently paid work trial. Since the initial application is never trivial, it could actually increase the expected value of applying if the next stage is (e.g.) a 3-day trial. 

I just don't have the time. I'm often at absolute capacity with college assignments, clubs, work, etc.

Especially if many orgs decided to have longer work trials, I might be unable to apply to many or may end up submitting subpar work tests because of this. 

Also, I'll point out that EA orgs oftentimes have *initial applications* that take 2-4 hours. This is clearly too much. I think a quick first round that verifies the following would be best:

  • The applicant is legally able to work for us.
  • The applicant satisfies our minimum experience/knowledge cutoff (looking at the resume and asking one question about their degree of engagement with, e.g., AI safety).
  • The applicant seems value-aligned or understands our mission (one question about why they want to apply or work in X area).

Longer questions about career plans, work-trial-y questions, reasoning and IQ-test-y questions, research proposals, and everything else belong in later stages, when you've already filtered out the people who really did not belong in the application process.

This is an aspect that I don't think of as often, but I do think it is very important. Some people have several hours free and can set aside three uninterrupted hours to focus on a single task. But not everyone can. I'm especially thinking of people who have children and work commitments. So in a sense it is unintentionally exclusionary.

Probably every hiring round is unintentionally exclusionary to some extent, but I think that requiring candidates to spend three hours of uninterrupted time is a type of unintentional exclusion that can be relatively easily avoided. It filters out candidates based on something that is unrelated to how well or poorly they would perform on the job.

At a minimum, candidates should be invited to seek a waiver of any "complete in one sitting" requirement on an early-round work task for good cause, without any adverse consequences whether the waiver is granted or not. Speaking as an employed individual with a preschooler, three hours of uninterrupted time is a big ask for an early-round job application process!

I find it’s very rare to have to do the work test in one sitting, and I at least usually do better if I can split it up a bit.

From my own experience as an applicant for EA organizations, I'd estimate that maybe 50% to 60% of the work sample tests or tasks that I've been assigned have either requested or required that I complete them in one sitting.

And I do think that there is a lot of benefit in limiting the time candidates can spend on it, otherwise we might end up assessing Candidate A's ten hours of work and Candidate B's three hours of work. We want to make sure it is a fair evaluation of what each of them can do when we control for as many variables as possible.

The work tests that don’t require a single sitting still have a max number of hours.

It sounds like you would benefit from greater prioritisation and focus (e.g., see https://calnewport.com/dangerous-ideas-college-extracurriculars-are-meaningless/).

Thank you for your advice! I will say that my part-time job was research, which is crucial if I want to get research positions or into PhD programs in the near future. The clubs I lead are also very relevant to the jobs I'm applying to, and I think they may be quite impactful (so I'm willing to do them even if they harm my own odds). 

Regardless of my specific situation, I think EA orgs should conduct hiring under the assumption that a significant portion of their applicants don't have the time for multiple multi-hour work tests in early stages of the application process (where most will be weeded out).

I wanted to quickly say thank you for your suggestion that orgs increase information sharing transparency. I haven't seen this discussed elsewhere, and I think it's a good idea! It's now on our list of things to consider for future hiring rounds at 80k.

-- Sash, Head of Recruiting at 80k

I'm very glad to hear this -- thank you Sash!

Have you applied for 80k career advice?

Yes, I've had two calls with them. Maybe it wasn't very clear from my background but I've been pretty deeply involved with EA for about 2 years (also went to multiple EAGs). 

How do you think 80k career advice would help in my situation?

Mainly advice on intermediate steps to get more domain-relevant experience.

One thing I have heard is that having long-ish application stages provides value by getting more people to think about relevant topics (I think I've heard this from at least two orgs). E.g., having several hundred people spend an hour writing a paragraph about an AI safety topic might be valuable simply by virtue of getting more people to think about it. I haven't seen a write-up weighing the pros and cons of this, though. I agree this can be bad for applicants.

I hadn't thought of this before, and it does make me reconsider the value of these types of questions early on -- even if it burdens applicants.

At least where the acceptance rate is 3-5 percent, it seems plausible that there could be something like the "AI Safety Common Pre-Application" that would reduce the time burden for many applicants. In many cases it would seem possible to say, on information not customized to a specific program, that an applicant just isn't going to make that top 3-5%.

(Applicants meeting specified criteria would presumably be invited to skip the pre-app stage, eliminating the risk of those applicants being erroneously screened out on common information.)

By analogy: In some courts, you have to seek permission from the court of appeals prior to appealing. The bar for being allowed is much lower than for succeeding, which means that denials at permission stage save disappointed litigants the resources they'd otherwise use to prepare full appeals.

This is something that BERI has actually been discussing! If anyone is interested in talking to us more about this (either from a hiring/evaluation perspective or from a candidate perspective), you can send an email to contact@existence.org and I'll follow up.

I would be excited about a common application. My sense is that the only reason it doesn't exist is that no one has put the time in to create it; when I've talked to hiring managers, most were in favor of the project (though there are some concerns, e.g. the fact that applications are currently a costly signal is helpful for identifying the applicants who actually really want to apply).

Paul Graham had a nice take on this:

> The more arbitrary college admissions criteria become, the more the students at elite universities will simply be those who were most determined to get in.

I think "actually really want to apply" is not enough of a correlation to base decisions on. The fact is that even qualified+motivated applicants would need to apply to a dozen+ places, and often EA application questions require a lot of thought anyway.

To give an example, lots of EAs are from top unis, and I'm pretty sure the meta-strategy for applying to selective unis is to not fall in love with any particular one and shotgun a lot of applications. This reduces the role of personal fit.

One thing I always thought was interesting would be to have an actual application limit like UK colleges do. I applied to 20 colleges, and I would've been fine applying to 5 with improved odds.

Are there antitrust concerns with multiple orgs (even if nonprofit) using a common screener? 

Many elite US universities do this -- the Common Application to which my comment indirectly alludes -- and law schools did something vaguely similar at least in the mid-2000s (showing my age here). So I am expecting the answer is negative.

Common evaluation would be trickier -- e.g., I vaguely remember some universities getting into trouble with allegations that they were divvying up choice applicants rather than competing for them. [Edit: It may have been this -- they were apparently colluding about financial aid offers, and reached a settlement with DOJ Antitrust to stop doing this.]

> I vaguely remember some universities getting into trouble with allegations that they were divvying up choice applicants rather than competing for them.

This is explicitly the policy in the UK, and (I would guess) almost entirely eliminates offer acceptance uncertainty for Oxford and the other place:

> ... you can't apply to Oxford and Cambridge in the same year.

Hm, I don't obviously see the analogy with the common app: hiring employees and admitting students seem quite different.

(Disclaimer: I'm not an antitrust lawyer, not that I can give anyone legal advice on the Forum anyway. Also, this is a US perspective.) 

The basic principle is that agreements "in restraint of trade" are illegal, with that term interpreted by reference to a "rule of reason" developed through over a century of caselaw. Neither student admissions nor employee hiring are really in the heartland of antitrust, although it has been applied to both in the past. 

I don't see how admissions and hiring are that different when it comes to determining whether use of a common application form would be in restraint of trade (i.e., whether it unreasonably impedes fair competition). I'm also unclear on what a good argument would be for an assertion that using the same first-stage application would unreasonably impede fair competition for employees in the first place. I'd argue that it would promote competition in the market for employees, by making it easier for employees to apply to more potential employers. But I didn't dig into any caselaw on that.

This would definitely reduce the time cost. 

I'd also worry, though, about the application having only certain kinds of questions which do not bring out the best in (some/many) people. I've definitely seen some applications where I thought I wasn't given the chance to show my worth, and others where I was. This app would have to be drafted with a lot of care. 

Also thank you @elizabethcooper for taking initiative on this!

Likely a reason this hasn't been done yet (and why we aren't saying 100% we'll do this): it's very difficult to create a "good" application and process, let alone one that multiple orgs would all find valuable and be able to fit into their workflows. I'm happy to listen to any feedback/input you may have on applications that you thought allowed you to present yourself best.

Yes, I think we would benefit from having organizations run the pre-app and standard app in parallel for one cycle (while compensating applicants for the additional work on the margin!). We'd be looking for a pre-app "score" threshold for each organization at which very few applicants whose applications would have survived the first round of the old process are eliminated by the pre-app, and/or ~no one who was ultimately accepted is screened out.

Thanks for sharing, Ávila! You may be interested in the post Rejection thread: stories and tips.
