I'm on an (unintended) Gap Year at the moment and will study maths at university next year. Right now I'm exploring cause prioritisation.
Previously I focused on nuclear war, but I no longer think it's worth my time: it seems highly intractable, and the extinction risk it poses is very low. I've also explored AI safety (taking the AI Safety Fundamentals course), but my coding isn't up to scratch yet.
The main thing I'm focusing on right now is cause prioritisation — I'm still quite sceptical of the case for working on extinction risks.
I'm looking for opportunities to gain career capital this summer, particularly in EA-related orgs. I'm open to many things, so if you think I might be a good fit, feel free to reach out!
If you'd like advice on Non-Trivial, or are interested in talking about cause prioritisation, send me a message!