Hi! Thanks for checking out my EA Forum bio! My name is Coleman; I am the President of Cornell Effective Altruism, a Global Catastrophic Risk researcher, and a science communicator with a focus on AI, emerging technology policy, societal health/resilience, ethics, and long-term flourishing. In addition to my passion for community building and speaking with various folks (both seasoned veterans and people completely new to EA/GCR), I am interested in improving clear thinking about this century's greatest challenges, in part through expanding upon the systemic risk approach.
My academic background is in Cognitive Science, Ethics, and Politics. My favorite thinkers from history are Aristotle and Hegel, and I especially admire the work of Derek Parfit, Viktor Frankl, and the recent empirical studies out of Cornell's Purpose and Identity Lab.
Current projects (updated semi-regularly):
21st Talks, a growing existential risk and EA-focused media organization with a podcast and YouTube channel, is looking for a content admin/research assistant.
The role is not currently fully funded but the content admin will receive royalties.
The responsibilities include taking the reins on guest strategy: identifying potential podcast guests and setting up interviews by contacting guests directly or through their PR teams.
We are looking for someone who can meet once a week to go over upcoming content and plan monthly content calendars, and who is proficient in working on public communication-focused research projects.
The role will be 5 hours a week and should be a lot of fun. Current university students looking to get involved in EA communication seem like great fits. Bonus points if you're in or near Ithaca, NY, where our current production team (Coleman Snell, Justin Dmelio) attends university.
More details are available to anyone interested. You can shoot me an email at cjs386@cornell.edu, using the subject line "21st Talks Content Admin Inquiry"!
Have a great rest of your week!
In line with Khorton's answer: life coaching.
Especially from someone within Effective Altruism who understands one's long-term goals, having an individual to sit down with each week and talk through any and all issues is well worth the pay. When coupled with journaling, the issues that would otherwise impact one's EA work can be greatly reduced.
Even issues that seem small or insignificant on the surface, if recurring, can be extremely taxing on one's sustainable EA motivation. Life coaching accessed within EA needs to extend far beyond just the direct aspects of one's career.
If you're interested in life coaching on better achieving your goals, romance, sustained motivation, etc., feel free to DM me. I'd be happy to chat about options beyond just therapy!
Hi, I just wanted to say this is a great post! My background is in psychology and AI, so I am particularly excited about this piece. I would love to talk more about some of your points, especially the key questions worth investigating that may inform AI governance strategy (my current focus!).