
TL;DR: Applications are open for Catalyze's upcoming incubation program for technical AI safety research organizations. To join the program, apply by 8 September (extended deadline).

Apply now

What will this program involve?

The main components of the program are:

  • Finding a co-founder: Thoroughly test your co-founding fit with others who complement your skills and share your values.
  • Mentors & advisors: Access support from the Catalyze Impact team, as well as external industry experts and experienced entrepreneurs.
  • Funding opportunities: Connect with our non-profit seed funding circle and potential investors, while being supported by a requestable stipend.
  • Network & community: Immerse yourself in a group of fellow AI safety founders and grow your network in the London AI safety ecosystem.
  • Building up your organization: Work on the priorities of your new organization, drawing on external support whenever you want to.

Phase 1 of the program (Nov/Dec, online) is focused on testing your co-founding fit with other participants, primarily through collaboration projects. Through these projects, you get a sense of how well you work together while further developing the research organization proposals.

Towards the end of Phase 1, you will evaluate whether you want to commit to moving to Phase 2 with your prospective co-founder, and we will assess whether you and your co-founder are a good fit for Phase 2.

Throughout Phase 2 (January, London), you and your co-founder(s) will work together in-person to focus on the very early stages of building your organization. While you focus on taking the next steps in building your organization, preparing to fundraise, and further stress-testing your co-founding fit, we provide various forms of support. This includes: office hours with a network of experienced mentors and advisors, a requestable stipend, networking opportunities, and seed funding opportunities.

Who is this program for?

You would be a great fit for the program if:

  1. You are highly committed, a self-starter, and have a lot of grit.
  2. You are motivated to contribute to AI Safety: You aspire to create meaningful, positive impact in the field and believe that it is a priority to prevent severely negative outcomes from AI.
  3. You have a scout mindset: You are open to alternative views, have an inclination to explore new information and arguments, and consider these critically to alter your course when new information is available.
  4. You either have a preliminary plan or agenda for an AI Safety research organization, or are willing to collaborate with someone who does. You will be able to develop this further during the program.

Additionally, you ideally fit one of the following three profiles:

  • Technical Profile: You have technical research or engineering experience, combined with a good understanding of AI Safety (e.g. you have worked as a researcher within AI Safety or a closely related field, completed MATS, and/or conducted several research projects within AI).
  • Operational/Entrepreneurial Generalist: You have an operational and entrepreneurial skill set (e.g. finance, legal, HR, contracts, systems and processes) or have founded organizations in the past, and you have a reasonable understanding of AI Safety, equivalent to completing BlueDot’s AI Safety Fundamentals Alignment or Governance track.
  • All-Rounder: You have a mix of the technical and operational skills in the other profiles and at least a reasonable understanding of AI Safety.

During the program we will facilitate creating founding teams that possess strong entrepreneurial know-how and technical knowledge. If you are not sure whether you meet all of these characteristics, please apply anyway. The application process is designed to help both you and us gain more clarity on your fit for co-founding an AI safety research organization.

More information

For more information on the program and an FAQ, please visit our website.
