
Our incubation program brings together founders, advisors, and funders to advance the AI safety field together. The program is open to incubating both non-profit and for-profit companies.
The information below is about our last program. We are currently not accepting applications, but we encourage you to express your interest in taking part in a future iteration of the program.
Join the next program
We are at a critical juncture in AI development. A recent survey of 2,778 AI researchers revealed that nearly half estimate at least a 10% chance of advanced AI leading to catastrophic outcomes, including human extinction [1]. Despite this, only 1-3% of AI publications focus on safety [2]. The world urgently needs more dedicated AI safety research to address these existential risks and shape the future of AI responsibly.
Our program is your starting point to make a positive impact in the AI Safety field. We facilitate:

Whether you're starting from scratch or already have a co-founder and initial concept, our program is designed to help you quickly move forward. We focus on equipping you with the skills, funding and network to truly make a difference in AI safety.
We are a non-profit focused on advancing the AI Safety field, and our mission is to help you succeed by incubating and accelerating your idea. Catalyze will not take a stake in any incubated company, whether for-profit or non-profit. Your organization will be entirely under your control.
You will receive tailored support from the Catalyze Impact team and from external experts, such as:

Founded several companies and led teams of 5 to 8,200 people in managerial and C-level roles before shifting his career to focus on AI safety full-time

COO of Apollo Research, an AI evals and interpretability research organization, and has launched and led several social impact businesses
This program is a pilot and consists of two phases. If you already have a co-founder and organization idea, you can join us for Phase 2 without joining Phase 1. If so, you will have the opportunity to indicate this in your application.
Location: Online.
Time commitment: 12-20 hours a week for 5 weeks, between 4 November and 8 December.
During the initial online program, you will be part of a cohort of around 15 individuals committed to improving AI Safety, which will help you expand your network of AI Safety professionals.
Participants will:
Participants have a strong say in who they test their fit with and will decide themselves who they want to co-found with.
Towards the end of Phase 1 you will evaluate whether you want to commit to moving to Phase 2 with the co-founder you found, and we will assess whether you and your co-founder are a good fit for Phase 2. Advancing to Phase 2 is primarily based on forming a promising founding team during Phase 1. While having a well-developed organization concept is beneficial, we value teams that are open to refining or pivoting their ideas.
After Phase 1, there will be a four-week break during which you will prepare for Phase 2 by further developing your organization plan with your identified co-founder and arranging to join us full-time in-person in London for Phase 2.
Location: London.
Time commitment: full-time for 4 weeks between 6 January and 2 February. For exceptional candidates we may allow part-time participation instead.
You and your co-founder(s) will work together in-person to focus on the very early stages of building your organization. While you focus on taking the next steps in building your organization, preparing to fundraise, and further stress-testing your co-founding fit, we provide various forms of support, such as:
After the program, we will provide continued individualized support for our graduates and continued access to the Catalyze community.
You are highly-committed, a self-starter, and have a lot of grit. Starting an organization is difficult, but it can have an incredible impact.
You aspire to create meaningful, positive impact in the field and believe that it is a priority to prevent severely negative outcomes from AI.
You are open to alternative views, inclined to explore new information and arguments, and willing to consider them critically and change course when new information warrants it.
You have an idea for an AI Safety organization, or are willing to collaborate with someone who does. You will be able to develop this further during the program.
The application process consists of four steps. The process is designed to help you figure out if you would be a good fit for starting a new organization, so do not hesitate to apply if you are unsure.
We are no longer accepting applications for this round of the program. If you are interested in taking part in a future iteration, please express your interest below or sign up for our newsletter.
While we assume a basic understanding of AI Safety, equivalent to completing an introductory course such as BlueDot’s AI Safety Fundamentals, we encourage you to apply if you are passionate and willing to learn quickly. You can access AI Safety Fundamentals course materials and work through them in your own time. Additionally, you can find more resources to get started with AI Safety here and here.
Please check the “Who Are We Looking For” section for more details.