Open Philanthropy donations made (filtered to cause areas matching Global catastrophic risks)

This is an online portal with information on donations that were announced publicly (or shared with permission) and that were of interest to Vipul Naik. The git repository with the code for this portal, as well as all the underlying data, is available on GitHub. All payment amounts are in current United States dollars (USD). The repository of donations is being seeded with an initial collation by Issa Rice, along with continued contributions from him (see his commits and the contract work page listing all financially compensated contributions to the site), but all responsibility for errors and inaccuracies belongs to Vipul Naik. Current data is preliminary and has not been completely vetted and normalized; if sharing a link to this site or any page on it, please include the caveat that the data is preliminary (if you want to share without caveats, please check with Vipul Naik). We expect to complete the first round of development by the end of July 2025. See the about page for more details. Also of interest: pageview data on analytics.vipulnaik.com, the tutorial in the README, and the request for feedback on the EA Forum.

Table of contents

Basic donor information
Donor donation statistics
Donation amounts by cause area and year
Donation amounts by subcause area and year
Donation amounts by donee and year
Donation amounts by influencer and year
Donation amounts by disclosures and year
Donation amounts by country and year
Full list of documents in reverse chronological order
Full list of donations in reverse chronological order
Similarity to other donors

Basic donor information

Item | Value
Country | United States
Affiliated organizations (current or former; restricted to potential donees or others relevant to donation decisions) | GiveWell, Good Ventures
Best overview URL | https://causeprioritization.org/Open%20Philanthropy%20Project
Facebook username | openphilanthropy
Website | https://www.openphilanthropy.org/
Donations URL | https://www.openphilanthropy.org/giving/grants
Twitter username | open_phil
PredictionBook username | OpenPhilUnofficial
Page on philosophy informing donations | https://www.openphilanthropy.org/about/vision-and-values
Grant application process page | https://www.openphilanthropy.org/giving/guide-for-grant-seekers
Regularity with which donor updates donations data | continuous updates
Regularity with which Donations List Website updates donations data (after donor update) | continuous updates
Lag with which donor updates donations data | months
Lag with which Donations List Website updates donations data (after donor update) | days
Data entry method on Donations List Website | Manual (no scripts used)
Org Watch page | https://orgwatch.issarice.com/?organization=Open+Philanthropy

Brief history: Open Philanthropy (Open Phil for short) grew out of GiveWell, starting as GiveWell Labs in 2011, beginning to make strong progress in 2013, adopting the name "Open Philanthropy Project" in 2014, and formally separating from GiveWell in June 2017. In 2020, it started going by "Open Philanthropy", dropping "Project" from its name.

Brief notes on broad donor philosophy and major focus areas: Open Philanthropy is focused on openness in two ways: being open to ideas about cause selection, and being open in explaining what it is doing. It has endorsed "hits-based giving" and works on AI risk, biosecurity and pandemic preparedness, other global catastrophic risks, criminal justice reform (United States), animal welfare, and some other areas.

Notes on grant decision logistics: See https://www.openphilanthropy.org/blog/our-grantmaking-so-far-approach-and-process for the general grantmaking process and https://www.openphilanthropy.org/blog/questions-we-ask-ourselves-making-grant for more questions that grant investigators are encouraged to consider. Every grant has a grant investigator, whom we call the influencer here on Donations List Website; for focus areas that have Program Officers, the grant investigator is usually the Program Officer. The grant investigator has been included in grants published since around July 2017. Grants usually need approval from an executive; however, some grant investigators have leeway to make "discretionary grants" where the approval process is short-circuited; see https://www.openphilanthropy.org/giving/grants/discretionary-grants for more. Note that the term "discretionary grant" means something different for Open Phil than it does for government agencies; see https://www.facebook.com/vipulnaik.r/posts/10213483361534364 for more.

Notes on grant publication logistics: Every publicly disclosed grant has a writeup published at the time of public disclosure, though the writeups vary significantly in length. Grant writeups are usually written by somebody other than the grant investigator, but approved by the grant investigator as well as the grantee. Grants have three dates associated with them: an internal grant decision date (not publicly revealed, but used in some statistics on total grant amounts decided by year), a grant date (which we call the donation date; this is the date of the formal grant commitment, and the date published on the grant page), and a grant announcement date (which we call the donation announcement date; the date the grant is announced to the mailing list and the grant page is made publicly visible). The lag is typically a few months between decision and grant, and a few more months between grant and announcement, largely due to time spent on grant writeup approval.
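
To make the three dates concrete, here is a minimal sketch of how a grant record carrying them might be represented. This is a hypothetical illustration, not the actual Donations List Website schema, and the day of month in the example is an assumption, since grant dates on this page are given at month precision.

    # Hypothetical grant record illustrating the three dates described above.
    # Not the actual Donations List Website schema.
    from dataclasses import dataclass
    from datetime import date
    from typing import Optional

    @dataclass
    class GrantRecord:
        donee: str
        amount_usd: float
        donation_date: date                   # published grant (commitment) date
        announcement_date: date               # date the grant page goes public
        decision_date: Optional[date] = None  # internal; not published per grant

        def grant_to_announcement_lag_days(self) -> int:
            # Typically a few months, per the publication logistics above.
            return (self.announcement_date - self.donation_date).days

    # Example with the CSET grant's published dates (grant month 2019-01,
    # announced 2019-02-28); the day of month is assumed.
    cset = GrantRecord("Center for Security and Emerging Technology",
                       55_000_000.00, date(2019, 1, 1), date(2019, 2, 28))
    print(cset.grant_to_announcement_lag_days())  # 58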

Notes on grant financing: See https://www.openphilanthropy.org/giving/guide-for-grant-seekers or https://www.openphilanthropy.org/about/who-we-are for more information. Grants generally come from the Open Philanthropy Project Fund, a donor-advised fund managed by the Silicon Valley Community Foundation, with most of its money coming from Good Ventures. Some grants are made directly by Good Ventures, and political grants may be made by the Open Philanthropy Action Fund. At least one grant, https://www.openphilanthropy.org/focus/us-policy/criminal-justice-reform/working-families-party-prosecutor-reforms-new-york, was made by Cari Tuna personally. Although the majority of grants are financed by the Open Philanthropy Project Fund, the source of financing of a grant is not always explicitly specified, so it cannot be confidently assumed that a grant with no explicit listed financing is financed through the Open Philanthropy Project Fund. Funding for multi-year grants is usually disbursed annually, and the amounts are often equal across years, but not always; the fact that a grant is multi-year, or the distribution of the grant amount across years, is not always explicitly stated on the grant page. Some grants to universities are labeled "gifts", but this is a donee classification, based on different levels of bureaucratic overhead and funder control between grants and gifts. For more on all of these points, see the comment https://www.openphilanthropy.org/blog/october-2017-open-thread?page=2#comment-462.

Miscellaneous notes: Most GiveWell-recommended grants made by Good Ventures and listed in the Open Philanthropy database are not listed on Donations List Website as being under Open Philanthropy. Specifically, GiveWell Incubation Grants are not included (these are listed at https://donations.vipulnaik.com/donor.php?donor=GiveWell+Incubation+Grants with donor GiveWell Incubation Grants), and grants made by Good Ventures to GiveWell top and standout charities are also not included (these are listed at https://donations.vipulnaik.com/donor.php?donor=Good+Ventures%2FGiveWell+top+and+standout+charities with donor Good Ventures/GiveWell top and standout charities). Grants to support GiveWell operations are likewise not included here; they can be found at https://donations.vipulnaik.com/donor.php?donor=Good+Ventures%2FGiveWell+support with donor Good Ventures/GiveWell support. The investment https://www.openphilanthropy.org/focus/us-policy/farm-animal-welfare/impossible-foods in Impossible Foods is not included because it does not fit our criteria for a donation, and also because no amount was disclosed. All other grants publicly disclosed by Open Philanthropy that are not GiveWell Incubation Grants or GiveWell top and standout charity grants should be included. Grants disclosed by grantees but not yet disclosed by Open Philanthropy are not included; some of them may be listed at https://issarice.com/open-philanthropy-project-non-grant-funding.

Donor donation statistics

Cause area | Count | Median | Mean | Minimum | 10th percentile | 20th percentile | 30th percentile | 40th percentile | 50th percentile | 60th percentile | 70th percentile | 80th percentile | 90th percentile | Maximum
Overall | 14 | 400,352 | 5,284,406 | 15,000 | 100,000 | 100,000 | 250,000 | 260,000 | 400,352 | 493,425 | 776,095 | 2,982,206 | 12,066,809 | 55,000,000
Global catastrophic risks | 12 | 260,000 | 1,498,474 | 15,000 | 100,000 | 100,000 | 100,000 | 250,000 | 260,000 | 437,800 | 493,425 | 776,095 | 2,982,206 | 12,066,809
Global catastrophic risks|Global health|Animal welfare | 1 | 1,000,000 | 1,000,000 | 1,000,000 | 1,000,000 | 1,000,000 | 1,000,000 | 1,000,000 | 1,000,000 | 1,000,000 | 1,000,000 | 1,000,000 | 1,000,000 | 1,000,000
Security | 1 | 55,000,000 | 55,000,000 | 55,000,000 | 55,000,000 | 55,000,000 | 55,000,000 | 55,000,000 | 55,000,000 | 55,000,000 | 55,000,000 | 55,000,000 | 55,000,000 | 55,000,000
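
As a sanity check, the "Overall" row above can be reproduced from the 14 individual donation amounts in the full donations list at the bottom of this page. The sketch below is not the site's actual code; the nearest-rank percentile method (take the value at rank ceil(p/100 * n)) is an assumption, though it matches every published figure in the row.

    import math

    # The 14 donation amounts from the full donations list, sorted ascending.
    amounts = sorted([
        15000, 100000, 100000, 100000, 250000, 260000, 400352,
        437800, 493425, 776095, 1000000, 2982206, 12066808.93, 55000000,
    ])

    def nearest_rank_percentile(sorted_values, p):
        # p-th percentile by the nearest-rank method:
        # the value at rank ceil(p/100 * n), 1-indexed.
        rank = math.ceil(p / 100 * len(sorted_values))
        return sorted_values[max(rank, 1) - 1]

    print("count:", len(amounts))                        # 14
    print("mean: %.0f" % (sum(amounts) / len(amounts)))  # 5284406
    for p in (0, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100):
        # p=0 is the minimum, p=100 the maximum.
        print(p, nearest_rank_percentile(amounts, p))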

Donation amounts by cause area and year

If you hover over a cell for a given cause area and year, you will get a tooltip with the number of donees and the number of donations.

Note: The cause area classification used here may not match that used by the donor in all cases.

Cause area | Number of donations | Number of donees | Total | 2019 | 2018 | 2017 | 2016
Security (filter this donor) | 1 | 1 | 55,000,000.00 | 55,000,000.00 | 0.00 | 0.00 | 0.00
Global catastrophic risks (filter this donor) | 12 | 8 | 17,981,686.93 | 552,800.00 | 12,717,160.93 | 4,118,301.00 | 593,425.00
Global catastrophic risks|Global health|Animal welfare (filter this donor) | 1 | 1 | 1,000,000.00 | 1,000,000.00 | 0.00 | 0.00 | 0.00
Total | 14 | 10 | 73,981,686.93 | 56,552,800.00 | 12,717,160.93 | 4,118,301.00 | 593,425.00
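
This cross-tab is a straightforward group-and-pivot over the donations list. As an illustration (not the site's actual code), here is a pandas sketch over a few rows mirroring the full donations list below:

    import pandas as pd

    # A few rows mirroring the full donations list at the bottom of this page.
    donations = pd.DataFrame([
        ("Security", 2019, 55_000_000.00),
        ("Global catastrophic risks", 2018, 12_066_808.93),
        ("Global catastrophic risks", 2017, 2_982_206.00),
        ("Global catastrophic risks|Global health|Animal welfare", 2019, 1_000_000.00),
    ], columns=["cause_area", "year", "amount_usd"])

    # Sum of amounts per cause area per year, with zeros where there were
    # no donations; a Total column sums across years.
    pivot = donations.pivot_table(index="cause_area", columns="year",
                                  values="amount_usd", aggfunc="sum",
                                  fill_value=0)
    pivot["Total"] = pivot.sum(axis=1)
    print(pivot)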

Graph of spending by cause area and year (incremental, not cumulative)

[Graph omitted.]

Graph of spending by cause area and year (cumulative)

[Graph omitted.]

Donation amounts by subcause area and year

If you hover over a cell for a given subcause area and year, you will get a tooltip with the number of donees and the number of donations.

For the meaning of “classified” and “unclassified”, see the page clarifying this.

Subcause area | Number of donations | Number of donees | Total | 2019 | 2018 | 2017 | 2016
Security/Biosecurity and pandemic preparedness/Global catastrophic risks/AI safety | 1 | 1 | 55,000,000.00 | 55,000,000.00 | 0.00 | 0.00 | 0.00
Global catastrophic risks | 7 | 5 | 13,529,960.93 | 552,800.00 | 12,717,160.93 | 260,000.00 | 0.00
Global catastrophic risks/nuclear war | 1 | 1 | 2,982,206.00 | 0.00 | 0.00 | 2,982,206.00 | 0.00
Global catastrophic risks|Global health|Animal welfare | 1 | 1 | 1,000,000.00 | 1,000,000.00 | 0.00 | 0.00 | 0.00
Global catastrophic risks/geoengineering | 1 | 1 | 776,095.00 | 0.00 | 0.00 | 776,095.00 | 0.00
Global catastrophic risks/geomagnetic currents and power systems | 1 | 1 | 493,425.00 | 0.00 | 0.00 | 0.00 | 493,425.00
Global catastrophic risks/AI safety | 1 | 1 | 100,000.00 | 0.00 | 0.00 | 100,000.00 | 0.00
Global catastrophic risks/general research | 1 | 1 | 100,000.00 | 0.00 | 0.00 | 0.00 | 100,000.00
Classified total | 14 | 10 | 73,981,686.93 | 56,552,800.00 | 12,717,160.93 | 4,118,301.00 | 593,425.00
Unclassified total | 0 | 0 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00
Total | 14 | 10 | 73,981,686.93 | 56,552,800.00 | 12,717,160.93 | 4,118,301.00 | 593,425.00

Graph of spending by subcause area and year (incremental, not cumulative)

[Graph omitted.]

Graph of spending by subcause area and year (cumulative)

[Graph omitted.]

Donation amounts by donee and year

Donee | Cause area | Metadata | Total | 2019 | 2018 | 2017 | 2016
Center for Security and Emerging Technology (filter this donor) | | | 55,000,000.00 | 55,000,000.00 | 0.00 | 0.00 | 0.00
Future of Humanity Institute (filter this donor) | Global catastrophic risks/AI safety/Biosecurity and pandemic preparedness | FB Tw WP Site TW | 12,066,808.93 | 0.00 | 12,066,808.93 | 0.00 | 0.00
Rutgers University (filter this donor) | | FB Tw WP Site | 2,982,206.00 | 0.00 | 0.00 | 2,982,206.00 | 0.00
MIT Media Lab (filter this donor) | | | 1,000,000.00 | 1,000,000.00 | 0.00 | 0.00 | 0.00
UCLA School of Law (filter this donor) | | Tw WP Site | 776,095.00 | 0.00 | 0.00 | 776,095.00 | 0.00
Center for a New American Security (filter this donor) | | | 660,352.00 | 0.00 | 400,352.00 | 260,000.00 | 0.00
Future of Life Institute (filter this donor) | AI safety/other global catastrophic risks | FB Tw WP Site | 550,000.00 | 100,000.00 | 250,000.00 | 100,000.00 | 100,000.00
University of Cape Town (filter this donor) | | FB Tw WP Site | 493,425.00 | 0.00 | 0.00 | 0.00 | 493,425.00
Ethan Alley (filter this donor) | | | 437,800.00 | 437,800.00 | 0.00 | 0.00 | 0.00
Tampere University (filter this donor) | | | 15,000.00 | 15,000.00 | 0.00 | 0.00 | 0.00
Total | -- | -- | 73,981,686.93 | 56,552,800.00 | 12,717,160.93 | 4,118,301.00 | 593,425.00

Graph of spending by donee and year (incremental, not cumulative)

[Graph omitted.]

Graph of spending by donee and year (cumulative)

[Graph omitted.]

Donation amounts by influencer and year

If you hover over a cell for a given influencer and year, you will get a tooltip with the number of donees and the number of donations.

For the meaning of “classified” and “unclassified”, see the page clarifying this.

Influencer | Number of donations | Number of donees | Total | 2019 | 2018 | 2017
Luke Muehlhauser | 1 | 1 | 55,000,000.00 | 55,000,000.00 | 0.00 | 0.00
Nick Beckstead | 3 | 2 | 12,416,808.93 | 0.00 | 12,316,808.93 | 100,000.00
Claire Zabel | 4 | 4 | 1,853,152.00 | 1,452,800.00 | 400,352.00 | 0.00
Helen Toner | 1 | 1 | 260,000.00 | 0.00 | 0.00 | 260,000.00
Daniel Dewey | 1 | 1 | 100,000.00 | 100,000.00 | 0.00 | 0.00
Classified total | 10 | 7 | 69,629,960.93 | 56,552,800.00 | 12,717,160.93 | 360,000.00
Unclassified total | 4 | 4 | 4,351,726.00 | 0.00 | 0.00 | 3,758,301.00
Total | 14 | 10 | 73,981,686.93 | 56,552,800.00 | 12,717,160.93 | 4,118,301.00

Graph of spending by influencer and year (incremental, not cumulative)

[Graph omitted.]

Graph of spending by influencer and year (cumulative)

[Graph omitted.]

Donation amounts by disclosures and year

Sorry, we couldn't find any disclosures information.

Donation amounts by country and year

Sorry, we couldn't find any country information.

Full list of documents in reverse chronological order (9 documents)

Each document entry below lists the title, publication date, author, publisher (where available), affected donors, affected donees, affected influencers (where available), document scope, cause area, and notes.
Message exchange with EAF (2019-11-12). Author: Simon Knutsson. Affected donors: Open Philanthropy. Affected donees: Effective Altruism Foundation. Document scope: Reasoning supplement. Cause area: Effective altruism|Global catastrophic risks. Notes: This is a supplement to https://www.simonknutsson.com/problems-in-effective-altruism-and-existential-risk-and-what-to-do-about-them/ documenting an email exchange between Knutsson and Stefan Torges of the Effective Altruism Foundation, in which Knutsson asks Torges for comment on some of the points in the article. Torges's reply is not quoted, as he did not give permission to quote it, but Knutsson summarizes it as saying that EAF cannot share further information and does not wish to engage with Knutsson on the issue.
Co-funding Partnership with Ben Delo (2019-11-11). Author: Holden Karnofsky. Publisher: Open Philanthropy. Affected donors: Open Philanthropy, Ben Delo. Document scope: Partnership. Cause area: AI safety|Biosecurity and pandemic preparedness|Global catastrophic risks|Effective altruism. Notes: Ben Delo, co-founder of the cryptocurrency trading platform BitMEX, recently signed the Giving Pledge. He is entering into a partnership with the Open Philanthropy Project, providing funds, initially in the $5 million per year range, to support Open Phil's longtermist grantmaking in areas including AI safety, biosecurity and pandemic preparedness, global catastrophic risks, and effective altruism. Later, the Machine Intelligence Research Institute (MIRI) would reveal at https://intelligence.org/2020/04/27/miris-largest-grant-to-date/ that, of a $7.7 million grant from Open Phil, $1.46 million is coming from Ben Delo.
E-mail exchange with the Open Philanthropy Project (2019-11-10). Author: Simon Knutsson. Affected donors: Open Philanthropy. Affected donees: Effective Altruism Foundation. Document scope: Reasoning supplement. Cause area: Effective altruism|Global catastrophic risks. Notes: This is a supplement to https://www.simonknutsson.com/problems-in-effective-altruism-and-existential-risk-and-what-to-do-about-them/ documenting an email exchange between Knutsson and Michael Levine of the Open Philanthropy Project, in which Knutsson asks Levine for comment on some of the points in the article. Levine's reply is not quoted, as he did not give permission to quote it, but Knutsson summarizes it as saying that "[Open Phil] do not have anything to add beyond the grant page https://www.openphilanthropy.org/giving/grants/effective-altruism-foundation-research-operations".
Problems in effective altruism and existential risk and what to do about them (2019-10-16). Author: Simon Knutsson. Affected donors: Open Philanthropy, Effective Altruism Foundation. Affected donees: Centre for Effective Altruism, Effective Altruism Foundation, Future of Humanity Institute. Document scope: Miscellaneous commentary. Cause area: Effective altruism|Global catastrophic risks. Notes: Simon Knutsson, a Ph.D. student who previously worked at GiveWell and has since worked on animal welfare and on s-risks, writes about what he sees as problematic dynamics in the effective altruism and x-risk communities. Specifically, he is critical of what he sees as behind-the-scenes coordination on messaging among many organizations in the space, notably the Open Philanthropy Project and the Effective Altruism Foundation, and the possible use of grant money to pressure EAF into pushing for guidelines for writers not to talk about s-risks in specific ways. He is also critical of what he sees as the one-sided nature of the syllabi and texts produced by the Centre for Effective Altruism (CEA). The author notes that people have had different reactions to his text, with some considering the behavior described unproblematic, while others agree with him that it is problematic and deserves the spotlight. The post is also shared to the Effective Altruism Forum at https://forum.effectivealtruism.org/posts/EescnoaBJsQWz4rii/problems-in-effective-altruism-and-what-to-do-about-them (GW, IR) where it gets a lot of criticism in the comments from people including Peter Hurford and Holly Elmore.
Thanks for putting up with my follow-up questions. Out of the areas you mention, I'd be very interested in ... (GW, IR) (2019-09-10). Author: Ryan Carey. Publisher: Effective Altruism Forum. Affected donors: Founders Pledge, Open Philanthropy. Affected donees: OpenAI, Machine Intelligence Research Institute. Document scope: Broad donor strategy. Cause area: AI safety|Global catastrophic risks|Scientific research|Politics. Notes: Ryan Carey replies to John Halstead's question on what Founders Pledge should research. He first gives the areas within Halstead's list that he is most excited about. He also discusses three areas not explicitly listed by Halstead: (a) promotion of effective altruism, (b) scholarships for people working on high-impact research, (c) more on AI safety, specifically funding low-to-mid-prestige figures with strong AI safety interest (what he calls "highly-aligned figures"), a segment that he claims the Open Philanthropy Project is neglecting, with the exception of MIRI and a couple of individuals.
Important But Neglected: Why an Effective Altruist Funder Is Giving Millions to AI Security (2019-03-20). Author: Tate Williams. Publisher: Inside Philanthropy. Affected donors: Open Philanthropy. Affected donees: Center for Security and Emerging Technology. Document scope: Third-party coverage of donor strategy. Cause area: AI safety|Biosecurity and pandemic preparedness|Global catastrophic risks|Security. Notes: The article focuses on grantmaking by the Open Philanthropy Project in the areas of global catastrophic risks and security, particularly AI safety and biosecurity and pandemic preparedness. It includes quotes from Luke Muehlhauser, Senior Research Analyst at the Open Philanthropy Project and the investigator for the $55 million grant https://www.openphilanthropy.org/giving/grants/georgetown-university-center-security-and-emerging-technology to the Center for Security and Emerging Technology (CSET); Muehlhauser was previously Executive Director at the Machine Intelligence Research Institute. It also includes a quote from Holden Karnofsky, who sees the early interest of effective altruists in AI safety as prescient. The CSET grant is discussed in the context of the Open Philanthropy Project's hits-based giving approach, as well as the interest in the policy space in better understanding of safety and governance issues related to technology and AI.
The world’s most intellectual foundation is hiring. Holden Karnofsky, founder of GiveWell, on how philanthropy can have maximum impact by taking big risks. (2018-02-27). Authors: Robert Wiblin, Kieran Harris, Holden Karnofsky. Publisher: 80,000 Hours. Affected donors: Open Philanthropy. Document scope: Broad donor strategy. Cause area: AI safety|Global catastrophic risks|Biosecurity and pandemic preparedness|Global health and development|Animal welfare|Scientific research. Notes: This interview, with full transcript, is an episode of the 80,000 Hours podcast. In the interview, Karnofsky provides an overview of the cause prioritization and grantmaking strategy of the Open Philanthropy Project, and also notes that the Open Philanthropy Project is hiring for a number of positions.
Open Philanthropy Project: Progress in 2014 and Plans for 2015 (2015-03-12). Author: Holden Karnofsky. Publisher: Open Philanthropy. Affected donors: Open Philanthropy. Document scope: Broad donor strategy. Cause area: Global catastrophic risks|Scientific research|Global health and development. Notes: The blog post compares progress made by the Open Philanthropy Project in 2014 against the plans laid out in https://www.openphilanthropy.org/blog/givewell-labs-progress-2013-and-plans-2014 and lays out further plans for 2015. The post says that progress in the areas of U.S. policy and global catastrophic risks was substantial and matched expectations, but progress in scientific research and global health and development was less than hoped for. The plan for 2015 is to focus on growing more in the domain of scientific research and to postpone work on global health and development (thus freeing up staff capacity). There is much more detail in the post.
Potential Global Catastrophic Risk Focus Areas (2014-06-26). Author: Alexander Berger. Publisher: Open Philanthropy. Affected donors: Open Philanthropy. Document scope: Broad donor strategy. Cause area: AI safety|Biosecurity and pandemic preparedness|Global catastrophic risks. Notes: In this blog post, originally published at https://blog.givewell.org/2014/06/26/potential-global-catastrophic-risk-focus-areas/, Alexander Berger goes over a list of seven types of global catastrophic risks (GCRs) that the Open Philanthropy Project has considered. He details three promising areas that the Open Philanthropy Project is exploring more and may make grants in: (1) biosecurity and pandemic preparedness, (2) geoengineering research and governance, (3) AI safety. For the AI safety section, there is a note from Executive Director Holden Karnofsky saying that he sees AI safety as a more promising area than Berger does.

Full list of donations in reverse chronological order (14 donations)

Graph of top 10 donees (for donations with known year of donation) by amount, showing the timeframe of donations

[Graph omitted.]
Each entry below lists the donee, amount (current USD), amount rank (out of 14), donation date, cause area, URL, influencer, and notes.
Donee: Future of Life Institute. Amount: $100,000.00 (rank 11 of 14). Donation date: 2019-10. Cause area: Global catastrophic risks. URL: https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/future-life-institute-general-support-2019. Influencer: Daniel Dewey.

Intended use of funds (category): Organizational general support

Other notes: Announced: 2019-11-18.
Donee: Ethan Alley. Amount: $437,800.00 (rank 7 of 14). Donation date: 2019-05. Cause area: Global catastrophic risks. URL: https://www.openphilanthropy.org/focus/global-catastrophic-risks/scholarship-support-2019. Influencer: Claire Zabel.

Intended use of funds (category): Living expenses during project

Intended use of funds: The grant page describes the grant as "over four years in scholarship funds support to Ethan Alley to pursue a PhD at the Massachusetts Institute of Technology. The funding is intended to be used for his tuition, fees, healthcare, and a living stipend during his degree program."

Donor reason for selecting the donee: The grant page says the grant "is part of an effort to support value-aligned and qualified early-career researchers interested in global catastrophic risks."

Donor reason for donating that amount (rather than a bigger or smaller amount): The grant page says: "The funding is intended to be used for his tuition, fees, healthcare, and a living stipend during his degree program [over four years]" so the amount is likely determined based on the sum of the costs of these over four years

Donor reason for donating at this time (rather than earlier or later): Likely determined by the start time of the grantee's PhD program
Intended funding timeframe in months: 48

Donor thoughts on making further donations to the donee: The grant page calls the grant "part of an effort to support value-aligned and qualified early-career researchers interested in global catastrophic risks" so it will likely be followed by other similar grants to other researchers

Other notes: Announced: 2019-07-18.
Donee: Tampere University (Earmark: Hiski Haukkala). Amount: $15,000.00 (rank 14 of 14). Donation date: 2019-04. Cause area: Global catastrophic risks. URL: https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/tampere-university-2019. Influencer: Claire Zabel.

Donation process: Discretionary grant

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant "to support Professor Hiski Haukkala’s efforts related to global catastrophic risks. Haukkala, a Finnish professor of international relations, plans to use the funding to bring speakers to Finland to discuss existential risks, to attend events related to existential risks, and to support networking and related projects."

Other notes: Announced: 2019-06-07.
Donee: MIT Media Lab. Amount: $1,000,000.00 (rank 4 of 14). Donation date: 2019-03. Cause area: Global catastrophic risks|Global health|Animal welfare. URL: https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/massachusetts-institute-technology-media-lab-kevin-esvelt. Influencer: Claire Zabel.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant over two years to the MIT Media Lab to support the research of Professor Kevin Esvelt. Professor Esvelt plans to use this funding to conduct research on global catastrophic risks, global health, and animal welfare.

Other notes: Intended funding timeframe in months: 24; announced: 2019-06-26.
Donee: Center for Security and Emerging Technology. Amount: $55,000,000.00 (rank 1 of 14). Donation date: 2019-01. Cause area: Security/Biosecurity and pandemic preparedness/Global catastrophic risks/AI safety. URL: https://www.openphilanthropy.org/giving/grants/georgetown-university-center-security-and-emerging-technology. Influencer: Luke Muehlhauser.

Intended use of funds (category): Organizational general support

Intended use of funds: Grant via Georgetown University for the Center for Security and Emerging Technology (CSET), a new think tank led by Jason Matheny, formerly of IARPA, dedicated to policy analysis at the intersection of national and international security and emerging technologies. CSET plans to provide nonpartisan technical analysis and advice related to emerging technologies and their security implications to the government, key media outlets, and other stakeholders.

Donor reason for selecting the donee: Open Phil thinks that one of the key factors in whether AI is broadly beneficial for society is whether policymakers are well-informed and well-advised about the nature of AI’s potential benefits, potential risks, and how these relate to potential policy actions. As AI grows more powerful, calls for government to play a more active role are likely to increase, and government funding and regulation could affect the benefits and risks of AI. Thus: "Overall, we feel that ensuring high-quality and well-informed advice to policymakers over the long run is one of the most promising ways to increase the benefits and reduce the risks from advanced AI, and that the team put together by CSET is uniquely well-positioned to provide such advice." Despite risks and uncertainty, the grant is described as worthwhile under Open Phil's hits-based giving framework

Donor reason for donating that amount (rather than a bigger or smaller amount): The large amount over an extended period (5 years) is explained at https://www.openphilanthropy.org/blog/questions-we-ask-ourselves-making-grant "In the case of the new Center for Security and Emerging Technology, we think it will take some time to develop expertise on key questions relevant to policymakers and want to give CSET the commitment necessary to recruit key people, so we provided a five-year grant."

Donor reason for donating at this time (rather than earlier or later): Likely determined by when the grantee planned to launch. More timing details are not discussed
Intended funding timeframe in months: 60

Other notes: Donee is entered as Center for Security and Emerging Technology rather than as Georgetown University for consistency with future grants directly to the organization once it is set up. Founding members of CSET include Dewey Murdick from the Chan Zuckerberg Initiative, William Hannas from the CIA, and Helen Toner from the Open Philanthropy Project. The grant is discussed in the broader context of giving by the Open Philanthropy Project into global catastrophic risks and AI safety in the Inside Philanthropy article https://www.insidephilanthropy.com/home/2019/3/22/why-this-effective-altruist-funder-is-giving-millions-to-ai-security. Announced: 2019-02-28.
Donee: Center for a New American Security (Earmark: Richard Danzig). Amount: $400,352.00 (rank 8 of 14). Donation date: 2018-09. Cause area: Global catastrophic risks. URL: https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/center-for-a-new-american-security-richard-danzig-outreach-on-technological-risk-2018. Influencer: Claire Zabel.

Grant to support outreach by Richard Danzig, former Secretary of the Navy, on technological risks. This is a renewal and expansion of the August 2017 grant https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/center-for-a-new-american-security-richard-danzig-outreach-on-technological-risk#footnote1_ix4f0ts which allowed Dr. Danzig to produce Technology Roulette https://www.cnas.org/publications/reports/technology-roulette, a report intended for the national security community detailing the management of risks from losing control of advanced technology. Dr. Danzig intends to use these new funds to continue sharing these ideas with U.S. government officials, as well as spreading them to national security leaders abroad. Announced: 2018-10-20.
Donee: Future of Humanity Institute. Amount: $12,066,808.93 (rank 2 of 14). Donation date: 2018-07. Cause area: Global catastrophic risks. URL: https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/future-humanity-institute-work-on-global-catastrophic-risks. Influencer: Nick Beckstead.

Donation process: This is a series of awards totaling £13,428,434 ($16,200,062.78 at market rate on September 2, 2019); as of September 18, 2019, $12,066,808.93 of the amount had been allocated

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to support work on risks from advanced artificial intelligence, biosecurity and pandemic preparedness, and macrostrategy. The grant page says: "The largest pieces of the omnibus award package will allow FHI to recruit and hire for an education and training program led by Owen Cotton­Barratt, and retain and attract talent in biosecurity research and FHI’s Governance of AI program."

Other notes: Intended funding timeframe in months: 36; announced: 2018-09-01.
Donee: Future of Life Institute. Amount: $250,000.00 (rank 10 of 14). Donation date: 2018-06. Cause area: Global catastrophic risks. URL: https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/future-life-institute-general-support-2018. Influencer: Nick Beckstead.

Intended use of funds (category): Organizational general support

Intended use of funds: Grant for general support. It is a renewal of the May 2017 grant https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/future-life-institute-general-support-2017 whose primary purpose was to administer a request for proposals in AI safety, similar to the 2015 request for proposals https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/update-fli-grant

Donor retrospective of the donation: The followup grant in 2019 suggests that Open Phil would continue to stand by its assessment of the grantee.

Other notes: Announced: 2018-07-05.
Donee: Center for a New American Security (Earmark: Richard Danzig). Amount: $260,000.00 (rank 9 of 14). Donation date: 2017-08. Cause area: Global catastrophic risks. URL: https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/center-for-a-new-american-security-richard-danzig-outreach-on-technological-risk. Influencer: Helen Toner.

Grant awarded to support outreach by Richard Danzig, former Secretary of the Navy, on technological risks. Specifically, this funding was to allow Dr. Danzig to revise and publish an already-drafted manuscript exploring and providing guidance on issues facing the US government related to potential risks from advanced technology (e.g., biosecurity, cybersecurity, and artificial intelligence risks). The funding was used by Dr. Danzig to produce Technology Roulette https://www.cnas.org/publications/reports/technology-roulette, a report intended for the national security community detailing the management of risks from losing control of advanced technology. Announced: 2017-10-16.
Donee: Future of Life Institute. Amount: $100,000.00 (rank 11 of 14). Donation date: 2017-05. Cause area: Global catastrophic risks/AI safety. URL: https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/future-life-institute-general-support-2017. Influencer: Nick Beckstead.

Intended use of funds (category): Organizational general support

Intended use of funds: Grant for general support. However, the primary use of the grant will be to administer a request for proposals in AI safety similar to a request for proposals in 2015 https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/update-fli-grant

Donor retrospective of the donation: The followup grants in 2018 and 2019, for similar or larger amounts, suggest that Open Phil would continue to stand by its assessment of the grantee.

Other notes: Announced: 2017-09-27.
Donee: UCLA School of Law (Earmark: Edward Parson). Amount: $776,095.00 (rank 5 of 14). Donation date: 2017-03. Cause area: Global catastrophic risks/geoengineering. URL: https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/ucla-climate-engineering-governance. Influencer: --.

Donation process: Grantee submitted a proposal https://www.openphilanthropy.org/files/Grants/UCLA_Climate_Engineering/Parson_CE_Governance_Proposal_Narrative_02-10-17.pdf on 2017-02-10 with sections: (1) Background, Need, and Opportunity; (2) Examples of questions to be addressed; (3) Activities.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant "to support research and meetings on climate engineering governance led by Professor Edward Parson. Professor Parson plans to use this grant to hire one or two fellows for three years to do academic research, publish papers, and hold meetings and workshops on climate engineering governance with relevant policymakers." The grant proposal https://www.openphilanthropy.org/files/Grants/UCLA_Climate_Engineering/Parson_CE_Governance_Proposal_Narrative_02-10-17.pdf also has an Activities section, though it's not clear how much the plans changed between the proposal and the final grant.

Donor reason for selecting the donee: The grant page says: "We hope that this grant will positively influence the future of climate engineering governance and policy."

Donor reason for donating that amount (rather than a bigger or smaller amount): The grant proposal https://www.openphilanthropy.org/files/Grants/UCLA_Climate_Engineering/Parson_CE_Governance_Proposal_Narrative_02-10-17.pdf includes an attached budget, though this is not publicly available. The budget likely influenced the amount granted.

Donor reason for donating at this time (rather than earlier or later): The timing was likely influenced by the timing of the submission of the grant proposal https://www.openphilanthropy.org/files/Grants/UCLA_Climate_Engineering/Parson_CE_Governance_Proposal_Narrative_02-10-17.pdf (2017-02-10, a month before the grant).
Intended funding timeframe in months: 36

Other notes: Announced: 2017-04-19.
Donee: Rutgers University (Earmark: Alan Robock). Amount: $2,982,206.00 (rank 3 of 14). Donation date: 2017-03. Cause area: Global catastrophic risks/nuclear war. URL: https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/rutgers-university-nuclear-conflict-climate-modeling. Influencer: --.

Grant over three years to support a series of modeling studies on the climatological and subsequent ecological and social effects of large nuclear conflict. Research to be conducted by Alan Robock of Rutgers University and Owen Brian Toon of the University of Colorado Boulder. Announced: 2017-04-19.
Donee: University of Cape Town (Earmark: Trevor Gaunt). Amount: $493,425.00 (rank 6 of 14). Donation date: 2016-09. Cause area: Global catastrophic risks/geomagnetic currents and power systems. URL: https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/university-cape-town-geomagnetics-research. Influencer: --.

Grant over three years to Professor Trevor Gaunt, Emeritus Professor of Electrical Engineering at the University of Cape Town (UCT) in South Africa, to research the potential risks that geomagnetic storms could pose to the electric power system. Announced: 2016-10-06.
Donee: Future of Life Institute. Amount: $100,000.00 (rank 11 of 14). Donation date: 2016-03. Cause area: Global catastrophic risks/general research. URL: https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/future-life-institute-general-support. Influencer: --.

Donation process: According to https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/future-life-institute-general-support#Our_process: "Following our collaboration last year, we kept in touch with FLI regarding its funding situation and plans for future activities."

Intended use of funds (category): Organizational general support

Intended use of funds: Main planned activities for 2016 include: news operation, nuclear weapons campaign, AI safety conference, and AI conference travel.

Donor reason for selecting the donee: https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/future-life-institute-general-support#The_case_for_the_grant says: "In organizing its 2015 [Puerto Rico] AI safety conference (which we attended), FLI demonstrated a combination of network, ability to execute, and values that impressed us. We felt that the conference was well-organized, attracted the attention of high-profile individuals who had not previously demonstrated an interest in AI safety, and seemed to lead many of those individuals to take the issue more seriously." There is more detail in the grant page, as well as a list of reservations about the grant.

Donor reason for donating at this time (rather than earlier or later): Open Phil needed enough time to evaluate the results of its first Future of Life Institute grant that was focused on AI safety, and to see the effects of the Puerto Rico 2015 AI safety conference. Timing also likely determined by FLI explicitly seeking more money to meet its budget.

Donor thoughts on making further donations to the donee: According to https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/future-life-institute-general-support#Key_questions_for_follow-up "We expect to have a conversation with FLI staff every 3-6 months for the next 12 months. After that, we plan to consider renewal." A list of questions is included.

Donor retrospective of the donation: The followup grants in 2017, 2018, and 2019, for similar or larger amounts, suggest that Open Phil would continue to stand by its assessment of the grantee.

Other notes: Announced: 2016-03-18.

Similarity to other donors

Sorry, we couldn't find any similar donors.