Claire Zabel|Committee for Effective Altruism Support money moved

This is an online portal with information on donations that were announced publicly (or have been shared with permission) that were of interest to Vipul Naik. The git repository with the code for this portal, as well as all the underlying data, is available on GitHub. All payment amounts are in current United States dollars (USD). The repository of donations is being seeded with an initial collation by Issa Rice as well as continued contributions from him (see his commits and the contract work page listing all financially compensated contributions to the site) but all responsibility for errors and inaccuracies belongs to Vipul Naik. Current data is preliminary and has not been completely vetted and normalized; if sharing a link to this site or any page on this site, please include the caveat that the data is preliminary (if you want to share without including caveats, please check with Vipul Naik). We expect to have completed the first round of development by the end of March 2022. See the about page for more details. Also of interest: pageview data on analytics.vipulnaik.com, tutorial in README, request for feedback to EA Forum.

Table of contents

Full list of documents in reverse chronological order (3 documents)

Title (URL linked) | Publication date | Author | Publisher | Affected donors | Affected donees | Document scope | Notes
Suggestions for Individual Donors from Open Philanthropy Staff - 2019 | 2019-12-18 | Holden Karnofsky | Open Philanthropy Project | Chloe Cockburn Jesse Rothman Michelle Crentsil Amanda Hungerford Lewis Bollard Persis Eskander Alexander Berger Chris Somerville Heather Youngs Claire Zabel | National Council for Incarcerated and Formerly Incarcerated Women and Girls Life Comes From It Worth Rises Wild Animal Initiative Sinergia Animal Center for Global Development International Refugee Assistance Project California YIMBY Engineers Without Borders 80,000 Hours Centre for Effective Altruism Future of Humanity Institute Global Priorities Institute Machine Intelligence Research Institute Ought | Donation suggestion list | Continuing an annual tradition started in 2015, Open Philanthropy Project staff share suggestions for places that people interested in specific cause areas may consider donating. The sections are roughly based on the focus areas used by Open Phil internally, with the contributors to each section being the Open Phil staff who work in that focus area. Each recommendation includes a "Why we recommend it" or "Why we suggest it" section, and with the exception of the criminal justice reform recommendations, each recommendation includes a "Why we haven't fully funded it" section. Section 5, Assorted recommendations by Claire Zabel, includes a list of "Organizations supported by our Committee for Effective Altruism Support," i.e., the organizations within the purview of the Committee for Effective Altruism Support. The section is approved by the committee and represents its views.
Staff Members’ Personal Donations for Giving Season 2017 | 2017-12-18 | Holden Karnofsky | Open Philanthropy Project | Holden Karnofsky Alexander Berger Nick Beckstead Helen Toner Claire Zabel Lewis Bollard Ajeya Cotra Morgan Davis Michael Levine | GiveWell top charities GiveWell GiveDirectly EA Giving Group Berkeley Existential Risk Initiative Effective Altruism Funds: Meta Fund Effective Altruism Funds: Long-Term Future Fund Effective Altruism Funds: Animal Welfare Fund Effective Altruism Funds: Global Health and Development Fund Sentience Institute Encompass The Humane League The Good Food Institute Mercy For Animals Compassion in World Farming USA Animal Equality Donor lottery Against Malaria Foundation GiveDirectly | Periodic donation list documentation | Open Philanthropy Project staff members describe where they are donating this year, and the considerations that went into the donation decision. By policy, amounts are not disclosed. This is the first standalone blog post of this sort by the Open Philanthropy Project; in previous years, the corresponding donations were documented in the GiveWell staff members' donation post.
Staff members’ personal donations for giving season 2015 | 2015-12-09 | Elie Hassenfeld | GiveWell | Elie Hassenfeld Holden Karnofsky Natalie Crispin Alexander Berger Timothy Telleen-Lawton Sean Conley Josh Rosenberg Jake Marcus Rebecca Raible Milan Griffes Helen Toner Sophie Monahan Laura Muñoz Catherine Hollander Andrew Martin Claire Zabel Nicole Ross Lewis Bollard | GiveWell top charities Against Malaria Foundation GiveWell GiveDirectly Wikimedia Foundation Center for Global Development Martha’s Table Country Dance and Song Society Northwest Health Law Advocates Mercy For Animals The Humane League Animal Charity Evaluators Raising for Effective Giving Humane Society of the United States | Periodic donation list documentation | GiveWell and Open Philanthropy Project staff describe their annual donation plans for 2015. Some of these are tentative and get superseded by further events. Also, not all employees are present in the document (participation is optional). Amounts donated are not included, per a decision by GiveWell.

Full list of donations in reverse chronological order (26 donations)

Donor | Donee | Amount (current USD) | Donation date | Cause area | URL | Notes
Open Philanthropy Project | 80,000 Hours | 3,457,284.00 | 2020-02 | Effective altruism/movement growth/career counseling | https://www.openphilanthropy.org/giving/grants/80000-hours-general-support-2020 | Donation process: The decision of whether to donate seems to have followed the Open Philanthropy Project's usual process, but the exact amount to donate was determined by the Committee for Effective Altruism Support using the process described at https://www.openphilanthropy.org/committee-effective-altruism-support

Intended use of funds (category): Organizational general support

Intended use of funds: 80,000 Hours aims to solve skill bottlenecks for career paths in what it considers to be the world’s most pressing problems. It does this by providing online research, in-person advice, and support with the goal of helping talented graduates age 20-40 enter high-impact careers.

Donor reason for selecting the donee: Open Phil's grant writeup says: "we see the basic pros and cons of this support similarly to what we’ve presented in past writeups on the matter" with the most recent similar grant being https://www.openphilanthropy.org/giving/grants/80000-hours-general-support-2019 (February 2019) and the most recent grant with a detailed writeup being https://www.openphilanthropy.org/giving/grants/80000-hours-general-support-2018 (February 2018)

Donor reason for donating that amount (rather than a bigger or smaller amount): The amount is decided by the Committee for Effective Altruism Support https://www.openphilanthropy.org/committee-effective-altruism-support but individual votes and reasoning are not public.

Donor reason for donating at this time (rather than earlier or later): Reasons for timing are not discussed, but this is likely the time when the Committee for Effective Altruism Support does its 2020 allocation. Three other grants decided by CEAS at around the same time are: Machine Intelligence Research Institute ($7,703,750), Centre for Effective Altruism ($4,146,795), and Ought ($1,593,333). Announced: 2020-03-09.
Open Philanthropy Project | Center for Applied Rationality | 375,000.00 | 2020-02 | Rationality improvement | https://www.openphilanthropy.org/giving/grants/center-applied-rationality-general-support-2020 | Intended use of funds (category): Organizational general support

Intended use of funds: The grant page says: "CFAR is an adult education nonprofit that seeks to find and develop cognitive tools and to deliver these tools to promising individuals, groups, organizations, and networks focused on solving large and pressing problems. [...] They introduce people to and/or strengthen their connections with the effective altruism (EA) community and way of thinking, which we hope results in people with outstanding potential pursuing more impactful career trajectories. CFAR is particularly interested in growing the talent pipeline for work on potential risks from advanced artificial intelligence (AI)."

Donor reason for selecting the donee: The grant page says: "Our primary interest in these workshops is that we believe they introduce people to and/or strengthen their connections with the effective altruism (EA) community and way of thinking, which we hope results in people with outstanding potential pursuing more impactful career trajectories." Also: "CFAR is particularly interested in growing the talent pipeline for work on potential risks from advanced artificial intelligence (AI). More on our interest in supporting work [...]"

Donor reason for donating that amount (rather than a bigger or smaller amount): Amount chosen to provide one year of operating support

Donor reason for donating at this time (rather than earlier or later): Timing determined by the end of the funding timeframe of the previous two-year grant https://www.openphilanthropy.org/giving/grants/center-applied-rationality-general-support-2018 made in January 2018
Intended funding timeframe in months: 12

Donor thoughts on making further donations to the donee: This is an exit grant, so Open Phil does not plan to make further grants to CFAR. Announced: 2020-04-20.
Open Philanthropy Project | Machine Intelligence Research Institute | 7,703,750.00 | 2020-02 | AI safety | https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support-2020 | Donation process: The decision of whether to donate seems to have followed the Open Philanthropy Project's usual process, but the exact amount to donate was determined by the Committee for Effective Altruism Support using the process described at https://www.openphilanthropy.org/committee-effective-altruism-support

Intended use of funds (category): Organizational general support

Intended use of funds: MIRI plans to use these funds for ongoing research and activities related to AI safety

Donor reason for selecting the donee: The grant page says "we see the basic pros and cons of this support similarly to what we’ve presented in past writeups on the matter" with the most similar previous grant being https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support-2019 (February 2019). Past writeups include the grant pages for the October 2017 three-year support https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support-2017 and the August 2016 one-year support https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support

Donor reason for donating that amount (rather than a bigger or smaller amount): The amount is decided by the Committee for Effective Altruism Support https://www.openphilanthropy.org/committee-effective-altruism-support but individual votes and reasoning are not public. Three other grants decided by CEAS at around the same time are: Centre for Effective Altruism ($4,146,795), 80,000 Hours ($3,457,284), and Ought ($1,593,333).

Donor reason for donating at this time (rather than earlier or later): Reasons for timing are not discussed, but this is likely the time when the Committee for Effective Altruism Support does its 2020 allocation.
Intended funding timeframe in months: 24

Other notes: The donee describes the grant in the blog post https://intelligence.org/2020/04/27/miris-largest-grant-to-date/ (2020-04-27) along with other funding it has received ($300,000 from the Berkeley Existential Risk Initiative and $100,000 from the Long-Term Future Fund). The fact that the grant is a two-year grant is mentioned here, but not in the grant page on Open Phil's website. The page also mentions that of the total grant amount of $7.7 million, $6.24 million is coming from Open Phil's normal funders (Good Ventures) and the remaining $1.46 million is coming from Ben Delo, co-founder of the cryptocurrency trading platform BitMEX, as part of a funding partnership https://www.openphilanthropy.org/blog/co-funding-partnership-ben-delo announced November 11, 2019. Announced: 2020-04-10.
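The funding split described in MIRI's blog post can be sanity-checked with a few lines of Python (a minimal sketch; the variable names are mine, and the figures are the rounded amounts quoted in the entry above):

```python
# Sanity check on the MIRI 2020 grant funding split described above.
total_grant = 7_703_750      # total Open Phil grant to MIRI, in USD
good_ventures = 6_240_000    # rounded "$6.24 million" from Open Phil's normal funder
ben_delo = 1_460_000         # rounded "$1.46 million" from the Ben Delo co-funding partnership

# The two sources should sum to roughly the total; the blog post rounds
# both components, so allow a small tolerance.
assert abs((good_ventures + ben_delo) - total_grant) < 10_000
```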
Open Philanthropy Project | Centre for Effective Altruism | 4,146,795.00 | 2020-01 | Effective altruism/movement growth | https://www.openphilanthropy.org/giving/grants/centre-effective-altruism-general-support-and-community-building-grants-2020 | Donation process: The exact amount to donate was determined by the Committee for Effective Altruism Support using the process described at https://www.openphilanthropy.org/committee-effective-altruism-support

Intended use of funds (category): Organizational general support

Intended use of funds: Grant for a mix of organizational general support and supporting the Effective Altruism Community Building Grants program operated by CEA

Donor reason for selecting the donee: Open Phil's grant writeup says: "we see the basic pros and cons of this support similarly to what we’ve presented in past writeups on the matter" and links to the September 2019 support https://www.openphilanthropy.org/giving/grants/centre-effective-altruism-general-support-and-community-building-grants-2019 that had the same intended use of funds (general support + Community Building Grants)

Donor reason for donating that amount (rather than a bigger or smaller amount): The amount is decided by the Committee for Effective Altruism Support (CEAS) https://www.openphilanthropy.org/committee-effective-altruism-support but individual votes and reasoning are not public. Three other grants decided by CEAS at around the same time are: Machine Intelligence Research Institute ($7,703,750), 80,000 Hours ($3,457,284), and Ought ($1,593,333)

Donor reason for donating at this time (rather than earlier or later): Reasons for timing are not discussed, but this is likely the time when the Committee for Effective Altruism Support does its 2020 allocation. Announced: 2020-03-09.
Open Philanthropy Project | Ought | 1,593,333.00 | 2020-01 | AI safety | https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/ought-general-support-2020 | Donation process: The grant was recommended by the Committee for Effective Altruism Support following its process https://www.openphilanthropy.org/committee-effective-altruism-support

Intended use of funds (category): Organizational general support

Intended use of funds: The grant page says: "Ought conducts research on factored cognition, which we consider relevant to AI alignment and to reducing potential risks from advanced artificial intelligence."

Donor reason for selecting the donee: The grant page says "we see the basic pros and cons of this support similarly to what we’ve presented in past writeups on the matter"

Donor reason for donating that amount (rather than a bigger or smaller amount): The amount is decided by the Committee for Effective Altruism Support https://www.openphilanthropy.org/committee-effective-altruism-support but individual votes and reasoning are not public. Three other grants decided by CEAS at around the same time are: Machine Intelligence Research Institute ($7,703,750), Centre for Effective Altruism ($4,146,795), and 80,000 Hours ($3,457,284)

Donor reason for donating at this time (rather than earlier or later): Reasons for timing are not discussed, but this is likely the time when the Committee for Effective Altruism Support does its 2020 allocation. Announced: 2020-02-14.
Open Philanthropy Project | Centre for Effective Altruism | 1,755,921.00 | 2019-09 | Effective altruism/movement growth | https://www.openphilanthropy.org/giving/grants/centre-effective-altruism-general-support-and-community-building-grants-2019 | Donation process: The exact amount to donate was determined by the Committee for Effective Altruism Support using the process described at https://www.openphilanthropy.org/committee-effective-altruism-support

Intended use of funds (category): Organizational general support

Intended use of funds: Grant for a mix of organizational general support and supporting the Effective Altruism Community Building Grants program operated by CEA

Donor reason for selecting the donee: Open Phil's grant writeup says: "we see the basic pros and cons of this support similarly to what we’ve presented in past writeups on the matter" and links to the February 2019 support https://www.openphilanthropy.org/giving/grants/centre-effective-altruism-general-support-2019

Donor reason for donating that amount (rather than a bigger or smaller amount): The amount is decided by the Committee for Effective Altruism Support (CEAS) https://www.openphilanthropy.org/committee-effective-altruism-support but individual votes and reasoning are not public. No other grants seem to have been decided by CEAS close in time to this grant. Announced: 2019-11-08.
Open Philanthropy Project | Effective Altruism Foundation | 1,000,000.00 | 2019-07 | Effective altruism | https://www.openphilanthropy.org/giving/grants/effective-altruism-foundation-research-operations | Intended use of funds (category): Direct project expenses

Intended use of funds: Grant "to support research and operations"

Donor reason for selecting the donee: The grant page says "A major purpose of this grant is to encourage and support EAF and our other grantees in the space in taking approaches to longtermism with greater emphasis on shared objectives between different value systems. We conceive of this grant as falling under our work aimed at growing and supporting the EA community." Earlier in the document, past reservations that Open Phil has had about EAF are described: "EAF is an organization whose values put a particular emphasis on trying to reduce the risks of future suffering. While preventing suffering is a value we share, we also believe that the speculative and suffering-focused nature of this work means that it needs to be communicated about carefully, and could be counterproductive otherwise. As a result, we have felt ambivalent about EAF’s work to date (despite feeling unambiguously positively about some of their projects)."

Other notes: The grant would be discussed further by Simon Knutsson in his critical post https://www.simonknutsson.com/problems-in-effective-altruism-and-existential-risk-and-what-to-do-about-them/ that also includes discussion of guidelines that Nick Beckstead of the Open Philanthropy Project developed, and that EAF was now adopting and encouraging others to adopt. Knutsson sees the adoption of the guidelines as being linked to the grant money, due to both the timing matching and the language of the grant page. On separate pages, Knutsson publishes correspondence between him and people at Open Phil and EAF where he tried to get more specific information from the two organizations: https://www.simonknutsson.com/e-mail-exchange-with-the-open-philanthropy-project and https://www.simonknutsson.com/message-exchange-with-eaf/. Intended funding timeframe in months: 1; announced: 2019-07-30.
Open Philanthropy Project | Altruistic Technology Labs | 440,525.00 | 2019-05 | Biosecurity and pandemic preparedness | https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/altruistic-technology-labs-biological-risk-prevention | Intended use of funds (category): Organizational general support

Intended use of funds: The grantee, "AltLabs", a new organization, intends to use these funds to hire initial staff and pursue various research projects related to catastrophic risk reduction, including machine-learning-based attribution of engineered DNA and broad-spectrum infectious disease diagnostics. Announced: 2019-07-18.
Open Philanthropy Project | Ethan Alley | 437,800.00 | 2019-05 | Global catastrophic risks | https://www.openphilanthropy.org/focus/global-catastrophic-risks/scholarship-support-2019 | Intended use of funds (category): Living expenses during research project

Intended use of funds: The grant page says the grant is "over four years in scholarship funds support to Ethan Alley to pursue a PhD at the Massachusetts Institute of Technology. The funding is intended to be used for his tuition, fees, healthcare, and a living stipend during his degree program."

Donor reason for selecting the donee: The grant page says the grant "is part of an effort to support value-aligned and qualified early-career researchers interested in global catastrophic risks."

Donor reason for donating that amount (rather than a bigger or smaller amount): The grant page says: "The funding is intended to be used for his tuition, fees, healthcare, and a living stipend during his degree program [over four years]" so the amount is likely determined based on the sum of the costs of these over four years

Donor reason for donating at this time (rather than earlier or later): Likely determined by the start time of the grantee's PhD program
Intended funding timeframe in months: 48

Donor thoughts on making further donations to the donee: The grant page calls the grant "part of an effort to support value-aligned and qualified early-career researchers interested in global catastrophic risks" so it will likely be followed by other similar grants to other researchers. Announced: 2019-07-18.
Open Philanthropy Project | Tampere University | 15,000.00 | 2019-04 | Global catastrophic risks | https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/tampere-university-2019 | Donation process: Discretionary grant

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant "to support Professor Hiski Haukkala’s efforts related to global catastrophic risks. Haukkala, a Finnish professor of international relations, plans to use the funding to bring speakers to Finland to discuss existential risks, to attend events related to existential risks, and to support networking and related projects." Announced: 2019-06-07.
Open Philanthropy Project | MIT Media Lab | 1,000,000.00 | 2019-03 | Global catastrophic risks|Global health|Animal welfare | https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/massachusetts-institute-technology-media-lab-kevin-esvelt | Intended use of funds (category): Direct project expenses

Intended use of funds: Grant over two years to the MIT Media Lab to support the research of Professor Kevin Esvelt. Professor Esvelt plans to use this funding to conduct research on global catastrophic risks, global health, and animal welfare. Intended funding timeframe in months: 24; announced: 2019-06-26.
Open Philanthropy Project | 80,000 Hours | 4,795,803.00 | 2019-02 | Effective altruism/movement growth/career counseling | https://www.openphilanthropy.org/giving/grants/80000-hours-general-support-2019 | Donation process: The decision of whether to donate seems to have followed the Open Philanthropy Project's usual process, but the exact amount to donate was determined by the Committee for Effective Altruism Support using the process described at https://www.openphilanthropy.org/committee-effective-altruism-support

Intended use of funds (category): Organizational general support

Intended use of funds: 80,000 Hours aims to solve skill bottlenecks for career paths in what it considers to be the world’s most pressing problems. It does this by providing online research, in-person advice, and support with the goal of helping talented graduates age 20-40 enter high-impact careers.

Donor reason for selecting the donee: Open Phil's grant writeup says: "we see the basic pros and cons of this support similarly to what we’ve presented in past writeups on the matter" and links to the February 2018 support https://www.openphilanthropy.org/giving/grants/80000-hours-general-support-2018

Donor reason for donating that amount (rather than a bigger or smaller amount): The amount is decided by the Committee for Effective Altruism Support https://www.openphilanthropy.org/committee-effective-altruism-support but individual votes and reasoning are not public. Two other grants with amounts decided by the Committee for Effective Altruism Support, made at the same time and therefore likely drawing from the same money pot, are to the Machine Intelligence Research Institute ($2,112,500) and Centre for Effective Altruism ($2,756,250)

Donor reason for donating at this time (rather than earlier or later): Reasons for timing are not discussed, but likely include (1) It is about a year since the last grant to 80,000 Hours, and the grants are generally expected to last a year, so a renewal is due, (2) The Committee for Effective Altruism Support https://www.openphilanthropy.org/committee-effective-altruism-support did its first round of money allocation, so the timing is determined by the timing of that allocation round
Intended funding timeframe in months: 24

Donor retrospective of the donation: The February 2020 grant https://www.openphilanthropy.org/giving/grants/80000-hours-general-support-2020 with very similar reasoning suggests that the Open Philanthropy Project and Committee for Effective Altruism Support would continue to stand by the reasoning behind the grant. Announced: 2019-03-28.
Open Philanthropy Project | Centre for Effective Altruism | 2,756,250.00 | 2019-02 | Effective altruism/movement growth | https://www.openphilanthropy.org/giving/grants/centre-effective-altruism-general-support-2019 | Donation process: The exact amount to donate was determined by the Committee for Effective Altruism Support using the process described at https://www.openphilanthropy.org/committee-effective-altruism-support

Intended use of funds (category): Organizational general support

Intended use of funds: The grant writeup says: "CEA is a central organization within the effective altruism (EA) community that engages in a variety of activities aimed at helping the EA community."

Donor reason for selecting the donee: Open Phil's grant writeup says: "we see the basic pros and cons of this support similarly to what we’ve presented in past writeups on the matter"

Donor reason for donating that amount (rather than a bigger or smaller amount): The amount is decided by the Committee for Effective Altruism Support https://www.openphilanthropy.org/committee-effective-altruism-support but individual votes and reasoning are not public. Two other grants with amounts decided by the Committee for Effective Altruism Support, made at the same time and therefore likely drawing from the same money pot, are to the Machine Intelligence Research Institute ($2,112,500) and 80,000 Hours ($4,795,803)

Donor reason for donating at this time (rather than earlier or later): Reasons for timing are not discussed, but likely include (1) It is about a year since the last grant to the Centre for Effective Altruism, and the grants are generally expected to last a year, so a renewal is due, (2) The Committee for Effective Altruism Support https://www.openphilanthropy.org/committee-effective-altruism-support did its first round of money allocation, so the timing is determined by the timing of that allocation round
Intended funding timeframe in months: 24

Donor retrospective of the donation: The followup September 2019 grant https://www.openphilanthropy.org/giving/grants/centre-effective-altruism-general-support-and-community-building-grants-2019 and January 2020 grant https://www.openphilanthropy.org/giving/grants/centre-effective-altruism-general-support-and-community-building-grants-2020 suggest that Open Phil would continue to stand behind the reasoning for this grant, and in fact, that it would consider the original grant amount inadequate for the grantee. Announced: 2019-04-18.
Open Philanthropy Project | Machine Intelligence Research Institute | 2,652,500.00 | 2019-02 | AI safety | https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support-2019 | Donation process: The decision of whether to donate seems to have followed the Open Philanthropy Project's usual process, but the exact amount to donate was determined by the Committee for Effective Altruism Support using the process described at https://www.openphilanthropy.org/committee-effective-altruism-support

Intended use of funds (category): Organizational general support

Intended use of funds: MIRI plans to use these funds for ongoing research and activities related to AI safety. Planned activities include alignment research, a summer fellows program, computer scientist workshops, and internship programs.

Donor reason for selecting the donee: The grant page says: "we see the basic pros and cons of this support similarly to what we’ve presented in past writeups on the matter" Past writeups include the grant pages for the October 2017 three-year support https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support-2017 and the August 2016 one-year support https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support

Donor reason for donating that amount (rather than a bigger or smaller amount): Amount decided by the Committee for Effective Altruism Support (CEAS) https://www.openphilanthropy.org/committee-effective-altruism-support but individual votes and reasoning are not public. Two other grants with amounts decided by CEAS, made at the same time and therefore likely drawing from the same money pot, are to the Centre for Effective Altruism ($2,756,250) and 80,000 Hours ($4,795,803). The original amount of $2,112,500 is split across two years, and therefore ~$1.06 million per year. https://intelligence.org/2019/04/01/new-grants-open-phil-beri/ clarifies that the amount for 2019 is on top of the third year of the three-year $1.25 million/year support announced in October 2017, and the total $2.31 million represents Open Phil's full intended funding for MIRI for 2019, but the amount for 2020 of ~$1.06 million is a lower bound, and Open Phil may grant more for 2020 later. In November 2019, additional funding would bring the total award amount to $2,652,500

Donor reason for donating at this time (rather than earlier or later): Reasons for timing are not discussed, but likely reasons include: (1) The original three-year funding period https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support-2017 is coming to an end, (2) Even though there is time before the funding period ends, MIRI has grown in budget and achievements, so a suitable funding amount could be larger, (3) The Committee for Effective Altruism Support https://www.openphilanthropy.org/committee-effective-altruism-support did its first round of money allocation, so the timing is determined by the timing of that allocation round
Intended funding timeframe in months: 24

Donor thoughts on making further donations to the donee: According to https://intelligence.org/2019/04/01/new-grants-open-phil-beri/ Open Phil may increase its level of support for 2020 beyond the ~$1.06 million that is part of this grant

Donor retrospective of the donation: The much larger followup grant https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support-2020 with a very similar writeup suggests that Open Phil and the Committee for Effective Altruism Support would continue to stand by the reasoning for the grant

Other notes: The grantee, MIRI, discusses the grant on its website at https://intelligence.org/2019/04/01/new-grants-open-phil-beri/ along with a $600,000 grant from the Berkeley Existential Risk Initiative. Announced: 2019-04-01.
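The year-by-year arithmetic in the MIRI 2019 entry above can be reproduced directly (a minimal sketch; the variable names are mine, and the dollar figures are those quoted in the entry):

```python
# Reproducing the MIRI 2019 funding arithmetic described above.
original_award = 2_112_500         # initial CEAS-decided amount, split across two years
per_year = original_award / 2      # the "~$1.06 million per year" figure
prior_support_2019 = 1_250_000     # third year of the October 2017 $1.25 million/year grant

total_2019 = prior_support_2019 + per_year
assert per_year == 1_056_250
assert round(total_2019 / 1e6, 2) == 2.31   # matches the "$2.31 million" total for 2019
```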
Open Philanthropy Project | Center for International Security and Cooperation | 1,625,000.00 | 2019-01 | Biosecurity and pandemic preparedness | https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/center-international-security-and-cooperation-biosecurity-research-2019 Grant over three years to Stanford University’s Center for International Security and Cooperation (CISAC) to support Megan Palmer’s work on biosecurity. This research is focused on developing ways to improve governance of biological science and to reduce the risk of misuse of advanced biotechnology. This funding is intended to allow Dr. Palmer to continue and extend a study on the attitudes of participants in the International Genetically Engineered Machine (iGEM) competition, to better understand how institutional environments, safety practices, or competition incentives might motivate young scientists and engineers. The grant is a renewal of the October 2016 grant https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/center-international-security-and-cooperation-biosecurity-research. Announced: 2019-02-12.
Open Philanthropy Project | University of Sydney | 32,621.00 | 2018-12 | Biosecurity and pandemic preparedness | https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/university-of-sydney-global-health-security-conference Grant of $45,000 AUD ($32,620.50 at the time of conversion) to the University of Sydney to support the 2019 Global Health Security Conference in Sydney, Australia. The funds are intended for general support of the conference, and to support travel bursaries to allow participants from low-income countries to attend a gathering of the global health security community, including academics, policymakers, and practitioners. Announced: 2019-01-17.
Open Philanthropy Project | Biosecure Ltd | 25,000.00 | 2018-12 | Biosecurity and pandemic preparedness | https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/biosecure-campaign-against-bioweapons-research Donation process: Discretionary grant structured through a contractor agreement. The grant page says: "While we do not typically publish pages for contractor agreements, we chose to write about this funding because we view it as conceptually similar to an ordinary grant, despite its structure as a contract due to the recipient’s organizational form."

Intended use of funds (category): Direct project expenses

Intended use of funds: Grantee "intends to use these funds to explore different models for strengthening the societal norm against biological weapons and reducing the likelihood of an arms race involving biological weapons, as well as investigating the feasibility, costs, and potential benefits of the various models." Announced: 2019-06-07.
Open Philanthropy Project | Nuclear Threat Initiative | 1,904,942.00 | 2018-11 | Biosecurity and pandemic preparedness | https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/nuclear-threat-initiative-projects-to-reduce-global-catastrophic-biological-risks Intended use of funds (category): Direct project expenses

Intended use of funds: Grant "to support projects to reduce Global Catastrophic Biological Risks (GCBRs). NTI intends to use these funds to support projects including, among others, strengthening the Biological Weapons Convention and reducing state biological threats and additional GCBRs through international dialogues." Intended funding timeframe in months: 1. Announced: 2018-12-13.
Open Philanthropy Project | International Genetically Engineered Machine Foundation | 420,000.00 | 2018-11 | Biosecurity and pandemic preparedness | https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/igem-synthetic-biology-safety-and-security-2018 Grant over two years to the International Genetically Engineered Machine (iGEM) Foundation for its work on safety and security, led by Piers Millett. iGEM is an international synthetic biology competition for students. Donor believes that supporting iGEM’s safety and security work could help raise awareness about biosecurity among current and future synthetic biologists. Renewal of the May 2016 grant https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/igem-synthetic-biology-safety-and-security. Announced: 2019-01-31.
Open Philanthropy Project | University of Oxford | 26,086.00 | 2018-10 | Biosecurity and pandemic preparedness | https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/oxford-university-dphil-support-for-andrew-snyder-beattie Gift of £20,000 ($26,086 at the time of conversion) to the University of Oxford to support the research of the Mathematical Ecology Research Group and the research costs of Andrew Snyder-Beattie, who recently served as Director of Research at the Future of Humanity Institute and as a member of FHI’s Biotechnology Research Team. Announced: 2018-10-30.
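The two converted amounts above (the University of Sydney and University of Oxford entries) imply the exchange rates used at the time of conversion; a quick sketch, using only figures recorded in the entries:

```python
# Implied exchange rates behind the two currency-converted grants above.
# Figures are those recorded in the entries; the rates are derived from them,
# not independently sourced.
aud_grant, usd_from_aud = 45_000, 32_620.50   # University of Sydney grant
gbp_grant, usd_from_gbp = 20_000, 26_086.00   # University of Oxford gift

aud_usd_rate = usd_from_aud / aud_grant   # USD per AUD at time of conversion
gbp_usd_rate = usd_from_gbp / gbp_grant   # USD per GBP at time of conversion

print(f"AUD -> USD: {aud_usd_rate:.4f}")  # 0.7249
print(f"GBP -> USD: {gbp_usd_rate:.4f}")  # 1.3043
```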
Open Philanthropy Project | Center for a New American Security | 400,352.00 | 2018-09 | Global catastrophic risks | https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/center-for-a-new-american-security-richard-danzig-outreach-on-technological-risk-2018 Grant to support outreach by Richard Danzig, former Secretary of the Navy, on technological risks. This is a renewal and expansion of the August 2017 grant https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/center-for-a-new-american-security-richard-danzig-outreach-on-technological-risk#footnote1_ix4f0ts which allowed Dr. Danzig to produce Technology Roulette https://www.cnas.org/publications/reports/technology-roulette, a report intended for the national security community detailing the management of risks from losing control of advanced technology. Dr. Danzig intends to use these new funds to continue sharing these ideas with U.S. government officials, as well as spreading them to national security leaders abroad. Announced: 2018-10-20.
Open Philanthropy Project | Machine Intelligence Research Institute | 150,000.00 | 2018-06 | AI safety | https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-ai-safety-retraining-program Donation process: The grant is a discretionary grant, so the approval process is short-circuited; see https://www.openphilanthropy.org/giving/grants/discretionary-grants for more

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to support the artificial intelligence safety retraining project. MIRI intends to use these funds to provide stipends, structure, and guidance to promising computer programmers and other technically proficient individuals who are considering transitioning their careers to focus on potential risks from advanced artificial intelligence. MIRI believes the stipends will make it easier for aligned individuals to leave their jobs and focus full-time on safety. MIRI expects the transition periods to range from three to six months per individual. The MIRI blog post https://intelligence.org/2018/09/01/summer-miri-updates/ says: "Buck [Shlegeris] is currently selecting candidates for the program; to date, we’ve made two grants to individuals."

Other notes: The grant is mentioned by MIRI in https://intelligence.org/2018/09/01/summer-miri-updates/. Announced: 2018-06-27.
Open Philanthropy Project | Early-Career Funding for Global Catastrophic Biological Risks | 570,000.00 | 2018-05 | Biosecurity and pandemic preparedness | https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/early-career-funding-global-catastrophic-biological-risks Total over three years in flexible support to enable five early-career people to pursue work and study related to global catastrophic biological risks. The original grant amount was $515,000; $55,000 was added in October 2018. Announced: 2018-08-24.
Open Philanthropy Project | David Manheim | 65,308.00 | 2017-11 | Biosecurity and pandemic preparedness | https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/david-manheim-research-existential-risk Grant to perform a research and analysis project, "Eliciting Evaluations of Existential Risk from Infectious Disease." Announced: 2018-01-30.
Open Philanthropy Project | Solar Radiation Management Governance Initiative | 2,000,000.00 | 2017-09 | Global catastrophic risks | https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/solar-radiation-management-governance-initiative-general-support-2017 Grant awarded via the Environmental Defense Fund for general support. The funding is intended to help support SRMGI’s ongoing governance work related to solar radiation management (SRM), and will additionally help support a new research fund for modeling the impacts of SRM across the developing world, called the Developing Country Impacts Modeling Analysis for SRM (DECIMALS). Announced: 2017-10-09.
Open Philanthropy Project | New Partnership for Africa’s Development | 2,350,000.00 | 2017-04 | Scientific research/malaria/gene drive | https://www.openphilanthropy.org/focus/scientific-research/miscellaneous/new-partnership-africa-s-development-general-support Grant to the Planning and Coordinating Agency, the technical arm of the African Union, to support the evaluation, preparation, and potential deployment of gene drive technologies in some African regions. Part of a set of grants related to gene drives; see https://www.openphilanthropy.org/focus/scientific-research/miscellaneous/target-malaria-general-support for a larger grant to Target Malaria in the same domain and at around the same time. Announced: 2017-05-26.

Donation amounts by donee and year

Donee | Donors influenced | Cause area | Metadata | Total | 2020 | 2019 | 2018 | 2017
Machine Intelligence Research Institute | Open Philanthropy Project (filter this donor) | AI safety | FB Tw WP Site CN GS TW | 10,506,250.00 | 7,703,750.00 | 2,652,500.00 | 150,000.00 | 0.00
Centre for Effective Altruism | Open Philanthropy Project (filter this donor) | Effective altruism/movement growth | FB Site | 8,658,966.00 | 4,146,795.00 | 4,512,171.00 | 0.00 | 0.00
80,000 Hours | Open Philanthropy Project (filter this donor) | Career coaching/life guidance | FB Tw WP Site | 8,253,087.00 | 3,457,284.00 | 4,795,803.00 | 0.00 | 0.00
New Partnership for Africa’s Development | Open Philanthropy Project (filter this donor) | -- | -- | 2,350,000.00 | 0.00 | 0.00 | 0.00 | 2,350,000.00
Solar Radiation Management Governance Initiative | Open Philanthropy Project (filter this donor) | -- | -- | 2,000,000.00 | 0.00 | 0.00 | 0.00 | 2,000,000.00
Nuclear Threat Initiative | Open Philanthropy Project (filter this donor) | -- | FB Tw WP Site | 1,904,942.00 | 0.00 | 0.00 | 1,904,942.00 | 0.00
Center for International Security and Cooperation | Open Philanthropy Project (filter this donor) | -- | WP | 1,625,000.00 | 0.00 | 1,625,000.00 | 0.00 | 0.00
Ought | Open Philanthropy Project (filter this donor) | AI safety | Site | 1,593,333.00 | 1,593,333.00 | 0.00 | 0.00 | 0.00
Effective Altruism Foundation | Open Philanthropy Project (filter this donor) | Effective altruism/movement growth | FB Tw Site | 1,000,000.00 | 0.00 | 1,000,000.00 | 0.00 | 0.00
MIT Media Lab | Open Philanthropy Project (filter this donor) | -- | -- | 1,000,000.00 | 0.00 | 1,000,000.00 | 0.00 | 0.00
Early-Career Funding for Global Catastrophic Biological Risks | Open Philanthropy Project (filter this donor) | -- | -- | 570,000.00 | 0.00 | 0.00 | 570,000.00 | 0.00
Altruistic Technology Labs | Open Philanthropy Project (filter this donor) | -- | -- | 440,525.00 | 0.00 | 440,525.00 | 0.00 | 0.00
Ethan Alley | Open Philanthropy Project (filter this donor) | -- | -- | 437,800.00 | 0.00 | 437,800.00 | 0.00 | 0.00
International Genetically Engineered Machine Foundation | Open Philanthropy Project (filter this donor) | -- | -- | 420,000.00 | 0.00 | 0.00 | 420,000.00 | 0.00
Center for a New American Security | Open Philanthropy Project (filter this donor) | -- | -- | 400,352.00 | 0.00 | 0.00 | 400,352.00 | 0.00
Center for Applied Rationality | Open Philanthropy Project (filter this donor) | Rationality | FB Tw WP Site TW | 375,000.00 | 375,000.00 | 0.00 | 0.00 | 0.00
David Manheim | Open Philanthropy Project (filter this donor) | -- | -- | 65,308.00 | 0.00 | 0.00 | 0.00 | 65,308.00
University of Sydney | Open Philanthropy Project (filter this donor) | -- | -- | 32,621.00 | 0.00 | 0.00 | 32,621.00 | 0.00
University of Oxford | Open Philanthropy Project (filter this donor) | -- | FB Tw WP Site | 26,086.00 | 0.00 | 0.00 | 26,086.00 | 0.00
Biosecure Ltd | Open Philanthropy Project (filter this donor) | -- | -- | 25,000.00 | 0.00 | 0.00 | 25,000.00 | 0.00
Tampere University | Open Philanthropy Project (filter this donor) | -- | -- | 15,000.00 | 0.00 | 15,000.00 | 0.00 | 0.00
Total | -- | -- | -- | 41,699,270.00 | 17,276,162.00 | 16,478,799.00 | 3,529,001.00 | 4,415,308.00
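As a consistency check, the table's per-row and per-column sums can be verified with a short script. This is an illustrative sketch with values transcribed from the table above, not part of the portal's own code:

```python
# Consistency check for the "Donation amounts by donee and year" table.
# Each tuple is (total, y2020, y2019, y2018, y2017), transcribed from the table.
rows = [
    (10_506_250.00, 7_703_750.00, 2_652_500.00, 150_000.00, 0.00),  # MIRI
    (8_658_966.00, 4_146_795.00, 4_512_171.00, 0.00, 0.00),         # Centre for Effective Altruism
    (8_253_087.00, 3_457_284.00, 4_795_803.00, 0.00, 0.00),         # 80,000 Hours
    (2_350_000.00, 0.00, 0.00, 0.00, 2_350_000.00),                 # New Partnership for Africa's Development
    (2_000_000.00, 0.00, 0.00, 0.00, 2_000_000.00),                 # SRMGI
    (1_904_942.00, 0.00, 0.00, 1_904_942.00, 0.00),                 # Nuclear Threat Initiative
    (1_625_000.00, 0.00, 1_625_000.00, 0.00, 0.00),                 # CISAC
    (1_593_333.00, 1_593_333.00, 0.00, 0.00, 0.00),                 # Ought
    (1_000_000.00, 0.00, 1_000_000.00, 0.00, 0.00),                 # Effective Altruism Foundation
    (1_000_000.00, 0.00, 1_000_000.00, 0.00, 0.00),                 # MIT Media Lab
    (570_000.00, 0.00, 0.00, 570_000.00, 0.00),                     # Early-Career Funding for GCBRs
    (440_525.00, 0.00, 440_525.00, 0.00, 0.00),                     # Altruistic Technology Labs
    (437_800.00, 0.00, 437_800.00, 0.00, 0.00),                     # Ethan Alley
    (420_000.00, 0.00, 0.00, 420_000.00, 0.00),                     # iGEM Foundation
    (400_352.00, 0.00, 0.00, 400_352.00, 0.00),                     # Center for a New American Security
    (375_000.00, 375_000.00, 0.00, 0.00, 0.00),                     # Center for Applied Rationality
    (65_308.00, 0.00, 0.00, 0.00, 65_308.00),                       # David Manheim
    (32_621.00, 0.00, 0.00, 32_621.00, 0.00),                       # University of Sydney
    (26_086.00, 0.00, 0.00, 26_086.00, 0.00),                       # University of Oxford
    (25_000.00, 0.00, 0.00, 25_000.00, 0.00),                       # Biosecure Ltd
    (15_000.00, 0.00, 15_000.00, 0.00, 0.00),                       # Tampere University
]

# Each donee's year columns should add up to its Total column.
for total, *years in rows:
    assert abs(total - sum(years)) < 0.01, (total, years)

# Column sums should match the table's Total row.
col_sums = [sum(r[i] for r in rows) for i in range(5)]
assert col_sums == [41_699_270.00, 17_276_162.00, 16_478_799.00, 3_529_001.00, 4_415_308.00]
print("table totals check out")
```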

Graph of spending by donee and year (incremental, not cumulative): [graph not included in this text version]

Graph of spending by donee and year (cumulative): [graph not included in this text version]

Donation amounts by donor and year for influencer Claire Zabel|Committee for Effective Altruism Support

Donor | Donees | Total | 2020 | 2019 | 2018 | 2017
Open Philanthropy Project (filter this donee) | 80,000 Hours (filter this donee), Altruistic Technology Labs (filter this donee), Center for a New American Security (filter this donee), Center for Applied Rationality (filter this donee), Center for International Security and Cooperation (filter this donee), Centre for Effective Altruism (filter this donee), David Manheim (filter this donee), Early-Career Funding for Global Catastrophic Biological Risks (filter this donee), Effective Altruism Foundation (filter this donee), Ethan Alley (filter this donee), International Genetically Engineered Machine Foundation (filter this donee), Machine Intelligence Research Institute (filter this donee), MIT Media Lab (filter this donee), New Partnership for Africa’s Development (filter this donee), Nuclear Threat Initiative (filter this donee), Ought (filter this donee), Solar Radiation Management Governance Initiative (filter this donee), Tampere University (filter this donee), University of Oxford (filter this donee), University of Sydney (filter this donee) | 41,674,270.00 | 17,276,162.00 | 16,478,799.00 | 3,504,001.00 | 4,415,308.00
Open Philanthropy Project (filter this donee) | Biosecure Ltd (filter this donee) | 25,000.00 | 0.00 | 0.00 | 25,000.00 | 0.00
Total | -- | 41,699,270.00 | 17,276,162.00 | 16,478,799.00 | 3,529,001.00 | 4,415,308.00

Graph of spending by donor and year (incremental, not cumulative): [graph not included in this text version]

Graph of spending by donor and year (cumulative): [graph not included in this text version]