Claire Zabel money moved

This is an online portal with information on donations that were announced publicly (or have been shared with permission) that were of interest to Vipul Naik. The git repository with the code for this portal, as well as all the underlying data, is available on GitHub. All payment amounts are in current United States dollars (USD). The repository of donations is being seeded with an initial collation by Issa Rice as well as continued contributions from him (see his commits and the contract work page listing all financially compensated contributions to the site) but all responsibility for errors and inaccuracies belongs to Vipul Naik. Current data is preliminary and has not been completely vetted and normalized; if sharing a link to this site or any page on this site, please include the caveat that the data is preliminary (if you want to share without including caveats, please check with Vipul Naik). We expect to have completed the first round of development by the end of March 2023. See the about page for more details. Also of interest: pageview data on analytics.vipulnaik.com, tutorial in README, request for feedback to EA Forum.

Table of contents

Full list of donations in reverse chronological order (23 donations)

Donor | Donee | Amount (current USD) | Donation date | Cause area | URL | Notes
Open Philanthropy | Effective Altruism Funds: Effective Altruism Infrastructure Fund | 500,000.00 | 2021-10 | Effective altruism | https://www.openphilanthropy.org/giving/grants/centre-effective-altruism-infrastructure-fund | Donation process: https://funds.effectivealtruism.org/funds/payouts/may-august-2021-ea-infrastructure-fund-grants says: "the EAIF has a large funding gap of $3M+ per year that currently looks like it may only be possible to fill by grants from large donors such as Open Philanthropy. [...] For this reason, we have started to apply for grants from large funders"

Intended use of funds (category): Regranting

Intended use of funds: The funding is to be used for the EAIF's ongoing grantmaking. Since the EAIF expects to grant $3 million+ per year, this funding would effectively get spent within the next few months. The grant page says: "The EAIF intends to re-grant this funding to interventions that aim to increase the impact of projects related to effective altruism, by increasing those projects’ access to talent, capital, and knowledge." https://funds.effectivealtruism.org/funds/payouts/may-august-2021-ea-infrastructure-fund-grants also says: "by default, the EAIF will not fund organizations that are Open Philanthropy grantees and that plan to apply for renewed funding from Open Philanthropy in the future."
Open Philanthropy | Center for Applied Rationality | 375,000.00 | 2020-02 | Rationality improvement | https://www.openphilanthropy.org/giving/grants/center-applied-rationality-general-support-2020 | Intended use of funds (category): Organizational general support

Intended use of funds: The grant page says: "CFAR is an adult education nonprofit that seeks to find and develop cognitive tools and to deliver these tools to promising individuals, groups, organizations, and networks focused on solving large and pressing problems. [...] They introduce people to and/or strengthen their connections with the effective altruism (EA) community and way of thinking, which we hope results in people with outstanding potential pursuing more impactful career trajectories. CFAR is particularly interested in growing the talent pipeline for work on potential risks from advanced artificial intelligence (AI)."

Donor reason for selecting the donee: The grant page says: "Our primary interest in these workshops is that we believe they introduce people to and/or strengthen their connections with the effective altruism (EA) community and way of thinking, which we hope results in people with outstanding potential pursuing more impactful career trajectories." Also: "CFAR is particularly interested in growing the talent pipeline for work on potential risks from advanced artificial intelligence (AI). More on our interest in supporting work [...]"

Donor reason for donating that amount (rather than a bigger or smaller amount): Amount chosen to provide one year of operating support

Donor reason for donating at this time (rather than earlier or later): Timing determined by the end of the funding timeframe of the previous two-year grant https://www.openphilanthropy.org/giving/grants/center-applied-rationality-general-support-2018 made in January 2018
Intended funding timeframe in months: 12

Donor thoughts on making further donations to the donee: This is an exit grant, so Open Phil does not plan to make further grants to CFAR.

Other notes: Announced: 2020-04-20.
Open Philanthropy | Machine Intelligence Research Institute | 7,703,750.00 | 2020-02 | AI safety | https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support-2020 | Donation process: The decision of whether to donate seems to have followed the Open Philanthropy Project's usual process, but the exact amount to donate was determined by the Committee for Effective Altruism Support using the process described at https://www.openphilanthropy.org/committee-effective-altruism-support

Intended use of funds (category): Organizational general support

Intended use of funds: MIRI plans to use these funds for ongoing research and activities related to AI safety

Donor reason for selecting the donee: The grant page says "we see the basic pros and cons of this support similarly to what we’ve presented in past writeups on the matter" with the most similar previous grant being https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support-2019 (February 2019). Past writeups include the grant pages for the October 2017 three-year support https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support-2017 and the August 2016 one-year support https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support

Donor reason for donating that amount (rather than a bigger or smaller amount): The amount is decided by the Committee for Effective Altruism Support https://www.openphilanthropy.org/committee-effective-altruism-support but individual votes and reasoning are not public. Three other grants decided by CEAS at around the same time are: Centre for Effective Altruism ($4,146,795), 80,000 Hours ($3,457,284), and Ought ($1,593,333).

Donor reason for donating at this time (rather than earlier or later): Reasons for timing are not discussed, but this is likely the time when the Committee for Effective Altruism Support does its 2020 allocation.
Intended funding timeframe in months: 24

Other notes: The donee describes the grant in the blog post https://intelligence.org/2020/04/27/miris-largest-grant-to-date/ (2020-04-27) along with other funding it has received ($300,000 from the Berkeley Existential Risk Initiative and $100,000 from the Long-Term Future Fund). The fact that the grant is a two-year grant is mentioned here, but not in the grant page on Open Phil's website. The page also mentions that of the total grant amount of $7.7 million, $6.24 million is coming from Open Phil's normal funders (Good Ventures) and the remaining $1.46 million is coming from Ben Delo, co-founder of the cryptocurrency trading platform BitMEX, as part of a funding partnership https://www.openphilanthropy.org/blog/co-funding-partnership-ben-delo announced November 11, 2019. Announced: 2020-04-10.
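The co-funding split described above is simple arithmetic; a quick sketch checking it, using the rounded figures from MIRI's blog post (the exact grant total on Open Phil's page is $7,703,750):

```python
# Co-funding split for the ~$7.7 million MIRI grant, figures as rounded
# in MIRI's 2020-04-27 blog post.
good_ventures = 6_240_000  # from Open Phil's usual funder, Good Ventures
ben_delo = 1_460_000       # via the Ben Delo co-funding partnership
total = good_ventures + ben_delo

print(f"{total:,}")              # 7,700,000 -- matches the "~$7.7 million"
print(f"{7_703_750 - total:,}")  # 3,750 -- gap is just rounding in the blog post
```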
Open Philanthropy | Berkeley Existential Risk Initiative | 150,000.00 | 2020-01 | AI safety | https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/berkeley-existential-risk-initiative-general-support | Intended use of funds (category): Organizational general support

Intended use of funds: The grant page says: "BERI seeks to reduce existential risks to humanity, and collaborates with other long-termist organizations, including the Center for Human-Compatible AI at UC Berkeley. This funding is intended to help BERI establish new collaborations."
Open Philanthropy | Effective Altruism Foundation | 1,000,000.00 | 2019-07 | Effective altruism | https://www.openphilanthropy.org/giving/grants/effective-altruism-foundation-research-operations | Intended use of funds (category): Direct project expenses

Intended use of funds: Grant "to support research and operations"

Donor reason for selecting the donee: The grant page says "A major purpose of this grant is to encourage and support EAF and our other grantees in the space in taking approaches to longtermism with greater emphasis on shared objectives between different value systems. We conceive of this grant as falling under our work aimed at growing and supporting the EA community." Earlier in the document, past reservations that Open Phil has had about EAF are described: "EAF is an organization whose values put a particular emphasis on trying to reduce the risks of future suffering. While preventing suffering is a value we share, we also believe that the speculative and suffering-focused nature of this work means that it needs to be communicated about carefully, and could be counterproductive otherwise. As a result, we have felt ambivalent about EAF’s work to date (despite feeling unambiguously positively about some of their projects)."

Other notes: The grant would be discussed further by Simon Knutsson in his critical post https://www.simonknutsson.com/problems-in-effective-altruism-and-existential-risk-and-what-to-do-about-them/ that also includes discussion of guidelines that Nick Beckstead of the Open Philanthropy Project developed, and that EAF was now adopting and encouraging others to adopt. Knutsson sees the adoption of the guidelines as being linked to the grant money, due to both the timing matching and the language of the grant page. On separate pages, Knutsson publishes correspondence between him and people at Open Phil and EAF where he tried to get more specific information from the two organizations: https://www.simonknutsson.com/e-mail-exchange-with-the-open-philanthropy-project and https://www.simonknutsson.com/message-exchange-with-eaf/. Intended funding timeframe in months: 24; announced: 2019-07-30.
Open Philanthropy | Altruistic Technology Labs | 440,525.00 | 2019-05 | Biosecurity and pandemic preparedness | https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/altruistic-technology-labs-biological-risk-prevention | Intended use of funds (category): Organizational general support

Intended use of funds: The grantee, "AltLabs", a new organization, intends to use these funds to hire initial staff and pursue various research projects related to catastrophic risk reduction, including machine-learning-based attribution of engineered DNA and broad-spectrum infectious disease diagnostics.

Other notes: Announced: 2019-07-18.
Open Philanthropy | Ethan Alley | 437,800.00 | 2019-05 | Global catastrophic risks | https://www.openphilanthropy.org/focus/global-catastrophic-risks/scholarship-support-2019 | Intended use of funds (category): Living expenses during research project

Intended use of funds: The grant page says the grant is "over four years in scholarship funds support to Ethan Alley to pursue a PhD at the Massachusetts Institute of Technology. The funding is intended to be used for his tuition, fees, healthcare, and a living stipend during his degree program."

Donor reason for selecting the donee: The grant page says the grant "is part of an effort to support value-aligned and qualified early-career researchers interested in global catastrophic risks."

Donor reason for donating that amount (rather than a bigger or smaller amount): The grant page says: "The funding is intended to be used for his tuition, fees, healthcare, and a living stipend during his degree program [over four years]" so the amount is likely determined based on the sum of the costs of these over four years

Donor reason for donating at this time (rather than earlier or later): Likely determined by the start time of the grantee's PhD program
Intended funding timeframe in months: 48

Donor thoughts on making further donations to the donee: The grant page calls the grant "part of an effort to support value-aligned and qualified early-career researchers interested in global catastrophic risks" so it will likely be followed by other similar grants to other researchers

Other notes: Announced: 2019-07-18.
Open Philanthropy | Tampere University | 15,000.00 | 2019-04 | Global catastrophic risks | https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/tampere-university-2019 | Donation process: Discretionary grant

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant "to support Professor Hiski Haukkala’s efforts related to global catastrophic risks. Haukkala, a Finnish professor of international relations, plans to use the funding to bring speakers to Finland to discuss existential risks, to attend events related to existential risks, and to support networking and related projects."

Other notes: Announced: 2019-06-07.
Open Philanthropy | MIT Media Lab | 1,000,000.00 | 2019-03 | Global catastrophic risks|Global health|Animal welfare | https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/massachusetts-institute-technology-media-lab-kevin-esvelt | Intended use of funds (category): Direct project expenses

Intended use of funds: Grant over two years to the MIT Media Lab to support the research of Professor Kevin Esvelt. Professor Esvelt plans to use this funding to conduct research on global catastrophic risks, global health, and animal welfare.

Other notes: Intended funding timeframe in months: 24; announced: 2019-06-26.
Open Philanthropy | Machine Intelligence Research Institute | 2,652,500.00 | 2019-02 | AI safety | https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support-2019 | Donation process: The decision of whether to donate seems to have followed the Open Philanthropy Project's usual process, but the exact amount to donate was determined by the Committee for Effective Altruism Support using the process described at https://www.openphilanthropy.org/committee-effective-altruism-support

Intended use of funds (category): Organizational general support

Intended use of funds: MIRI plans to use these funds for ongoing research and activities related to AI safety. Planned activities include alignment research, a summer fellows program, computer scientist workshops, and internship programs.

Donor reason for selecting the donee: The grant page says: "we see the basic pros and cons of this support similarly to what we’ve presented in past writeups on the matter" Past writeups include the grant pages for the October 2017 three-year support https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support-2017 and the August 2016 one-year support https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support

Donor reason for donating that amount (rather than a bigger or smaller amount): Amount decided by the Committee for Effective Altruism Support (CEAS) https://www.openphilanthropy.org/committee-effective-altruism-support but individual votes and reasoning are not public. Two other grants with amounts decided by CEAS, made at the same time and therefore likely drawing from the same money pot, are to the Centre for Effective Altruism ($2,756,250) and 80,000 Hours ($4,795,803). The original amount of $2,112,500 is split across two years, and therefore ~$1.06 million per year. https://intelligence.org/2019/04/01/new-grants-open-phil-beri/ clarifies that the amount for 2019 is on top of the third year of three-year $1.25 million/year support announced in October 2017, and the total $2.31 million represents Open Phil's full intended funding for MIRI for 2019, but the amount for 2020 of ~$1.06 million is a lower bound, and Open Phil may grant more for 2020 later. In November 2019, additional funding would bring the total award amount to $2,652,500.
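The year-by-year arithmetic in the paragraph above can be checked directly (all dollar figures are taken from that paragraph; the per-year split is approximate):

```python
# Year-by-year breakdown of the MIRI 2019 CEAS award.
original_award = 2_112_500      # original CEAS amount, split across 2019 and 2020
per_year = original_award / 2   # the "~$1.06 million per year"
prior_2019_support = 1_250_000  # third year of the October 2017 three-year grant
total_2019 = per_year + prior_2019_support
november_topup = 2_652_500 - original_award  # addition bringing the award to $2,652,500

print(f"{per_year:,.0f}")     # 1,056,250
print(f"{total_2019:,.0f}")   # 2,306,250 -- the "~$2.31 million" for 2019
print(f"{november_topup:,}")  # 540,000
```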

Donor reason for donating at this time (rather than earlier or later): Reasons for timing are not discussed, but likely reasons include: (1) The original three-year funding period https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support-2017 is coming to an end, (2) Even though there is time before the funding period ends, MIRI has grown in budget and achievements, so a suitable funding amount could be larger, (3) The Committee for Effective Altruism Support https://www.openphilanthropy.org/committee-effective-altruism-support did its first round of money allocation, so the timing is determined by the timing of that allocation round.
Intended funding timeframe in months: 24

Donor thoughts on making further donations to the donee: According to https://intelligence.org/2019/04/01/new-grants-open-phil-beri/ Open Phil may increase its level of support for 2020 beyond the ~$1.06 million that is part of this grant.

Donor retrospective of the donation: The much larger followup grant https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support-2020 with a very similar writeup suggests that Open Phil and the Committee for Effective Altruism Support would continue to stand by the reasoning for the grant.

Other notes: The grantee, MIRI, discusses the grant on its website at https://intelligence.org/2019/04/01/new-grants-open-phil-beri/ along with a $600,000 grant from the Berkeley Existential Risk Initiative. Announced: 2019-04-01.
Open Philanthropy | Center for International Security and Cooperation | 1,625,000.00 | 2019-01 | Biosecurity and pandemic preparedness | https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/center-international-security-and-cooperation-biosecurity-research-2019 | Grant over three years to Stanford University’s Center for International Security and Cooperation (CISAC) to support Megan Palmer’s work on biosecurity. This research is focused on developing ways to improve governance of biological science and to reduce the risk of misuse of advanced biotechnology. This funding is intended to allow Dr. Palmer to continue and extend a study on the attitudes of participants in the International Genetically Engineered Machine (iGEM) competition, to better understand how institutional environments, safety practices, or competition incentives might motivate young scientists and engineers. The grant is a renewal of the October 2016 grant https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/center-international-security-and-cooperation-biosecurity-research. Announced: 2019-02-12.
Open Philanthropy | Biosecure Ltd | 25,000.00 | 2018-12 | Biosecurity and pandemic preparedness | https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/biosecure-campaign-against-bioweapons-research | Donation process: Discretionary grant structured through a contractor agreement. The grant page says: "While we do not typically publish pages for contractor agreements, we chose to write about this funding because we view it as conceptually similar to an ordinary grant, despite its structure as a contract due to the recipient’s organizational form."

Intended use of funds (category): Direct project expenses

Intended use of funds: Grantee "intends to use these funds to explore different models for strengthening the societal norm against biological weapons and reducing the likelihood of an arms race involving biological weapons, as well as investigating the feasibility, costs, and potential benefits of the various models."

Other notes: Announced: 2019-06-07.
Open Philanthropy | University of Sydney | 32,621.00 | 2018-12 | Biosecurity and pandemic preparedness | https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/university-of-sydney-global-health-security-conference | Grant of 45,000 AUD ($32,620.50 at the time of conversion) to the University of Sydney to support the 2019 Global Health Security Conference in Sydney, Australia. The funds are intended for general support of the conference, and to support travel bursaries to allow participants from low-income countries to attend a gathering of the global health security community, including academics, policymakers, and practitioners. Announced: 2019-01-17.
Open Philanthropy | Nuclear Threat Initiative | 1,904,942.00 | 2018-11 | Biosecurity and pandemic preparedness | https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/nuclear-threat-initiative-projects-to-reduce-global-catastrophic-biological-risks | Intended use of funds (category): Direct project expenses

Intended use of funds: Grant "to support projects to reduce Global Catastrophic Biological Risks (GCBRs). NTI intends to use these funds to support projects including, among others, strengthening the Biological Weapons Convention and reducing state biological threats and additional GCBRs through international dialogues."

Other notes: Intended funding timeframe in months: 36; announced: 2018-12-13.
Open Philanthropy | International Genetically Engineered Machine Foundation | 420,000.00 | 2018-11 | Biosecurity and pandemic preparedness | https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/igem-synthetic-biology-safety-and-security-2018 | Grant over two years to the International Genetically Engineered Machine (iGEM) Foundation for its work on safety and security, led by Piers Millett. iGEM is an international synthetic biology competition for students. Donor believes that supporting iGEM’s safety and security work could help raise awareness about biosecurity among current and future synthetic biologists. Renewal of May 2016 grant https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/igem-synthetic-biology-safety-and-security. Announced: 2019-01-31.
Open Philanthropy | University of Oxford | 26,086.00 | 2018-10 | Biosecurity and pandemic preparedness | https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/oxford-university-dphil-support-for-andrew-snyder-beattie | Intended use of funds (category): Direct project expenses

Intended use of funds: Grant "to support the research of the Mathematical Ecology Research Group and the research costs of Andrew Snyder-Beattie, who recently served as Director of Research at the Future of Humanity Institute and a member of FHI’s Biotechnology Research Team."

Donor retrospective of the donation: Andrew Snyder-Beattie, whose work the grant would fund, would later be hired by Open Philanthropy and lead its biosecurity and pandemic preparedness program. This probably indicates continued satisfaction with him.

Other notes: Currency info: donation given as 20,000.00 GBP (conversion done via donor calculation); announced: 2018-10-29.
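For the two non-USD grants on this page, the USD figure is simply the original amount times the conversion rate at the donor's stated conversion. A sketch computing the implied rates from the listed figures (the rates themselves are not stated in the source):

```python
# Implied conversion rates, computed from the listed grant figures.
gbp_original, gbp_in_usd = 20_000.00, 26_086.00  # University of Oxford grant
aud_original, aud_in_usd = 45_000.00, 32_620.50  # University of Sydney grant

print(f"GBP->USD: {gbp_in_usd / gbp_original:.4f}")  # GBP->USD: 1.3043
print(f"AUD->USD: {aud_in_usd / aud_original:.4f}")  # AUD->USD: 0.7249
```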
Open Philanthropy | Center for a New American Security | 400,352.00 | 2018-09 | Global catastrophic risks | https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/center-for-a-new-american-security-richard-danzig-outreach-on-technological-risk-2018 | Grant to support outreach by Richard Danzig, former Secretary of the Navy, on technological risks. This is a renewal and expansion of the August 2017 grant https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/center-for-a-new-american-security-richard-danzig-outreach-on-technological-risk#footnote1_ix4f0ts which allowed Dr. Danzig to produce Technology Roulette (https://www.cnas.org/publications/reports/technology-roulette), a report intended for the national security community detailing the management of risks from losing control of advanced technology. Dr. Danzig intends to use these new funds to continue sharing these ideas with U.S. government officials, as well as spreading them to national security leaders abroad. Announced: 2018-10-20.
Open Philanthropy | Machine Intelligence Research Institute | 150,000.00 | 2018-06 | AI safety | https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-ai-safety-retraining-program | Donation process: The grant is a discretionary grant, so the approval process is short-circuited; see https://www.openphilanthropy.org/giving/grants/discretionary-grants for more

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to support the artificial intelligence safety retraining project. MIRI intends to use these funds to provide stipends, structure, and guidance to promising computer programmers and other technically proficient individuals who are considering transitioning their careers to focus on potential risks from advanced artificial intelligence. MIRI believes the stipends will make it easier for aligned individuals to leave their jobs and focus full-time on safety. MIRI expects the transition periods to range from three to six months per individual. The MIRI blog post https://intelligence.org/2018/09/01/summer-miri-updates/ says: "Buck [Shlegeris] is currently selecting candidates for the program; to date, we’ve made two grants to individuals."

Other notes: The grant is mentioned by MIRI in https://intelligence.org/2018/09/01/summer-miri-updates/. Announced: 2018-06-27.
Open Philanthropy | Early-Career Funding for Global Catastrophic Biological Risks | 570,000.00 | 2018-05 | Biosecurity and pandemic preparedness | https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/early-career-funding-global-catastrophic-biological-risks | Total over three years in flexible support to enable five early-career people to pursue work and study related to global catastrophic biological risks. Original grant amount $515,000; $55,000 was added on top in October 2018. Announced: 2018-08-24.
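The listed total for this grant is the original award plus the later addition; a one-line check:

```python
# Early-career GCBR funding: original award plus the October 2018 addition.
original_grant = 515_000
october_2018_addition = 55_000

print(original_grant + october_2018_addition)  # 570000 -- the listed amount
```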
Open Philanthropy | David Manheim | 65,308.00 | 2017-11 | Biosecurity and pandemic preparedness | https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/david-manheim-research-existential-risk | Grant to perform a research and analysis project, "Eliciting Evaluations of Existential Risk from Infectious Disease." Announced: 2018-01-30.
Open Philanthropy | The Degrees Initiative | 2,000,000.00 | 2017-09 | Climate change/geoengineering/solar radiation management | https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/solar-radiation-management-governance-initiative-general-support-2017 | Intended use of funds (category): Organizational general support

Intended use of funds: The grant page says: "The funding is intended to help support SRMGI’s on-going governance work related to solar radiation management (SRM), and will additionally help support a new research fund for modeling the impacts of SRM across the developing world, called the Developing Country Impacts Modeling Analysis for SRM (DECIMALS)."

Donor reason for selecting the donee: The grant page says: "In short, our impression is that there is globally very little work focused on improving the governance of potential SRM implementation, and we therefore consider funding SRMGI as an opportunity to build a sparse field. Additionally, we are inclined to agree with SRMGI’s case that it is important to increase the engagement of experts in the developing world with SRM governance issues." It links to the previous grant page https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/solar-radiation-management-governance-initiative-general-support for more details.

Donor retrospective of the donation: The followup grant https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/degrees-initiative-general-support-2021 suggests continued satisfaction with the grantee.

Other notes: Grant via the Environmental Defense Fund. At the time, the grantee is called the Solar Radiation Management Governance Initiative. Intended funding timeframe in months: 36; announced: 2017-10-09.
Open Philanthropy | New Partnership for Africa’s Development | 2,350,000.00 | 2017-04 | Scientific research/malaria/gene drive | https://www.openphilanthropy.org/focus/scientific-research/miscellaneous/new-partnership-africa-s-development-general-support | Grant to the Planning and Coordinating Agency, the technical arm of the African Union, to support the evaluation, preparation, and potential deployment of gene drive technologies in some African regions. Part of a set of grants related to gene drives; see https://www.openphilanthropy.org/focus/scientific-research/miscellaneous/target-malaria-general-support for a larger grant to Target Malaria in the same domain and at around the same time. Announced: 2017-05-26.
Open Philanthropy | Harvard University | 2,500,000.00 | 2016-12 | Climate change/geoengineering/solar radiation management | https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/harvard-university-solar-geoengineering-research-program | Donation process: https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/harvard-university-solar-geoengineering-research-program#Our_process says: "Claire Zabel, a Research Analyst for the Open Philanthropy Project, spoke and exchanged emails with Professor [David] Keith and Dr. [Gernot] Wagner."

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant "to support the founding of the Solar Geoengineering Research Program (SGRP) as part of the Harvard University Center for the Environment. This program will be a coordinated research effort focusing on solar geoengineering research, governance, and advocacy led by Professor David Keith and Dr. Gernot Wagner (formerly the Environmental Defense Fund’s lead senior economist)." https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/harvard-university-solar-geoengineering-research-program#Budget_and_proposed_activities lists the kinds of expenses: executive director's salary, creation of a comprehensive blueprint, outreach and convening, advancing science and technology, assessing efficacy and risks, governance and social implications, and Harvard-wide faculty grants.

Donor reason for selecting the donee: https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/harvard-university-solar-geoengineering-research-program#Case_for_the_grant says: "We see this grant as a good opportunity to build and advance the field of solar geoengineering, especially given SGRP’s emphasis on cooperation and collaboration between researchers and on developing solar geoengineering technology in the manner that is most likely to affect the world positively. [...] we consider Professor Keith to be a top scientist who is relatively aligned with us in terms of being pragmatic, cognizant of tradeoffs, and focused on global rather than national interests. It seems to us that earlier and more research on solar geoengineering will make it more likely that the global community will have an in-depth understanding of technological options and risks in the event that climate engineering is seriously considered as an approach to reducing harms from climate change at some point in the future."

Donor reason for donating at this time (rather than earlier or later): Timing determined by the timing of the founding of the institute. The founding is not solely funded by Open Philanthropy, so Open Philanthropy didn't determine the timing. The grant page says: "Other founding funders include Bill Gates, the Hewlett Foundation, and the Alfred P. Sloan Foundation."
Intended funding timeframe in months: 60

Other notes: https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/harvard-university-solar-geoengineering-research-program#Risks_and_reservations lists risks and reservations, both specific to this grant and general to solar geoengineering. Announced: 2017-04-14.

Donation amounts by donee and year

Donee | Donors influenced | Cause area | Metadata | Total | 2021 | 2020 | 2019 | 2018 | 2017 | 2016
Machine Intelligence Research Institute | Open Philanthropy (filter this donor) | AI safety | FB Tw WP Site CN GS TW | 10,506,250.00 | 0.00 | 7,703,750.00 | 2,652,500.00 | 150,000.00 | 0.00 | 0.00
Harvard University | Open Philanthropy (filter this donor) | | FB Tw WP Site | 2,500,000.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 2,500,000.00
New Partnership for Africa’s Development | Open Philanthropy (filter this donor) | | | 2,350,000.00 | 0.00 | 0.00 | 0.00 | 0.00 | 2,350,000.00 | 0.00
The Degrees Initiative | Open Philanthropy (filter this donor) | | | 2,000,000.00 | 0.00 | 0.00 | 0.00 | 0.00 | 2,000,000.00 | 0.00
Nuclear Threat Initiative | Open Philanthropy (filter this donor) | | FB Tw WP Site | 1,904,942.00 | 0.00 | 0.00 | 0.00 | 1,904,942.00 | 0.00 | 0.00
Center for International Security and Cooperation | Open Philanthropy (filter this donor) | | WP | 1,625,000.00 | 0.00 | 0.00 | 1,625,000.00 | 0.00 | 0.00 | 0.00
MIT Media Lab | Open Philanthropy (filter this donor) | | | 1,000,000.00 | 0.00 | 0.00 | 1,000,000.00 | 0.00 | 0.00 | 0.00
Effective Altruism Foundation | Open Philanthropy (filter this donor) | Effective altruism/movement growth | FB Tw Site | 1,000,000.00 | 0.00 | 0.00 | 1,000,000.00 | 0.00 | 0.00 | 0.00
Early-Career Funding for Global Catastrophic Biological Risks | Open Philanthropy (filter this donor) | | | 570,000.00 | 0.00 | 0.00 | 0.00 | 570,000.00 | 0.00 | 0.00
Effective Altruism Funds: Effective Altruism Infrastructure Fund | Open Philanthropy (filter this donor) | | | 500,000.00 | 500,000.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00
Altruistic Technology Labs | Open Philanthropy (filter this donor) | | | 440,525.00 | 0.00 | 0.00 | 440,525.00 | 0.00 | 0.00 | 0.00
Ethan Alley | Open Philanthropy (filter this donor) | | | 437,800.00 | 0.00 | 0.00 | 437,800.00 | 0.00 | 0.00 | 0.00
International Genetically Engineered Medicine Foundation | Open Philanthropy (filter this donor) | | | 420,000.00 | 0.00 | 0.00 | 0.00 | 420,000.00 | 0.00 | 0.00
Center for a New American Security | Open Philanthropy (filter this donor) | | | 400,352.00 | 0.00 | 0.00 | 0.00 | 400,352.00 | 0.00 | 0.00
Center for Applied Rationality | Open Philanthropy (filter this donor) | Rationality | FB Tw WP Site TW | 375,000.00 | 0.00 | 375,000.00 | 0.00 | 0.00 | 0.00 | 0.00
Berkeley Existential Risk Initiative | Open Philanthropy (filter this donor) | AI safety/other global catastrophic risks | Site TW | 150,000.00 | 0.00 | 150,000.00 | 0.00 | 0.00 | 0.00 | 0.00
David Manheim | Open Philanthropy (filter this donor) | | | 65,308.00 | 0.00 | 0.00 | 0.00 | 0.00 | 65,308.00 | 0.00
University of Sydney | Open Philanthropy (filter this donor) | | | 32,621.00 | 0.00 | 0.00 | 0.00 | 32,621.00 | 0.00 | 0.00
University of Oxford | Open Philanthropy (filter this donor) | | FB Tw WP Site | 26,086.00 | 0.00 | 0.00 | 0.00 | 26,086.00 | 0.00 | 0.00
Biosecure Ltd | Open Philanthropy (filter this donor) | | | 25,000.00 | 0.00 | 0.00 | 0.00 | 25,000.00 | 0.00 | 0.00
Tampere University | Open Philanthropy (filter this donor) | | | 15,000.00 | 0.00 | 0.00 | 15,000.00 | 0.00 | 0.00 | 0.00
Total | ---- | -- | | 26,343,884.00 | 500,000.00 | 8,228,750.00 | 7,170,825.00 | 3,529,001.00 | 4,415,308.00 | 2,500,000.00

Graph of spending by donee and year (incremental, not cumulative)

[Graph not rendered in this static view.]

Graph of spending by donee and year (cumulative)

[Graph not rendered in this static view.]

Donation amounts by donor and year for influencer Claire Zabel

Donor | Donees | Total | 2021 | 2020 | 2019 | 2018 | 2017 | 2016
Open Philanthropy (filter this donor) | Altruistic Technology Labs (filter this donee), Berkeley Existential Risk Initiative (filter this donee), Biosecure Ltd (filter this donee), Center for a New American Security (filter this donee), Center for Applied Rationality (filter this donee), Center for International Security and Cooperation (filter this donee), David Manheim (filter this donee), Early-Career Funding for Global Catastrophic Biological Risks (filter this donee), Effective Altruism Foundation (filter this donee), Effective Altruism Funds: Effective Altruism Infrastructure Fund (filter this donee), Ethan Alley (filter this donee), Harvard University (filter this donee), International Genetically Engineered Medicine Foundation (filter this donee), Machine Intelligence Research Institute (filter this donee), MIT Media Lab (filter this donee), New Partnership for Africa’s Development (filter this donee), Nuclear Threat Initiative (filter this donee), Tampere University (filter this donee), The Degrees Initiative (filter this donee), University of Oxford (filter this donee), University of Sydney (filter this donee) | 26,343,884.00 | 500,000.00 | 8,228,750.00 | 7,170,825.00 | 3,529,001.00 | 4,415,308.00 | 2,500,000.00
Total | -- | 26,343,884.00 | 500,000.00 | 8,228,750.00 | 7,170,825.00 | 3,529,001.00 | 4,415,308.00 | 2,500,000.00

Graph of spending by donor and year (incremental, not cumulative)

[Graph not rendered in this static view.]

Graph of spending by donor and year (cumulative)

[Graph not rendered in this static view.]