This is an online portal with information on donations that were announced publicly (or shared with permission) and that are of interest to Vipul Naik. The git repository with the code for this portal, as well as all the underlying data, is available on GitHub. All payment amounts are in current United States dollars (USD). The repository of donations was seeded with an initial collation by Issa Rice, who continues to contribute (see his commits and the contract work page listing all financially compensated contributions to the site), but all responsibility for errors and inaccuracies belongs to Vipul Naik. Current data is preliminary and has not been completely vetted and normalized; if sharing a link to this site or any page on it, please include the caveat that the data is preliminary (if you want to share without caveats, please check with Vipul Naik). We expect to complete the first round of development by the end of July 2025. See the about page for more details. Also of interest: pageview data on analytics.vipulnaik.com, the tutorial in the README, and the request for feedback on the EA Forum.
There are no documents associated with this influencer.
Donor | Donee | Amount (current USD) | Cause area | URL | Notes |
---|---|---|---|---|---|
Open Philanthropy | Effective Altruism Funds: Effective Altruism Infrastructure Fund | 500,000.00 | Effective altruism | https://www.openphilanthropy.org/giving/grants/centre-effective-altruism-infrastructure-fund | Donation process: https://funds.effectivealtruism.org/funds/payouts/may-august-2021-ea-infrastructure-fund-grants says: "the EAIF has a large funding gap of $3M+ per year that currently looks like it may only be possible to fill by grants from large donors such as Open Philanthropy. [...] For this reason, we have started to apply for grants from large funders" Intended use of funds (category): Regranting Intended use of funds: The funding is to be used for the EAIF's ongoing grantmaking. Since the EAIF expects to grant $3 million+ per year, this funding would effectively get spent within the next few months. The grant page says: "The EAIF intends to re-grant this funding to interventions that aim to increase the impact of projects related to effective altruism, by increasing those projects’ access to talent, capital, and knowledge." https://funds.effectivealtruism.org/funds/payouts/may-august-2021-ea-infrastructure-fund-grants also says: "by default, the EAIF will not fund organizations that are Open Philanthropy grantees and that plan to apply for renewed funding from Open Philanthropy in the future." |
Open Philanthropy | Berkeley Existential Risk Initiative | 210,000.00 | AI safety/technical research/talent pipeline | https://www.openphilanthropy.org/grants/berkeley-existential-risk-initiative-seri-summer-fellowships/ | Intended use of funds (category): Direct project expenses Intended use of funds: Grant "to provide stipends for the Stanford Existential Risks Initiative (SERI) summer research fellowship program." Donor retrospective of the donation: The multiple future grants https://www.openphilanthropy.org/grants/berkeley-existential-risk-initiative-seri-mats-program-2/ https://www.openphilanthropy.org/grants/berkeley-existential-risk-initiative-seri-mats-program/ and https://www.openphilanthropy.org/grants/berkeley-existential-risk-initiative-machine-learning-alignment-theory-scholars/ from Open Philanthropy to BERI for the SERI-MATS program, a successor of sorts to this program, suggest satisfaction with the outcome of this grant. Other notes: Intended funding timeframe in months: 2. |
Open Philanthropy | Centre for the Governance of AI | 450,000.00 | AI safety/governance | https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/gov-ai-general-support | Donation process: The grant was recommended by the Committee for Effective Altruism Support following its process https://www.openphilanthropy.org/committee-effective-altruism-support Intended use of funds (category): Organizational general support Intended use of funds: The grant page says: "GovAI intends to use these funds to support the visit of two senior researchers and a postdoc researcher." Donor reason for selecting the donee: The grant page says "we see the basic pros and cons of this support similarly to what we’ve presented in past writeups on the matter" but does not link to specific past writeups (Open Phil has not previously made grants directly to GovAI). Donor reason for donating that amount (rather than a bigger or smaller amount): The amount is decided by the Committee for Effective Altruism Support https://www.openphilanthropy.org/committee-effective-altruism-support but individual votes and reasoning are not public. Donor retrospective of the donation: The much larger followup grant https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/gov-ai-field-building (December 2021) suggests continued satisfaction with the grantee. Other notes: Grant made via the Berkeley Existential Risk Initiative. |
Open Philanthropy | Machine Intelligence Research Institute | 7,703,750.00 | AI safety/technical research | https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support-2020 | Donation process: The decision of whether to donate seems to have followed the Open Philanthropy Project's usual process, but the exact amount to donate was determined by the Committee for Effective Altruism Support using the process described at https://www.openphilanthropy.org/committee-effective-altruism-support Intended use of funds (category): Organizational general support Intended use of funds: MIRI plans to use these funds for ongoing research and activities related to AI safety Donor reason for selecting the donee: The grant page says "we see the basic pros and cons of this support similarly to what we’ve presented in past writeups on the matter" with the most similar previous grant being https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support-2019 (February 2019). Past writeups include the grant pages for the October 2017 three-year support https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support-2017 and the August 2016 one-year support https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support Donor reason for donating that amount (rather than a bigger or smaller amount): The amount is decided by the Committee for Effective Altruism Support https://www.openphilanthropy.org/committee-effective-altruism-support but individual votes and reasoning are not public. Three other grants decided by CEAS at around the same time are: Centre for Effective Altruism ($4,146,795), 80,000 Hours ($3,457,284), and Ought ($1,593,333). Donor reason for donating at this time (rather than earlier or later): Reasons for timing are not discussed, but this is likely the time when the Committee for Effective Altruism Support does its 2020 allocation. Intended funding timeframe in months: 24 Other notes: The donee describes the grant in the blog post https://intelligence.org/2020/04/27/miris-largest-grant-to-date/ (2020-04-27) along with other funding it has received ($300,000 from the Berkeley Existential Risk Initiative and $100,000 from the Long-Term Future Fund). The fact that the grant is a two-year grant is mentioned here, but not in the grant page on Open Phil's website. The page also mentions that of the total grant amount of $7.7 million, $6.24 million is coming from Open Phil's normal funders (Good Ventures) and the remaining $1.46 million is coming from Ben Delo, co-founder of the cryptocurrency trading platform BitMEX, as part of a funding partnership https://www.openphilanthropy.org/blog/co-funding-partnership-ben-delo announced November 11, 2019. Announced: 2020-04-10. |
Open Philanthropy | Center for Applied Rationality | 375,000.00 | Epistemic institutions | https://www.openphilanthropy.org/giving/grants/center-applied-rationality-general-support-2020 | Intended use of funds (category): Organizational general support Intended use of funds: The grant page says: "CFAR is an adult education nonprofit that seeks to find and develop cognitive tools and to deliver these tools to promising individuals, groups, organizations, and networks focused on solving large and pressing problems. [...] They introduce people to and/or strengthen their connections with the effective altruism (EA) community and way of thinking, which we hope results in people with outstanding potential pursuing more impactful career trajectories. CFAR is particularly interested in growing the talent pipeline for work on potential risks from advanced artificial intelligence (AI)." Donor reason for selecting the donee: The grant page says: "Our primary interest in these workshops is that we believe they introduce people to and/or strengthen their connections with the effective altruism (EA) community and way of thinking, which we hope results in people with outstanding potential pursuing more impactful career trajectories." Also: "CFAR is particularly interested in growing the talent pipeline for work on potential risks from advanced artificial intelligence (AI). More on our interest in supporting work [...]" Donor reason for donating that amount (rather than a bigger or smaller amount): Amount chosen to provide one year of operating support Donor reason for donating at this time (rather than earlier or later): Timing determined by the end of the funding timeframe of the previous two-year grant https://www.openphilanthropy.org/giving/grants/center-applied-rationality-general-support-2018 made January 2018 Intended funding timeframe in months: 12 Donor thoughts on making further donations to the donee: This is an exit grant, so Open Phil does not plan to make further grants to CFAR. Other notes: Announced: 2020-04-20. |
Open Philanthropy | 80,000 Hours | 3,457,284.00 | Effective altruism/movement growth/career counseling | https://www.openphilanthropy.org/giving/grants/80000-hours-general-support-2020 | Donation process: The decision of whether to donate seems to have followed the Open Philanthropy Project's usual process, but the exact amount to donate was determined by the Committee for Effective Altruism Support using the process described at https://www.openphilanthropy.org/committee-effective-altruism-support Intended use of funds (category): Organizational general support Intended use of funds: 80,000 Hours aims to solve skill bottlenecks for career paths in what it considers to be the world’s most pressing problems. It does this by providing online research, in-person advice, and support with the goal of helping talented graduates age 20-40 enter high-impact careers. Donor reason for selecting the donee: Open Phil's grant writeup says: "we see the basic pros and cons of this support similarly to what we’ve presented in past writeups on the matter" with the most recent similar grant being https://www.openphilanthropy.org/giving/grants/80000-hours-general-support-2019 (February 2019) and the most recent grant with a detailed writeup being https://www.openphilanthropy.org/giving/grants/80000-hours-general-support-2018 (February 2018) Donor reason for donating that amount (rather than a bigger or smaller amount): The amount is decided by the Committee for Effective Altruism Support https://www.openphilanthropy.org/committee-effective-altruism-support but individual votes and reasoning are not public. Donor reason for donating at this time (rather than earlier or later): Reasons for timing are not discussed, but this is likely the time when the Committee for Effective Altruism Support does its 2020 allocation. Three other grants decided by CEAS at around the same time are: Machine Intelligence Research Institute ($7,703,750), Centre for Effective Altruism ($4,146,795), and Ought ($1,593,333). Other notes: Announced: 2020-03-09. |
Open Philanthropy | Berkeley Existential Risk Initiative | 150,000.00 | AI safety/technical research | https://www.openphilanthropy.org/grants/berkeley-existential-risk-initiative-general-support/ | Intended use of funds (category): Organizational general support Intended use of funds: The grant page says: "BERI seeks to reduce existential risks to humanity, and collaborates with other long-termist organizations, including the Center for Human-Compatible AI at UC Berkeley. This funding is intended to help BERI establish new collaborations." Donor retrospective of the donation: The followup grant https://www.openphilanthropy.org/grants/berkeley-existential-risk-initiative-general-support-2/ suggests continued satisfaction with the grantee. |
Open Philanthropy | Ought | 1,593,333.00 | AI safety/technical research | https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/ought-general-support-2020 | Donation process: The grant was recommended by the Committee for Effective Altruism Support following its process https://www.openphilanthropy.org/committee-effective-altruism-support Intended use of funds (category): Organizational general support Intended use of funds: The grant page says: "Ought conducts research on factored cognition, which we consider relevant to AI alignment and to reducing potential risks from advanced artificial intelligence." Donor reason for selecting the donee: The grant page says "we see the basic pros and cons of this support similarly to what we’ve presented in past writeups on the matter" Donor reason for donating that amount (rather than a bigger or smaller amount): The amount is decided by the Committee for Effective Altruism Support https://www.openphilanthropy.org/committee-effective-altruism-support but individual votes and reasoning are not public. Three other grants decided by CEAS at around the same time are: Machine Intelligence Research Institute ($7,703,750), Centre for Effective Altruism ($4,146,795), and 80,000 Hours ($3,457,284). Donor reason for donating at this time (rather than earlier or later): Reasons for timing are not discussed, but this is likely the time when the Committee for Effective Altruism Support does its 2020 allocation Other notes: Announced: 2020-02-14. |
Open Philanthropy | Centre for Effective Altruism | 4,146,795.00 | Effective altruism/movement growth | https://www.openphilanthropy.org/giving/grants/centre-effective-altruism-general-support-and-community-building-grants-2020 | Donation process: The exact amount to donate was determined by the Committee for Effective Altruism Support using the process described at https://www.openphilanthropy.org/committee-effective-altruism-support Intended use of funds (category): Organizational general support Intended use of funds: Grant for a mix of organizational general support and supporting the Effective Altruism Community Building Grants program operated by CEA Donor reason for selecting the donee: Open Phil's grant writeup says: "we see the basic pros and cons of this support similarly to what we’ve presented in past writeups on the matter" and links to the September 2019 support https://www.openphilanthropy.org/giving/grants/centre-effective-altruism-general-support-and-community-building-grants-2019 that had the same intended use of funds (general support + Community Building Grants) Donor reason for donating that amount (rather than a bigger or smaller amount): The amount is decided by the Committee for Effective Altruism Support (CEAS) https://www.openphilanthropy.org/committee-effective-altruism-support but individual votes and reasoning are not public. Three other grants decided by CEAS at around the same time are: Machine Intelligence Research Institute ($7,703,750), 80,000 Hours ($3,457,284), and Ought ($1,593,333). Donor reason for donating at this time (rather than earlier or later): Reasons for timing are not discussed, but this is likely the time when the Committee for Effective Altruism Support does its 2020 allocation Other notes: Announced: 2020-03-09. |
Open Philanthropy | Centre for Effective Altruism | 1,755,921.00 | Effective altruism/movement growth | https://www.openphilanthropy.org/giving/grants/centre-effective-altruism-general-support-and-community-building-grants-2019 | Donation process: The exact amount to donate was determined by the Committee for Effective Altruism Support using the process described at https://www.openphilanthropy.org/committee-effective-altruism-support Intended use of funds (category): Organizational general support Intended use of funds: Grant for a mix of organizational general support and supporting the Effective Altruism Community Building Grants program operated by CEA Donor reason for selecting the donee: Open Phil's grant writeup says: "we see the basic pros and cons of this support similarly to what we’ve presented in past writeups on the matter" and links to the February 2019 support https://www.openphilanthropy.org/giving/grants/centre-effective-altruism-general-support-2019 Donor reason for donating that amount (rather than a bigger or smaller amount): The amount is decided by the Committee for Effective Altruism Support (CEAS) https://www.openphilanthropy.org/committee-effective-altruism-support but individual votes and reasoning are not public. No other grants seem to have been decided by CEAS close in time to this grant Other notes: Announced: 2019-11-08. |
Open Philanthropy | Effective Altruism Foundation | 1,000,000.00 | Effective altruism | https://www.openphilanthropy.org/giving/grants/effective-altruism-foundation-research-operations | Intended use of funds (category): Direct project expenses Intended use of funds: Grant "to support research and operations" Donor reason for selecting the donee: The grant page says "A major purpose of this grant is to encourage and support EAF and our other grantees in the space in taking approaches to longtermism with greater emphasis on shared objectives between different value systems. We conceive of this grant as falling under our work aimed at growing and supporting the EA community." Earlier in the document, past reservations that Open Phil has had about EAF are described: "EAF is an organization whose values put a particular emphasis on trying to reduce the risks of future suffering. While preventing suffering is a value we share, we also believe that the speculative and suffering-focused nature of this work means that it needs to be communicated about carefully, and could be counterproductive otherwise. As a result, we have felt ambivalent about EAF’s work to date (despite feeling unambiguously positively about some of their projects)." Other notes: The grant would be discussed further by Simon Knutsson in his critical post https://www.simonknutsson.com/problems-in-effective-altruism-and-existential-risk-and-what-to-do-about-them/ that also includes discussion of guidelines that Nick Beckstead of the Open Philanthropy Project developed, and that EAF was now adopting and encouraging others to adopt. Knutsson sees the adoption of the guidelines as being linked to the grant money, due to both the timing matching and the language of the grant page. On separate pages, Knutsson publishes correspondence between him and people at Open Phil and EAF where he tried to get more specific information from the two organizations: https://www.simonknutsson.com/e-mail-exchange-with-the-open-philanthropy-project and https://www.simonknutsson.com/message-exchange-with-eaf/. Intended funding timeframe in months: 24; announced: 2019-07-30. |
Open Philanthropy | Ethan Alley | 437,800.00 | Global catastrophic risks | https://www.openphilanthropy.org/focus/global-catastrophic-risks/scholarship-support-2019 | Intended use of funds (category): Living expenses during project Intended use of funds: The grant page says the grant is "over four years in scholarship funds support to Ethan Alley to pursue a PhD at the Massachusetts Institute of Technology. The funding is intended to be used for his tuition, fees, healthcare, and a living stipend during his degree program." Donor reason for selecting the donee: The grant page says the grant "is part of an effort to support value-aligned and qualified early-career researchers interested in global catastrophic risks." Donor reason for donating that amount (rather than a bigger or smaller amount): The grant page says: "The funding is intended to be used for his tuition, fees, healthcare, and a living stipend during his degree program [over four years]" so the amount is likely determined based on the sum of the costs of these over four years Donor reason for donating at this time (rather than earlier or later): Likely determined by the start time of the grantee's PhD program Intended funding timeframe in months: 48 Donor thoughts on making further donations to the donee: The grant page calls the grant "part of an effort to support value-aligned and qualified early-career researchers interested in global catastrophic risks" so it will likely be followed by other similar grants to other researchers Other notes: Announced: 2019-07-18. |
Open Philanthropy | Altruistic Technology Labs | 440,525.00 | Biosecurity and pandemic preparedness | https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/altruistic-technology-labs-biological-risk-prevention | Intended use of funds (category): Organizational general support Intended use of funds: The grantee, "AltLabs", a new organization, intends to use these funds to hire initial staff and pursue various research projects related to catastrophic risk reduction, including machine-learning-based attribution of engineered DNA and broad-spectrum infectious disease diagnostics. Other notes: Announced: 2019-07-18. |
Open Philanthropy | Tampere University | 15,000.00 | Global catastrophic risks | https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/tampere-university-2019 | Donation process: Discretionary grant Intended use of funds (category): Direct project expenses Intended use of funds: Grant "to support Professor Hiski Haukkala’s efforts related to global catastrophic risks. Haukkala, a Finnish professor of international relations, plans to use the funding to bring speakers to Finland to discuss existential risks, to attend events related to existential risks, and to support networking and related projects." Other notes: Announced: 2019-06-07. |
Open Philanthropy | MIT Media Lab | 1,000,000.00 | Global catastrophic risks, Global health, Animal welfare | https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/massachusetts-institute-technology-media-lab-kevin-esvelt | Intended use of funds (category): Direct project expenses Intended use of funds: Grant over two years to the MIT Media Lab to support the research of Professor Kevin Esvelt. Professor Esvelt plans to use this funding to conduct research on global catastrophic risks, global health, and animal welfare. Other notes: Intended funding timeframe in months: 24; announced: 2019-06-26. |
Open Philanthropy | 80,000 Hours | 4,795,803.00 | Effective altruism/movement growth/career counseling | https://www.openphilanthropy.org/giving/grants/80000-hours-general-support-2019 | Donation process: The decision of whether to donate seems to have followed the Open Philanthropy Project's usual process, but the exact amount to donate was determined by the Committee for Effective Altruism Support using the process described at https://www.openphilanthropy.org/committee-effective-altruism-support Intended use of funds (category): Organizational general support Intended use of funds: 80,000 Hours aims to solve skill bottlenecks for career paths in what it considers to be the world’s most pressing problems. It does this by providing online research, in-person advice, and support with the goal of helping talented graduates age 20-40 enter high-impact careers. Donor reason for selecting the donee: Open Phil's grant writeup says: "we see the basic pros and cons of this support similarly to what we’ve presented in past writeups on the matter" and links to the February 2018 support https://www.openphilanthropy.org/giving/grants/80000-hours-general-support-2018 Donor reason for donating that amount (rather than a bigger or smaller amount): The amount is decided by the Committee for Effective Altruism Support https://www.openphilanthropy.org/committee-effective-altruism-support but individual votes and reasoning are not public. Two other grants with amounts decided by the Committee for Effective Altruism Support, made at the same time and therefore likely drawing from the same money pot, are to the Machine Intelligence Research Institute ($2,112,500) and Centre for Effective Altruism ($2,756,250). Donor reason for donating at this time (rather than earlier or later): Reasons for timing are not discussed, but likely include (1) It is about a year since the last grant to 80,000 Hours, and the grants are generally expected to last a year, so a renewal is due, (2) The Committee for Effective Altruism Support https://www.openphilanthropy.org/committee-effective-altruism-support did its first round of money allocation, so the timing is determined by the timing of that allocation round Intended funding timeframe in months: 24 Donor retrospective of the donation: The February 2020 grant https://www.openphilanthropy.org/giving/grants/80000-hours-general-support-2020 with very similar reasoning suggests that the Open Philanthropy Project and Committee for Effective Altruism Support would continue to stand by the reasoning behind the grant Other notes: Announced: 2019-03-28. |
Open Philanthropy | Centre for Effective Altruism | 2,756,250.00 | Effective altruism/movement growth | https://www.openphilanthropy.org/giving/grants/centre-effective-altruism-general-support-2019 | Donation process: The exact amount to donate was determined by the Committee for Effective Altruism Support using the process described at https://www.openphilanthropy.org/committee-effective-altruism-support Intended use of funds (category): Organizational general support Intended use of funds: The grant writeup says: "CEA is a central organization within the effective altruism (EA) community that engages in a variety of activities aimed at helping the EA community." Donor reason for selecting the donee: Open Phil's grant writeup says: "we see the basic pros and cons of this support similarly to what we’ve presented in past writeups on the matter" Donor reason for donating that amount (rather than a bigger or smaller amount): The amount is decided by the Committee for Effective Altruism Support https://www.openphilanthropy.org/committee-effective-altruism-support but individual votes and reasoning are not public. Two other grants with amounts decided by the Committee for Effective Altruism Support, made at the same time and therefore likely drawing from the same money pot, are to the Machine Intelligence Research Institute ($2,112,500) and 80,000 Hours ($4,795,803). Donor reason for donating at this time (rather than earlier or later): Reasons for timing are not discussed, but likely include (1) It is about a year since the last grant to the Centre for Effective Altruism, and the grants are generally expected to last a year, so a renewal is due, (2) The Committee for Effective Altruism Support https://www.openphilanthropy.org/committee-effective-altruism-support did its first round of money allocation, so the timing is determined by the timing of that allocation round Intended funding timeframe in months: 24 Donor retrospective of the donation: The followup September 2019 grant https://www.openphilanthropy.org/giving/grants/centre-effective-altruism-general-support-and-community-building-grants-2019 and January 2020 grant https://www.openphilanthropy.org/giving/grants/centre-effective-altruism-general-support-and-community-building-grants-2020 suggest that Open Phil would continue to stand behind the reasoning for this grant, and in fact, that it would consider the original grant amount inadequate for the grantee Other notes: Announced: 2019-04-18. |
Open Philanthropy | Machine Intelligence Research Institute | 2,652,500.00 | AI safety/technical research | https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support-2019 | Donation process: The decision of whether to donate seems to have followed the Open Philanthropy Project's usual process, but the exact amount to donate was determined by the Committee for Effective Altruism Support using the process described at https://www.openphilanthropy.org/committee-effective-altruism-support Intended use of funds (category): Organizational general support Intended use of funds: MIRI plans to use these funds for ongoing research and activities related to AI safety. Planned activities include alignment research, a summer fellows program, computer scientist workshops, and internship programs. Donor reason for selecting the donee: The grant page says: "we see the basic pros and cons of this support similarly to what we’ve presented in past writeups on the matter" Past writeups include the grant pages for the October 2017 three-year support https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support-2017 and the August 2016 one-year support https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support Donor reason for donating that amount (rather than a bigger or smaller amount): Amount decided by the Committee for Effective Altruism Support (CEAS) https://www.openphilanthropy.org/committee-effective-altruism-support but individual votes and reasoning are not public. 
Two other grants with amounts decided by CEAS, made at the same time and therefore likely drawing from the same money pot, are to the Centre for Effective Altruism ($2,756,250) and 80,000 Hours ($4,795,803). The original amount of $2,112,500 is split across two years, and therefore ~$1.06 million per year. https://intelligence.org/2019/04/01/new-grants-open-phil-beri/ clarifies that the amount for 2019 is on top of the third year of three-year $1.25 million/year support announced in October 2017, and the total $2.31 million represents Open Phil's full intended funding for MIRI for 2019, but the amount for 2020 of ~$1.06 million is a lower bound, and Open Phil may grant more for 2020 later. In November 2019, additional funding would bring the total award amount to $2,652,500. Donor reason for donating at this time (rather than earlier or later): Reasons for timing are not discussed, but likely reasons include: (1) The original three-year funding period https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support-2017 is coming to an end, (2) Even though there is time before the funding period ends, MIRI has grown in budget and achievements, so a suitable funding amount could be larger, (3) The Committee for Effective Altruism Support https://www.openphilanthropy.org/committee-effective-altruism-support did its first round of money allocation, so the timing is determined by the timing of that allocation round. Intended funding timeframe in months: 24 Donor thoughts on making further donations to the donee: According to https://intelligence.org/2019/04/01/new-grants-open-phil-beri/ Open Phil may increase its level of support for 2020 beyond the ~$1.06 million that is part of this grant. 
Donor retrospective of the donation: The much larger followup grant https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support-2020 with a very similar writeup suggests that Open Phil and the Committee for Effective Altruism Support continued to stand by the reasoning for the grant. Other notes: The grantee, MIRI, discusses the grant on its website at https://intelligence.org/2019/04/01/new-grants-open-phil-beri/ along with a $600,000 grant from the Berkeley Existential Risk Initiative. Announced: 2019-04-01. | |
Open Philanthropy | Center for International Security and Cooperation | 1,625,000.00 | Biosecurity and pandemic preparedness | https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/center-international-security-and-cooperation-biosecurity-research-2019 | Grant over three years to Stanford University’s Center for International Security and Cooperation (CISAC) to support Megan Palmer’s work on biosecurity. This research is focused on developing ways to improve governance of biological science and to reduce the risk of misuse of advanced biotechnology. This funding is intended to allow Dr. Palmer to continue and extend a study of the attitudes of participants in the International Genetically Engineered Machine (iGEM) competition, to better understand how institutional environments, safety practices, or competition incentives might motivate young scientists and engineers. The grant is a renewal of the October 2016 grant https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/center-international-security-and-cooperation-biosecurity-research. Announced: 2019-02-12. | |
Open Philanthropy | University of Sydney | 32,621.00 | Biosecurity and pandemic preparedness | https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/university-of-sydney-global-health-security-conference | Grant of $45,000 AUD ($32,620.50 at the time of conversion) to the University of Sydney to support the 2019 Global Health Security Conference in Sydney, Australia. The funds are intended for general support of the conference, and to support travel bursaries to allow participants from low-income countries to attend a gathering of the global health security community, including academics, policymakers, and practitioners. Announced: 2019-01-17. | |
Open Philanthropy | Biosecure Ltd | 25,000.00 | Biosecurity and pandemic preparedness | https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/biosecure-campaign-against-bioweapons-research | Donation process: Discretionary grant structured through a contractor agreement. The grant page says: "While we do not typically publish pages for contractor agreements, we chose to write about this funding because we view it as conceptually similar to an ordinary grant, despite its structure as a contract due to the recipient’s organizational form." Intended use of funds (category): Direct project expenses Intended use of funds: Grantee "intends to use these funds to explore different models for strengthening the societal norm against biological weapons and reducing the likelihood of an arms race involving biological weapons, as well as investigating the feasibility, costs, and potential benefits of the various models." Other notes: Announced: 2019-06-07. | |
Open Philanthropy | International Genetically Engineered Machine Foundation | 420,000.00 | Biosecurity and pandemic preparedness | https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/igem-synthetic-biology-safety-and-security-2018 | Grant over two years to the International Genetically Engineered Machine (iGEM) Foundation for its work on safety and security, led by Piers Millett. iGEM is an international synthetic biology competition for students. Donor believes that supporting iGEM’s safety and security work could help raise awareness about biosecurity among current and future synthetic biologists. Renewal of May 2016 grant https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/igem-synthetic-biology-safety-and-security. Announced: 2019-01-31. | |
Open Philanthropy | Nuclear Threat Initiative | 1,904,942.00 | Biosecurity and pandemic preparedness | https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/nuclear-threat-initiative-projects-to-reduce-global-catastrophic-biological-risks | Intended use of funds (category): Direct project expenses Intended use of funds: Grant "to support projects to reduce Global Catastrophic Biological Risks (GCBRs). NTI intends to use these funds to support projects including, among others, strengthening the Biological Weapons Convention and reducing state biological threats and additional GCBRs through international dialogues." Other notes: Intended funding timeframe in months: 36; announced: 2018-12-13. | |
Open Philanthropy | University of Oxford | 26,086.00 | Biosecurity and pandemic preparedness | https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/oxford-university-dphil-support-for-andrew-snyder-beattie | Intended use of funds (category): Direct project expenses Intended use of funds: Grant "to support the research of the Mathematical Ecology Research Group and the research costs of Andrew Snyder-Beattie, who recently served as Director of Research at the Future of Humanity Institute and a member of FHI’s Biotechnology Research Team." Donor retrospective of the donation: Andrew Snyder-Beattie, whose work the grant funded, would later be hired by Open Philanthropy to lead its biosecurity and pandemic preparedness program, suggesting continued confidence in his work. Other notes: Currency info: donation given as 20,000.00 GBP (conversion done via donor calculation); announced: 2018-10-29. | |
Open Philanthropy | Center for a New American Security | 400,352.00 | Global catastrophic risks | https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/center-for-a-new-american-security-richard-danzig-outreach-on-technological-risk-2018 | Grant to support outreach by Richard Danzig, former Secretary of the Navy, on technological risks. This is a renewal and expansion of the August 2017 grant https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/center-for-a-new-american-security-richard-danzig-outreach-on-technological-risk#footnote1_ix4f0ts which allowed Dr. Danzig to produce Technology Roulette https://www.cnas.org/publications/reports/technology-roulette, a report intended for the national security community detailing the management of risks from losing control of advanced technology. Dr. Danzig intends to use these new funds to continue sharing these ideas with U.S. government officials, as well as spreading them to national security leaders abroad. Announced: 2018-10-20. | |
Open Philanthropy | Machine Intelligence Research Institute | 150,000.00 | AI safety | https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-ai-safety-retraining-program | Donation process: The grant is a discretionary grant, so the approval process is short-circuited; see https://www.openphilanthropy.org/giving/grants/discretionary-grants for more Intended use of funds (category): Direct project expenses Intended use of funds: Grant to support the artificial intelligence safety retraining project. MIRI intends to use these funds to provide stipends, structure, and guidance to promising computer programmers and other technically proficient individuals who are considering transitioning their careers to focus on potential risks from advanced artificial intelligence. MIRI believes the stipends will make it easier for aligned individuals to leave their jobs and focus full-time on safety. MIRI expects the transition periods to range from three to six months per individual. The MIRI blog post https://intelligence.org/2018/09/01/summer-miri-updates/ says: "Buck [Shlegeris] is currently selecting candidates for the program; to date, we’ve made two grants to individuals." Other notes: The grant is mentioned by MIRI in https://intelligence.org/2018/09/01/summer-miri-updates/. Announced: 2018-06-27. | |
Open Philanthropy | Early-Career Funding for Global Catastrophic Biological Risks | 570,000.00 | Biosecurity and pandemic preparedness | https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/early-career-funding-global-catastrophic-biological-risks | Total flexible support over three years to enable five early-career people to pursue work and study related to global catastrophic biological risks. The original grant amount was $515,000; an additional $55,000 was added in October 2018. Announced: 2018-08-24. | |
Open Philanthropy | David Manheim | 65,308.00 | Biosecurity and pandemic preparedness | https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/david-manheim-research-existential-risk | Grant to perform a research and analysis project, "Eliciting Evaluations of Existential Risk from Infectious Disease." Announced: 2018-01-30. | |
Open Philanthropy | The Degrees Initiative | 2,000,000.00 | Climate change/geoengineering/solar radiation management | https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/solar-radiation-management-governance-initiative-general-support-2017 | Intended use of funds (category): Organizational general support Intended use of funds: The grant page says: "The funding is intended to help support SRMGI’s on-going governance work related to solar radiation management (SRM), and will additionally help support a new research fund for modeling the impacts of SRM across the developing world, called the Developing Country Impacts Modeling Analysis for SRM (DECIMALS)." Donor reason for selecting the donee: The grant page says: "In short, our impression is that there is globally very little work focused on improving the governance of potential SRM implementation, and we therefore consider funding SRMGI as an opportunity to build a sparse field. Additionally, we are inclined to agree with SRMGI’s case that it is important to increase the engagement of experts in the developing world with SRM governance issues." It links to the previous grant page https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/solar-radiation-management-governance-initiative-general-support for more details. Donor retrospective of the donation: The followup grant https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/degrees-initiative-general-support-2021 suggests continued satisfaction with the grantee. Other notes: Grant via the Environmental Defense Fund. At the time, the grantee is called the Solar Radiation Management Governance Initiative. Intended funding timeframe in months: 36; announced: 2017-10-09. | |
Open Philanthropy | New Partnership for Africa’s Development | 2,350,000.00 | Scientific research/malaria/gene drive | https://www.openphilanthropy.org/focus/scientific-research/miscellaneous/new-partnership-africa-s-development-general-support | Grant to the Planning and Coordinating Agency, the technical arm of the African Union, to support the evaluation, preparation, and potential deployment of gene drive technologies in some African regions. Part of a set of grants related to gene drives; see https://www.openphilanthropy.org/focus/scientific-research/miscellaneous/target-malaria-general-support for a larger grant to Target Malaria in the same domain and at around the same time. Announced: 2017-05-26. | |
Open Philanthropy | Harvard University | 2,500,000.00 | Climate change/geoengineering/solar radiation management | https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/harvard-university-solar-geoengineering-research-program | Donation process: https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/harvard-university-solar-geoengineering-research-program#Our_process says: "Claire Zabel, a Research Analyst for the Open Philanthropy Project, spoke and exchanged emails with Professor [David] Keith and Dr. [Gernot] Wagner." Intended use of funds (category): Direct project expenses Intended use of funds: Grant "to support the founding of the Solar Geoengineering Research Program (SGRP) as part of the Harvard University Center for the Environment. This program will be a coordinated research effort focusing on solar geoengineering research, governance, and advocacy led by Professor David Keith and Dr. Gernot Wagner (formerly the Environmental Defense Fund’s lead senior economist)." https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/harvard-university-solar-geoengineering-research-program#Budget_and_proposed_activities lists the kinds of expenses: executive director's salary, creation of a comprehensive blueprint, outreach and convening, advancing science and technology, assessing efficacy and risks, governance and social implications, and Harvard-wide faculty grants. Donor reason for selecting the donee: https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/harvard-university-solar-geoengineering-research-program#Case_for_the_grant says: "We see this grant as a good opportunity to build and advance the field of solar geoengineering, especially given SGRP’s emphasis on cooperation and collaboration between researchers and on developing solar geoengineering technology in the manner that is most likely to affect the world positively. [...] 
we consider Professor Keith to be a top scientist who is relatively aligned with us in terms of being pragmatic, cognizant of tradeoffs, and focused on global rather than national interests. It seems to us that earlier and more research on solar geoengineering will make it more likely that the global community will have an in-depth understanding of technological options and risks in the event that climate engineering is seriously considered as an approach to reducing harms from climate change at some point in the future." Donor reason for donating at this time (rather than earlier or later): Timing determined by the founding of the Solar Geoengineering Research Program. The founding is not solely funded by Open Philanthropy, so Open Philanthropy didn't determine the timing; the grant page says: "Other founding funders include Bill Gates, the Hewlett Foundation, and the Alfred P. Sloan Foundation." Intended funding timeframe in months: 60 Other notes: https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/harvard-university-solar-geoengineering-research-program#Risks_and_reservations lists risks and reservations, both specific to the grant and general to geoengineering. Announced: 2017-04-14. |
Donee | Donors influenced | Cause area | Metadata | Total | 2021 | 2020 | 2019 | 2018 | 2017 | 2016 |
---|---|---|---|---|---|---|---|---|---|---|
Machine Intelligence Research Institute | Open Philanthropy (filter this donor) | AI safety | FB Tw WP Site CN GS TW | 10,506,250.00 | 0.00 | 7,703,750.00 | 2,652,500.00 | 150,000.00 | 0.00 | 0.00 |
Centre for Effective Altruism | Open Philanthropy (filter this donor) | Effective altruism/movement growth | FB Site | 8,658,966.00 | 0.00 | 4,146,795.00 | 4,512,171.00 | 0.00 | 0.00 | 0.00 |
80,000 Hours | Open Philanthropy (filter this donor) | Career coaching/life guidance | FB Tw WP Site | 8,253,087.00 | 0.00 | 3,457,284.00 | 4,795,803.00 | 0.00 | 0.00 | 0.00 |
Harvard University | Open Philanthropy (filter this donor) | Climate change/geoengineering/solar radiation management | FB Tw WP Site | 2,500,000.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 2,500,000.00 |
New Partnership for Africa’s Development | Open Philanthropy (filter this donor) | Scientific research/malaria/gene drive | | 2,350,000.00 | 0.00 | 0.00 | 0.00 | 0.00 | 2,350,000.00 | 0.00 |
The Degrees Initiative | Open Philanthropy (filter this donor) | Climate change/geoengineering/solar radiation management | | 2,000,000.00 | 0.00 | 0.00 | 0.00 | 0.00 | 2,000,000.00 | 0.00 |
Nuclear Threat Initiative | Open Philanthropy (filter this donor) | Biosecurity and pandemic preparedness | FB Tw WP Site | 1,904,942.00 | 0.00 | 0.00 | 0.00 | 1,904,942.00 | 0.00 | 0.00 |
Center for International Security and Cooperation | Open Philanthropy (filter this donor) | Biosecurity and pandemic preparedness | WP | 1,625,000.00 | 0.00 | 0.00 | 1,625,000.00 | 0.00 | 0.00 | 0.00 |
Ought | Open Philanthropy (filter this donor) | AI safety | Site | 1,593,333.00 | 0.00 | 1,593,333.00 | 0.00 | 0.00 | 0.00 | 0.00 |
Effective Altruism Foundation | Open Philanthropy (filter this donor) | Effective altruism/movement growth | FB Tw Site | 1,000,000.00 | 0.00 | 0.00 | 1,000,000.00 | 0.00 | 0.00 | 0.00 |
MIT Media Lab | Open Philanthropy (filter this donor) | | | 1,000,000.00 | 0.00 | 0.00 | 1,000,000.00 | 0.00 | 0.00 | 0.00 |
Early-Career Funding for Global Catastrophic Biological Risks | Open Philanthropy (filter this donor) | Biosecurity and pandemic preparedness | | 570,000.00 | 0.00 | 0.00 | 0.00 | 570,000.00 | 0.00 | 0.00 |
Effective Altruism Funds: Effective Altruism Infrastructure Fund | Open Philanthropy (filter this donor) | Effective altruism | | 500,000.00 | 500,000.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 |
Centre for the Governance of AI | Open Philanthropy (filter this donor) | | | 450,000.00 | 0.00 | 450,000.00 | 0.00 | 0.00 | 0.00 | 0.00 |
Altruistic Technology Labs | Open Philanthropy (filter this donor) | | | 440,525.00 | 0.00 | 0.00 | 440,525.00 | 0.00 | 0.00 | 0.00 |
Ethan Alley | Open Philanthropy (filter this donor) | | | 437,800.00 | 0.00 | 0.00 | 437,800.00 | 0.00 | 0.00 | 0.00 |
International Genetically Engineered Machine Foundation | Open Philanthropy (filter this donor) | Biosecurity and pandemic preparedness | | 420,000.00 | 0.00 | 0.00 | 0.00 | 420,000.00 | 0.00 | 0.00 |
Center for a New American Security | Open Philanthropy (filter this donor) | Global catastrophic risks | | 400,352.00 | 0.00 | 0.00 | 0.00 | 400,352.00 | 0.00 | 0.00 |
Center for Applied Rationality | Open Philanthropy (filter this donor) | Rationality | FB Tw WP Site TW | 375,000.00 | 0.00 | 375,000.00 | 0.00 | 0.00 | 0.00 | 0.00 |
Berkeley Existential Risk Initiative | Open Philanthropy (filter this donor) | AI safety/other global catastrophic risks | Site TW | 360,000.00 | 210,000.00 | 150,000.00 | 0.00 | 0.00 | 0.00 | 0.00 |
David Manheim | Open Philanthropy (filter this donor) | Biosecurity and pandemic preparedness | | 65,308.00 | 0.00 | 0.00 | 0.00 | 0.00 | 65,308.00 | 0.00 |
University of Sydney | Open Philanthropy (filter this donor) | Biosecurity and pandemic preparedness | | 32,621.00 | 0.00 | 0.00 | 0.00 | 32,621.00 | 0.00 | 0.00 |
University of Oxford | Open Philanthropy (filter this donor) | Biosecurity and pandemic preparedness | FB Tw WP Site | 26,086.00 | 0.00 | 0.00 | 0.00 | 26,086.00 | 0.00 | 0.00 |
Biosecure Ltd | Open Philanthropy (filter this donor) | Biosecurity and pandemic preparedness | | 25,000.00 | 0.00 | 0.00 | 0.00 | 25,000.00 | 0.00 | 0.00 |
Tampere University | Open Philanthropy (filter this donor) | | | 15,000.00 | 0.00 | 0.00 | 15,000.00 | 0.00 | 0.00 | 0.00 |
Total | -- | -- | -- | 45,509,270.00 | 710,000.00 | 17,876,162.00 | 16,478,799.00 | 3,529,001.00 | 4,415,308.00 | 2,500,000.00 |
Graph of spending by donee and year (incremental, not cumulative)
Graph of spending by donee and year (cumulative)