This is an online portal with information on donations that were announced publicly (or have been shared with permission) that were of interest to Vipul Naik. The git repository with the code for this portal, as well as all the underlying data, is available on GitHub. All payment amounts are in current United States dollars (USD).

The repository of donations is being seeded with an initial collation by Issa Rice, as well as continued contributions from him (see his commits and the contract work page listing all financially compensated contributions to the site), but all responsibility for errors and inaccuracies belongs to Vipul Naik. Current data is preliminary and has not been completely vetted and normalized; if sharing a link to this site or any page on this site, please include the caveat that the data is preliminary (if you want to share without including caveats, please check with Vipul Naik). We expect to have completed the first round of development by the end of July 2025. See the about page for more details.

Also of interest: pageview data on analytics.vipulnaik.com, the tutorial in the README, and the request for feedback on the EA Forum.
Item | Value |
---|---|
Country | -- |
Wikipedia page | https://en.wikipedia.org/wiki/Center_for_International_Security_and_Cooperation |
Cause area | Count | Median | Mean | Minimum | 10th percentile | 20th percentile | 30th percentile | 40th percentile | 50th percentile | 60th percentile | 70th percentile | 80th percentile | 90th percentile | Maximum |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
Overall | 3 | 643,415 | 778,472 | 67,000 | 67,000 | 67,000 | 67,000 | 643,415 | 643,415 | 643,415 | 1,625,000 | 1,625,000 | 1,625,000 | 1,625,000 |
AI safety | 1 | 67,000 | 67,000 | 67,000 | 67,000 | 67,000 | 67,000 | 67,000 | 67,000 | 67,000 | 67,000 | 67,000 | 67,000 | 67,000 |
Biosecurity and pandemic preparedness | 2 | 643,415 | 1,134,208 | 643,415 | 643,415 | 643,415 | 643,415 | 643,415 | 643,415 | 1,625,000 | 1,625,000 | 1,625,000 | 1,625,000 | 1,625,000 |
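The Median and percentile columns above appear to follow the nearest-rank method: for n amounts sorted ascending, the p-th percentile is the value at 1-based index ceil(p/100 × n). That would explain why the median shown for the two biosecurity donations is 643,415 (the lower of the two amounts) rather than their midpoint. A minimal sketch under that assumption (the method is inferred from the numbers, not documented on this page):

```python
import math

def nearest_rank_percentile(sorted_amounts, p):
    """Nearest-rank percentile: value at 1-based index ceil(p/100 * n)."""
    n = len(sorted_amounts)
    rank = max(1, math.ceil(p / 100 * n))
    return sorted_amounts[rank - 1]

# The three CISAC donations listed on this page (current USD).
amounts = sorted([67_000, 643_415, 1_625_000])

print("Count:", len(amounts))                       # 3
print("Mean:", round(sum(amounts) / len(amounts)))  # 778472
for p in (10, 30, 50, 70, 90):
    print(f"{p}th percentile:", nearest_rank_percentile(amounts, p))
# 10th/30th -> 67000; 50th (the Median) -> 643415; 70th/90th -> 1625000
```

The same function reproduces the biosecurity row: with [643,415, 1,625,000], the 50th percentile is 643,415 and the 60th through 90th percentiles are 1,625,000, matching the table.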
Donor | Total | 2020 | 2019 | 2016 |
---|---|---|---|---|
Open Philanthropy | 2,335,415.00 | 67,000.00 | 1,625,000.00 | 643,415.00
Total | 2,335,415.00 | 67,000.00 | 1,625,000.00 | 643,415.00 |
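As a quick consistency check on the cross-tabulation above (there is a single donor here, so the donor row and the Total row coincide), the per-year amounts sum to the stated total; a minimal sketch:

```python
# Per-year amounts for Open Philanthropy from the table above (current USD).
by_year = {2020: 67_000.00, 2019: 1_625_000.00, 2016: 643_415.00}

total = sum(by_year.values())
assert total == 2_335_415.00  # matches the Total column
print(f"{total:,.2f}")        # 2,335,415.00
```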
Title (URL linked) | Publication date | Author | Publisher | Affected donors | Affected donees | Affected influencers | Document scope | Cause area | Notes |
---|---|---|---|---|---|---|---|---|---|
Suggestions for Individual Donors from Open Philanthropy Project Staff - 2017 | 2017-12-21 | Holden Karnofsky | Open Philanthropy | Jaime Yassif, Chloe Cockburn, Lewis Bollard, Nick Beckstead, Daniel Dewey | Center for International Security and Cooperation, Johns Hopkins Center for Health Security, Good Call, Court Watch NOLA, Compassion in World Farming USA, Wild-Animal Suffering Research, Effective Altruism Funds, Donor lottery, Future of Humanity Institute, Center for Human-Compatible AI, Machine Intelligence Research Institute, Berkeley Existential Risk Initiative, Centre for Effective Altruism, 80,000 Hours, Alliance to Feed the Earth in Disasters | | Donation suggestion list | Animal welfare, AI safety, Biosecurity and pandemic preparedness, Effective altruism, Criminal justice reform | Open Philanthropy Project staff give suggestions on places that might be good for individuals to donate to. Each suggestion includes a section "Why I suggest it", a section explaining why the Open Philanthropy Project has not funded (or not fully funded) the opportunity, and links to relevant writeups.
Catastrophic Global Risks: A Silicon Valley Funder Thinks the Unthinkable | 2016-11-30 | Sue-Lynn Moses | Inside Philanthropy | Open Philanthropy | Center for International Security and Cooperation | | Third-party coverage of donor strategy | Biosecurity and pandemic preparedness | A discussion of the overall work done by the Open Philanthropy Project on global catastrophic risks, with a particular focus on biosecurity. Comparisons are made with the Skoll Global Threats Fund, and the historical work of the Rockefeller Foundation in disease surveillance (from which it recently withdrew) is referenced.
Open Philanthropy Project: Grants for Global Security | | | Inside Philanthropy | Open Philanthropy | Center for International Security and Cooperation | | Third-party coverage of donor strategy | Biosecurity and pandemic preparedness | An overview by Inside Philanthropy of the Open Philanthropy Project and its work on biosecurity grants.
[Graph of top 10 donors (for donations with known year of donation) by amount, showing the timeframe of donations]
Donor | Amount (current USD) | Amount rank (out of 3) | Donation date | Cause area | URL | Influencer | Notes |
---|---|---|---|---|---|---|---|
Open Philanthropy | 67,000.00 | 3 | 2020 | AI safety/strategy | https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/center-for-international-security-and-cooperation-ai-accident-risk-and-technology-competition | Luke Muehlhauser | Intended use of funds (category): Direct project expenses. Intended use of funds: Grant "to explore possible projects related to AI accident risk in the context of technology competition." Donor reason for selecting the donee: No specific reasons are provided, but two other grants, https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/center-for-strategic-and-international-studies-ai-accident-risk-and-technology-competition and https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/rice-hadley-gates-manuel-ai-risk, made at about the same time for the same intended use, suggest donor interest in this particular use case. Donor reason for donating at this time (rather than earlier or later): No specific reasons are provided, but the timing of the two similar grants noted above suggests donor interest in this use case at this time.
Open Philanthropy | 1,625,000.00 | 1 | 2019 | Biosecurity and pandemic preparedness | https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/center-international-security-and-cooperation-biosecurity-research-2019 | Claire Zabel | Grant over three years to Stanford University’s Center for International Security and Cooperation (CISAC) to support Megan Palmer’s work on biosecurity. This research is focused on developing ways to improve governance of biological science and to reduce the risk of misuse of advanced biotechnology. The funding is intended to allow Dr. Palmer to continue and extend a study of the attitudes of participants in the International Genetically Engineered Machine (iGEM) competition, to better understand how institutional environments, safety practices, or competition incentives might motivate young scientists and engineers. The grant is a renewal of the October 2016 grant https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/center-international-security-and-cooperation-biosecurity-research. Announced: 2019-02-12.
Open Philanthropy | 643,415.00 | 2 | 2016 | Biosecurity and pandemic preparedness | https://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/center-international-security-and-cooperation-biosecurity-research | -- | In support of research by Megan Palmer. Her policy research focuses on developing ways to improve the governance of biological science and technology. One of the projects she intends to focus on in the next few years is a study of past, current, and future iGEM competitions, to better understand how to motivate young scientists and engineers to take biosafety and biosecurity seriously and how to instill those values in a way that lasts throughout their careers. Announced: 2016-11-03.