Berkeley Existential Risk Initiative donations made (filtered to cause areas matching AI safety)

This is an online portal with information on donations that were announced publicly (or shared with permission) and that are of interest to Vipul Naik. The git repository with the code for this portal, as well as all the underlying data, is available on GitHub. All payment amounts are in current United States dollars (USD). The repository of donations is being seeded with an initial collation by Issa Rice, along with continued contributions from him (see his commits and the contract work page listing all financially compensated contributions to the site), but all responsibility for errors and inaccuracies belongs to Vipul Naik. Current data is preliminary and has not been completely vetted and normalized; if you share a link to this site or any page on it, please include the caveat that the data is preliminary (if you want to share without caveats, please check with Vipul Naik first). We expect to complete the first round of development by the end of July 2024. See the about page for more details. Also of interest: pageview data on analytics.vipulnaik.com, the tutorial in the README, and the request for feedback on the EA Forum.

Table of contents

Basic donor information

Item | Value
Country | United States
Facebook username | Berkeley-Existential-Risk-Initiative-1875638366085846
Website | http://existence.org/
Data entry method on Donations List Website | Manual (no scripts used)
Org Watch page | https://orgwatch.issarice.com/?organization=Berkeley+Existential+Risk+Initiative

This entity is also a donee.

Donor donation statistics

Cause area | Count | Median | Mean | Minimum | 10th percentile | 20th percentile | 30th percentile | 40th percentile | 50th percentile | 60th percentile | 70th percentile | 80th percentile | 90th percentile | Maximum
Overall | 33 | 50,000 | 129,031 | 1,900 | 10,000 | 20,000 | 25,000 | 50,000 | 50,000 | 100,000 | 100,000 | 200,000 | 300,000 | 800,000
AI safety | 33 | 50,000 | 129,031 | 1,900 | 10,000 | 20,000 | 25,000 | 50,000 | 50,000 | 100,000 | 100,000 | 200,000 | 300,000 | 800,000
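
For readers recomputing this table from the raw donation data, the summary row can be reproduced from the list of donation amounts. Below is a minimal sketch in Python; the nearest-rank percentile rule and the summarize helper are illustrative assumptions, not the site's actual code.

    from statistics import mean, median

    def summarize(amounts):
        """Recompute the donor statistics row from raw donation amounts (USD)."""
        xs = sorted(amounts)
        n = len(xs)

        def percentile(p):
            # Nearest-rank percentile; the site's exact interpolation rule
            # is not documented, so this choice is an assumption.
            return xs[min(n - 1, int(p * n / 100))]

        return {
            "count": n,
            "median": median(xs),
            "mean": round(mean(xs)),
            "minimum": xs[0],
            **{f"{p}th percentile": percentile(p) for p in range(10, 100, 10)},
            "maximum": xs[-1],
        }

For the 33 AI safety donations listed further down this page, summarize should approximately reproduce the row above (for example, a mean of 4,258,032 / 33 ≈ 129,031).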

Donation amounts by cause area and year

If you hover over a cell for a given cause area and year, you will get a tooltip with the number of donees and the number of donations.

Note: The cause area classification used here may not match the donor's own classification in all cases.

Cause area | Number of donations | Number of donees | Total | 2020 | 2019 | 2018 | 2017
AI safety | 33 | 23 | 4,258,032.00 | 300,000.00 | 650,000.00 | 2,838,032.00 | 470,000.00
Total | 33 | 23 | 4,258,032.00 | 300,000.00 | 650,000.00 | 2,838,032.00 | 470,000.00
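
The two pairs of graphs referenced below differ only in aggregation: the incremental view sums donations within each (cause area, year) cell, as in the table above, while the cumulative view takes a running total across years. A rough sketch of that distinction, assuming donation records are (cause_area, year, amount) tuples (an illustrative layout, not the site's actual schema):

    from collections import defaultdict
    from itertools import accumulate

    # Yearly totals for this donor's single cause area, from the table above.
    donations = [
        ("AI safety", 2017, 470000.00),
        ("AI safety", 2018, 2838032.00),
        ("AI safety", 2019, 650000.00),
        ("AI safety", 2020, 300000.00),
    ]

    # Incremental: amount donated per cause area in each year.
    incremental = defaultdict(lambda: defaultdict(float))
    for cause, year, amount in donations:
        incremental[cause][year] += amount

    # Cumulative: running total per cause area across years.
    cumulative = {}
    for cause, by_year in incremental.items():
        years = sorted(by_year)
        cumulative[cause] = dict(zip(years, accumulate(by_year[y] for y in years)))

    # cumulative["AI safety"][2020] == 4258032.00, matching the Total column.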

Graph of spending by cause area and year (incremental, not cumulative)


Graph of spending by cause area and year (cumulative)


Donation amounts by subcause area and year

If you hover over a cell for a given subcause area and year, you will get a tooltip with the number of donees and the number of donations.

For the meaning of “classified” and “unclassified”, see the page clarifying this distinction.

Subcause area | Number of donations | Number of donees | Total | 2020 | 2019 | 2018 | 2017
AI safety | 33 | 23 | 4,258,032.00 | 300,000.00 | 650,000.00 | 2,838,032.00 | 470,000.00
Classified total | 33 | 23 | 4,258,032.00 | 300,000.00 | 650,000.00 | 2,838,032.00 | 470,000.00
Unclassified total | 0 | 0 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00
Total | 33 | 23 | 4,258,032.00 | 300,000.00 | 650,000.00 | 2,838,032.00 | 470,000.00

Graph of spending by subcause area and year (incremental, not cumulative)


Graph of spending by subcause area and year (cumulative)


Donation amounts by donee and year

Donee | Cause area | Metadata | Total | 2020 | 2019 | 2018 | 2017
Center for Applied Rationality | Rationality | FB Tw WP Site TW | 1,200,000.00 | 0.00 | 0.00 | 1,100,000.00 | 100,000.00
Machine Intelligence Research Institute | AI safety | FB Tw WP Site CN GS TW | 1,100,000.00 | 300,000.00 | 600,000.00 | 0.00 | 200,000.00
Future of Life Institute | AI safety/other global catastrophic risks | FB Tw WP Site | 500,000.00 | 0.00 | 50,000.00 | 300,000.00 | 150,000.00
UC Berkeley Foundation | | | 400,000.00 | 0.00 | 0.00 | 400,000.00 | 0.00
Centre for the Study of Existential Risk | | | 200,000.00 | 0.00 | 0.00 | 200,000.00 | 0.00
Lightcone Infrastructure | Epistemic institutions | FB WP Site | 150,000.00 | 0.00 | 0.00 | 150,000.00 | 0.00
Ben Goldhaber | | | 150,000.00 | 0.00 | 0.00 | 150,000.00 | 0.00
Stephanie Zolayvar | | | 100,000.00 | 0.00 | 0.00 | 100,000.00 | 0.00
Baeo Maltinsky | | | 55,000.00 | 0.00 | 0.00 | 55,000.00 | 0.00
Colleen McKenzie | | | 51,000.00 | 0.00 | 0.00 | 51,000.00 | 0.00
Theiss Research | | | 50,000.00 | 0.00 | 0.00 | 50,000.00 | 0.00
Institute for Philosophical Research | | | 50,000.00 | 0.00 | 0.00 | 50,000.00 | 0.00
Roxanne Heston | | | 49,532.00 | 0.00 | 0.00 | 49,532.00 | 0.00
Association for the Advancement of Artificial Intelligence | | | 45,000.00 | 0.00 | 0.00 | 25,000.00 | 20,000.00
Miles Brundage | | | 32,500.00 | 0.00 | 0.00 | 32,500.00 | 0.00
Zoe Cremer | | | 25,200.00 | 0.00 | 0.00 | 25,200.00 | 0.00
David Manheim | | | 25,000.00 | 0.00 | 0.00 | 25,000.00 | 0.00
Bryce Hidysmith | | | 20,500.00 | 0.00 | 0.00 | 20,500.00 | 0.00
Luca Rade | | | 20,400.00 | 0.00 | 0.00 | 20,400.00 | 0.00
Sebastian Farquhar | | | 12,000.00 | 0.00 | 0.00 | 12,000.00 | 0.00
Cambridge in America | | | 10,000.00 | 0.00 | 0.00 | 10,000.00 | 0.00
Jessica Taylor | | | 10,000.00 | 0.00 | 0.00 | 10,000.00 | 0.00
Jordan Alexander | | | 1,900.00 | 0.00 | 0.00 | 1,900.00 | 0.00
Total | -- | -- | 4,258,032.00 | 300,000.00 | 650,000.00 | 2,838,032.00 | 470,000.00

Graph of spending by donee and year (incremental, not cumulative)


Graph of spending by donee and year (cumulative)


Donation amounts by influencer and year

Sorry, we couldn't find any influencer information.

Donation amounts by disclosures and year

Sorry, we couldn't find any disclosures information.

Donation amounts by country and year

Sorry, we couldn't find any country information.

Full list of documents in reverse chronological order (6 documents)

Title (URL linked) | Publication date | Author | Publisher | Affected donors | Affected donees | Affected influencers | Document scope | Cause area | Notes
New grants from the Open Philanthropy Project and BERI | 2019-04-01 | Rob Bensinger | Machine Intelligence Research Institute | Open Philanthropy, Berkeley Existential Risk Initiative | Machine Intelligence Research Institute | -- | Donee periodic update | AI safety | MIRI announces two grants to it, from the Open Philanthropy Project and from BERI. The Open Philanthropy Project grant https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support-2019 is a two-year grant totaling $2,112,500, with half disbursed in 2019 and the other half in 2020. The amount disbursed in 2019 (a little over $1.06 million) is on top of the $1.25 million already committed by the Open Philanthropy Project as part of the three-year $3.75 million grant https://intelligence.org/2017/11/08/major-grant-open-phil/. The $1.06 million in 2020 may be supplemented by further grants from the Open Philanthropy Project. The grant size from the Open Philanthropy Project was determined by the Committee for Effective Altruism Support, and the post notes that the Open Philanthropy Project plans to determine future grant sizes using the Committee. MIRI expects the grant money to play an important role in decision-making as it executes on growing its research team, as described in its 2018 strategy update post https://intelligence.org/2018/11/22/2018-update-our-new-research-directions/ and fundraiser post https://intelligence.org/2018/11/26/miris-2018-fundraiser/.
A model I use when making plans to reduce AI x-risk (GW, IR) | 2018-01-18 | Ben Pace | LessWrong | Berkeley Existential Risk Initiative | -- | -- | Review of current state of cause area | AI safety | The author describes his implicit model of AI risk, with four parts: (1) alignment is hard; (2) getting alignment right accounts for most of the variance in whether an AGI system will be positive for humanity; (3) our current epistemic state regarding AGI timelines will continue until we are close (less than two years) to having AGI; and (4) given timeline uncertainty, it is best to spend marginal effort on plans that assume, and work in, shorter timelines. There is a lot of discussion in the comments.
Announcing BERI Computing Grants | 2017-12-01 | Andrew Critch | Berkeley Existential Risk Initiative | Berkeley Existential Risk Initiative | Berkeley Existential Risk Initiative | -- | Donee periodic update | AI safety/other global catastrophic risks |
Forming an engineering team | 2017-10-25 | Andrew Critch | Berkeley Existential Risk Initiative | -- | Berkeley Existential Risk Initiative | -- | Donee periodic update | AI safety/other global catastrophic risks |
What we’re thinking about as we grow - ethics, oversight, and getting things done | 2017-10-19 | Andrew Critch | Berkeley Existential Risk Initiative | Berkeley Existential Risk Initiative | Berkeley Existential Risk Initiative | -- | Donee periodic update | AI safety/other global catastrophic risks | Outlines BERI's approach to growth and "ethics" (transparency, oversight, trust, etc.).
BERI's semi-annual report, August | 2017-09-12 | Rebecca Raible | Berkeley Existential Risk Initiative | Berkeley Existential Risk Initiative | Berkeley Existential Risk Initiative | -- | Donee periodic update | AI safety/other global catastrophic risks | A blog post announcing BERI's semi-annual report.

Full list of donations in reverse chronological order (33 donations)

Graph of top 10 donees (for donations with known year of donation) by amount, showing the timeframe of donations

Donee | Amount (current USD) | Amount rank (out of 33) | Donation date | Cause area | URL | Influencer | Notes
Machine Intelligence Research Institute | 300,000.00 | 4 | 2020-03-02 | AI safety | http://existence.org/grants/ | -- | Intended use of funds (category): Organizational general support

Other notes: The grant is mentioned by MIRI in the blog post https://intelligence.org/2020/04/27/miris-largest-grant-to-date/ along with a large grant https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support-2020 from the Open Philanthropy Project. The post says: "at the time of our 2019 fundraiser, we expected to receive a grant from BERI in early 2020, and incorporated this into our reserves estimates. However, we predicted the grant size would be $600k; now that we know the final grant amount, that estimate should be $300k lower."
Future of Life Institute | 50,000.00 | 17 | 2019-04-22 | AI safety | http://web.archive.org/web/20190623203105/http://existence.org/grants/ | -- |
Machine Intelligence Research Institute | 600,000.00 | 2 | 2019-02-26 | AI safety | http://existence.org/grants/ | -- | Intended use of funds (category): Organizational general support

Donor retrospective of the donation: BERI would make one further grant to MIRI, indicating continued confidence in the grantee. The follow-up grant would be in March 2020 for $300,000. By that point, BERI would have transitioned these grantmaking responsibilities to the Survival and Flourishing Fund.

Other notes: This grant is also discussed by the Machine Intelligence Research Institute (the grant recipient) at https://intelligence.org/2017/11/08/major-grant-open-phil/ along with a grant https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support-2019 from the Open Philanthropy Project. Announced: 2019-05-09.
Sebastian Farquhar | 12,000.00 | 29 | 2018-10-31 | AI safety | https://web.archive.org/web/20181107040049/http://existence.org/grants-list/ | -- | “to attend conferences and purchase compute for experiments related to his PhD research on uncertainty modeling in neural networks”.
Association for the Advancement of Artificial Intelligence | 20,000.00 | 27 | 2018-10-15 | AI safety | https://web.archive.org/web/20181019174535/http://existence.org/grants-list/ | -- | “for the support of the 2019 conference on AI, Ethics, and Society”.
Ben Goldhaber | 150,000.00 | 8 | 2018-10-10 | AI safety | https://web.archive.org/web/20181019174535/http://existence.org/grants-list/ | -- | “to support his project (co-lead by Jacob Lagerros) to bring x-risk-relevant questions to popular prediction platforms”.
Jessica Taylor | 10,000.00 | 30 | 2018-10-10 | AI safety | https://web.archive.org/web/20181019174535/http://existence.org/grants-list/ | -- | “to work on her research in AI alignment and other areas”.
Stephanie Zolayvar | 100,000.00 | 10 | 2018-10-09 | AI safety | https://web.archive.org/web/20181019174535/http://existence.org/grants-list/ | -- | “to train herself in circling and host circles for people who are promising contributors to reducing x-risk”.
Cambridge in America | 10,000.00 | 30 | 2018-10-02 | AI safety | https://web.archive.org/web/20181019174535/http://existence.org/grants-list/ | -- | “for the support of the Leverhulme Centre for the Future of Intelligence”.
Roxanne Heston | 49,532.00 | 21 | 2018-10-02 | AI safety | https://web.archive.org/web/20181019174535/http://existence.org/grants-list/ | -- | “to work on a variety of AI policy projects in Washington, D.C.”.
Baeo Maltinsky | 55,000.00 | 15 | 2018-10-02 | AI safety | https://web.archive.org/web/20181019174535/http://existence.org/grants-list/ | -- | “to further his research on AI and technology trends”.
Zoe Cremer | 25,200.00 | 23 | 2018-10-02 | AI safety | https://web.archive.org/web/20181019174535/http://existence.org/grants-list/ | -- | “to support her as a visiting fellow at CFI, where she will research disagreements about the amount and kind of structure required for AGI”.
Colleen McKenzie | 51,000.00 | 16 | 2018-10-02 | AI safety | https://web.archive.org/web/20181019174535/http://existence.org/grants-list/ | -- | “to support her research on AI timelines and the processes that produce technical and scientific progress”.
David Manheim | 25,000.00 | 24 | 2018-09-26 | AI safety | https://web.archive.org/web/20181019174535/http://existence.org/grants-list/ | -- | “to research aspects of Goodhart’s law, focusing on multi-agent dynamics”. Probably related to https://arxiv.org/pdf/1803.04585.pdf.
Bryce Hidysmith | 20,500.00 | 25 | 2018-09-24 | AI safety | https://web.archive.org/web/20181019174535/http://existence.org/grants-list/ | -- | “to analyze global risks from technology through a geopolitical lens”.
Jordan Alexander | 1,900.00 | 33 | 2018-09-24 | AI safety | https://web.archive.org/web/20181019174535/http://existence.org/grants-list/ | -- | “to host several meetings at Stanford EA and the Stanford Transhumanist Association”.
Luca Rade | 20,400.00 | 26 | 2018-09-24 | AI safety | https://web.archive.org/web/20181019174535/http://existence.org/grants-list/ | -- | “to research the implications of coarse-graining by an agent in a complex environment for AI alignment”.
Institute for Philosophical Research | 50,000.00 | 17 | 2018-09-07 | AI safety | https://web.archive.org/web/20180921215949/http://existence.org/organization-grants/ | -- | General support. Announced at http://existence.org/2018/09/18/activity-update-july-and-august-2018.html.
Lightcone Infrastructure | 150,000.00 | 8 | 2018-09-04 | AI safety | https://web.archive.org/web/20180921215949/http://existence.org/organization-grants/ | -- | Given via the Center for Applied Rationality, to support the development of LessWrong 2.0. Announced at http://existence.org/2018/09/18/activity-update-july-and-august-2018.html.
UC Berkeley Foundation | 400,000.00 | 3 | 2018-07-24 | AI safety | https://web.archive.org/web/20180921215949/http://existence.org/organization-grants/ | -- | Intended to support Andrew Critch’s work at the Center for Human-Compatible AI. Announced at http://existence.org/2018/09/18/activity-update-july-and-august-2018.html.
Center for Applied Rationality | 300,000.00 | 4 | 2018-07-24 | AI safety | https://web.archive.org/web/20180921215949/http://existence.org/organization-grants/ | -- | General support. Announced at http://existence.org/2018/09/18/activity-update-july-and-august-2018.html and at http://www.rationality.org/resources/updates/2018/august-newsletter (CFAR lists $600,000 over two years as the amount).
Theiss Research | 50,000.00 | 17 | 2018-07-12 | AI safety | https://web.archive.org/web/20180921215949/http://existence.org/organization-grants/ | -- | Intended to support Carrick Flynn’s research with Allan Dafoe at the Future of Humanity Institute. Announced at http://existence.org/2018/09/18/activity-update-july-and-august-2018.html.
Future of Life Institute | 300,000.00 | 4 | 2018-04-10 | AI safety | https://web.archive.org/web/20180905034853/http://existence.org/organization-grants/ https://web.archive.org/web/20180921215949/http://existence.org/organization-grants/ | -- | General support.
Centre for the Study of Existential Risk | 200,000.00 | 7 | 2018-04-06 | AI safety | https://web.archive.org/web/20180921215949/http://existence.org/organization-grants/ | -- | For general support; grant made via Cambridge in America.
Association for the Advancement of Artificial Intelligence | 5,000.00 | 32 | 2018-03-30 | AI safety | https://web.archive.org/web/20180731180958/http://existence.org:80/grants https://web.archive.org/web/20180921215949/http://existence.org/organization-grants/ | -- | Grant in support of the 2018 Spring Symposium on AI and Society: Ethics, Safety and Trustworthiness in Intelligent Agents. See announcement at http://existence.org/2018/04/07/activity-update-february-and-march-2018.html.
Center for Applied Rationality | 800,000.00 | 1 | 2018-01-26 | AI safety | https://web.archive.org/web/20180731180958/http://existence.org:80/grants https://web.archive.org/web/20180921215949/http://existence.org/organization-grants/ | -- | General support for the purpose of a permanent venue for CFAR workshops. See announcement at http://existence.org/2018/02/08/activity-update-january-2018.html.
Miles Brundage | 32,500.00 | 22 | 2018-01-02 | AI safety | https://web.archive.org/web/20180731180958/http://existence.org:80/grants | -- |
Machine Intelligence Research Institute | 100,000.00 | 10 | 2017-12-28 | AI safety | http://existence.org/grants/ | -- | Intended use of funds (category): Organizational general support

Donor retrospective of the donation: BERI would make two further grants to MIRI, indicating continued confidence in the grantee. The last grant would be in March 2020 for $300,000. By that point, BERI would have transitioned these grantmaking responsibilities to the Survival and Flourishing Fund.

Other notes: Announced: 2018-01-11.
Center for Applied Rationality | 100,000.00 | 10 | 2017-12-27 | AI safety | https://web.archive.org/web/20180731180958/http://existence.org:80/grants https://web.archive.org/web/20180921215949/http://existence.org/organization-grants/ | -- | See announcement at http://existence.org/2018/01/11/activity-update-december-2017.html.
Association for the Advancement of Artificial Intelligence | 20,000.00 | 27 | 2017-12-18 | AI safety | https://web.archive.org/web/20180731180958/http://existence.org:80/grants https://web.archive.org/web/20180921215949/http://existence.org/organization-grants/ | -- | To support the conference on Artificial Intelligence, Ethics, and Society. See announcement at http://existence.org/2017/12/01/activity-update-november-2017.html.
Future of Life Institute | 50,000.00 | 17 | 2017-10-27 | AI safety | https://web.archive.org/web/20180731180958/http://existence.org:80/grants https://web.archive.org/web/20180921215949/http://existence.org/organization-grants/ | -- | For general support. See announcement at http://existence.org/2017/11/03/activity-update-october-2017.html.
Future of Life Institute | 100,000.00 | 10 | 2017-10-27 | AI safety | https://web.archive.org/web/20180731180958/http://existence.org:80/grants https://web.archive.org/web/20180921215949/http://existence.org/organization-grants/ | -- |
Machine Intelligence Research Institute | 100,000.00 | 10 | 2017-09-13 | AI safety | http://existence.org/grants | -- | Intended use of funds (category): Organizational general support

Donor reason for selecting the donee: The announcement page says: "Broadly, we believe these groups [Machine Intelligence Research Institute and Future of Life Institute] to have done good work in the past for reducing existential risk and wish to support their continued efforts."

Donor reason for donating at this time (rather than earlier or later): This is one of two opening grants made by BERI to begin its grants program.

Donor thoughts on making further donations to the donee: The grant page says: "Over the next few months, we may write more about our reasoning behind these and other grants." It further outlines the kinds of organizations that BERI will be granting to in the short run.

Donor retrospective of the donation: BERI would make three further grants to MIRI, indicating continued confidence in the grantee. The last grant would be in March 2020 for $300,000. By that point, BERI would have transitioned these grantmaking responsibilities to the Survival and Flourishing Fund.

Other notes: Announced: 2017-09-25.
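
A note on the "Amount rank" column in the list above: tied amounts share a rank, and the next distinct amount skips ahead by the number of ties (standard competition ranking), which is why the four $50,000.00 donations are all rank 17 while the $49,532.00 donation is rank 21. A minimal sketch of that rule (illustrative, not the site's actual code):

    def competition_ranks(amounts):
        """Map each distinct amount to its standard competition rank."""
        ordered = sorted(amounts, reverse=True)
        # Rank = 1-based position of the first occurrence of each amount.
        return {a: ordered.index(a) + 1 for a in set(ordered)}

    ranks = competition_ranks([800000, 600000, 400000, 300000, 300000,
                               300000, 200000, 150000, 150000, 100000])
    assert ranks[300000] == 4 and ranks[200000] == 7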

Similarity to other donors

Sorry, we couldn't find any similar donors.