Effective Altruism Grants donations made (filtered to cause areas matching Global catastrophic risks)

This is an online portal with information on donations that were announced publicly (or shared with permission) and that were of interest to Vipul Naik. The git repository with the code for this portal, as well as all the underlying data, is available on GitHub. All payment amounts are in current United States dollars (USD). The repository of donations is being seeded with an initial collation by Issa Rice, along with continued contributions from him (see his commits and the contract work page listing all financially compensated contributions to the site), but all responsibility for errors and inaccuracies belongs to Vipul Naik. Current data is preliminary and has not been completely vetted and normalized; if sharing a link to this site or any page on this site, please include the caveat that the data is preliminary (if you want to share without including caveats, please check with Vipul Naik). We expect to have completed the first round of development by the end of March 2022. See the about page for more details. Also of interest: pageview data on analytics.vipulnaik.com, the tutorial in the README, and the request for feedback on the EA Forum.

Basic donor information

Country: United Kingdom
Affiliated organizations (current or former; restricted to potential donees or others relevant to donation decisions): Centre for Effective Altruism
Best overview URL: http://effective-altruism.com/ea/1fc/effective_altruism_grants_project_update/
Website: https://www.effectivealtruism.org/grants/
Donations URL: https://docs.google.com/spreadsheets/d/1iBy--zMyIiTgybYRUQZIm11WKGQZcixaCmIaysRmGvk
Regularity with which donor updates donations data: irregular
Regularity with which Donations List Website updates donations data (after donor update): irregular
Lag with which donor updates donations data: months
Lag with which Donations List Website updates donations data (after donor update): days
Data entry method on Donations List Website: SQL insertion commands generated by script https://github.com/riceissa/ea-grants-processing
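The data entry method above (SQL insertion commands generated by a processing script) could look roughly like the sketch below. This is only an illustration: the table name (`donations`) and column names are assumptions for this example, not the actual schema used by the Donations List Website or by the ea-grants-processing script.

```python
# Hypothetical sketch of how a processing script might emit SQL insertion
# commands for one donation row. The "donations" table and its columns are
# illustrative assumptions, not the site's actual schema.
def donation_to_sql(row):
    """Build a single-row INSERT statement, escaping single quotes."""
    cols = ", ".join(row)
    vals = ", ".join("'%s'" % str(v).replace("'", "''") for v in row.values())
    return "INSERT INTO donations (%s) VALUES (%s);" % (cols, vals)

example = {
    "donor": "Effective Altruism Grants",
    "donee": "Amanda Askell",
    "amount": "38358.47",
    "donation_date": "2017-09-29",
    "cause_area": "Global catastrophic risks",
}
print(donation_to_sql(example))
```

A real pipeline would more likely use parameterized queries rather than string escaping; the above only conveys the general shape of script-generated SQL insertion commands.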

Brief history: Effective Altruism Grants (EA Grants for short) is a successor to Effective Altruism Ventures (see http://effective-altruism.com/ea/1b5/announcing_effective_altruism_grants/ for the announcement post). The first batch of EA Grants was publicly announced at the end of September 2017, totaling about £369,924 (with the remainder of the £500,000 total reserved for additional grants to recipients). In https://www.centreforeffectivealtruism.org/blog/cea-s-2017-review-and-2018-plans/ it was announced that EA Grants was moving to a rolling basis for urgent grants and a quarterly review cycle for less urgent grants.

Brief notes on broad donor philosophy and major focus areas: The focus is on filling funding gaps for individuals pursuing early-stage but high-value projects.

Notes on grant decision logistics: For 2017, applications had to be made before a deadline. Of the 722 applications received, 413 (57%) were rejected in the first round. The second round assessed the remaining applicants on their track record, values, and plans, using a rubric that weighted each category in accordance with its predictive power for project success. 63 candidates passed this round and each had three 10-minute interviews; 22 of them were selected. In 2018, EA Grants is moving to a rolling basis for urgent grants and a quarterly basis for other grants, according to https://www.centreforeffectivealtruism.org/blog/cea-s-2017-review-and-2018-plans/.

Notes on grant publication logistics: For 2017, all grants were published in a spreadsheet at https://docs.google.com/spreadsheets/d/1iBy--zMyIiTgybYRUQZIm11WKGQZcixaCmIaysRmGvk but the long-term plan for sharing grants data is unclear.

Notes on grant financing: The £500,000 budget for the first year of Effective Altruism Grants was part of the Centre for Effective Altruism budget; the costs are covered by the Open Philanthropy Project grant https://www.openphilanthropy.org/giving/grants/centre-effective-altruism-general-support and by a couple of other private donors, as described at http://effective-altruism.com/ea/1fc/effective_altruism_grants_project_update/.

Donor donation statistics

Cause area | Count | Median | Mean | Minimum | 10th percentile | 20th percentile | 30th percentile | 40th percentile | 50th percentile | 60th percentile | 70th percentile | 80th percentile | 90th percentile | Maximum
Overall | 6 | 20,097 | 21,531 | 8,280 | 8,280 | 10,718 | 10,718 | 20,097 | 20,097 | 23,647 | 28,082 | 28,082 | 38,358 | 38,358
Global catastrophic risks | 6 | 20,097 | 21,531 | 8,280 | 8,280 | 10,718 | 10,718 | 20,097 | 20,097 | 23,647 | 28,082 | 28,082 | 38,358 | 38,358
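The percentile figures in the statistics table above appear consistent with the nearest-rank method (which explains why the reported median, 20,097, is a data point rather than the midpoint of the two middle values). A minimal sketch, assuming that method and using the six donation amounts from the full list of donations below:

```python
import math

# Donation amounts in current USD, taken from the full list of donations.
amounts = sorted([38358.47, 28082.21, 23647.47, 20097.00, 10718.40, 8279.96])

def nearest_rank(values, p):
    """p-th percentile of a sorted list via the nearest-rank method:
    the smallest value with at least p percent of the data at or below it.
    (Assumed convention; it matches the table but is not documented.)"""
    k = max(1, math.ceil(p / 100 * len(values)))
    return values[k - 1]

total = sum(amounts)                 # 129,183.51, matching the totals tables
mean = total / len(amounts)          # ~21,530.59, displayed as 21,531
median = nearest_rank(amounts, 50)   # 20,097.00, matching the table's median
```

Under this convention the 10th percentile coincides with the minimum and the 90th with the maximum for a six-element dataset, exactly as the table shows.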

Donation amounts by cause area and year

If you hover over a cell for a given cause area and year, you will get a tooltip with the number of donees and the number of donations.

Note: Cause area classification used here may not match that used by donor for all cases.

Cause area | Number of donations | Number of donees | Total | 2017
Global catastrophic risks | 6 | 6 | 129,183.51 | 129,183.51
Total | 6 | 6 | 129,183.51 | 129,183.51

Skipping spending graph as there is less than one year’s worth of donations.

Donation amounts by subcause area and year

If you hover over a cell for a given subcause area and year, you will get a tooltip with the number of donees and the number of donations.

For the meaning of “classified” and “unclassified”, see the page clarifying this.

Subcause area | Number of donations | Number of donees | Total | 2017
Global catastrophic risks | 6 | 6 | 129,183.51 | 129,183.51
Classified total | 6 | 6 | 129,183.51 | 129,183.51
Unclassified total | 0 | 0 | 0.00 | 0.00
Total | 6 | 6 | 129,183.51 | 129,183.51

Skipping spending graph as there is less than one year’s worth of donations.

Donation amounts by donee and year

Donee | Cause area | Metadata | Total | 2017
Amanda Askell | -- | -- | 38,358.47 | 38,358.47
Alliance to Feed the Earth in Disasters | -- | -- | 28,082.21 | 28,082.21
Global Catastrophic Risk Institute | Global catastrophic risks | FB Tw Site | 23,647.47 | 23,647.47
Future of Humanity Institute | Global catastrophic risks/AI safety/Biosecurity and pandemic preparedness | FB Tw WP Site TW | 20,097.00 | 20,097.00
Jade Leung | -- | -- | 10,718.40 | 10,718.40
Girish Sastry | -- | -- | 8,279.96 | 8,279.96
Total | -- | -- | 129,183.51 | 129,183.51

Skipping spending graph as there is less than one year’s worth of donations.

Donation amounts by influencer and year

Sorry, we couldn't find any influencer information.

Donation amounts by disclosures and year

Sorry, we couldn't find any disclosures information.

Donation amounts by country and year

Sorry, we couldn't find any country information.

Full list of donations in reverse chronological order (6 donations)

Donee: Amanda Askell
Amount (current USD): 38,358.47
Amount rank (out of 6): 1
Donation date: 2017-09-29
Cause area: Global catastrophic risks
URL: https://docs.google.com/spreadsheets/d/1iBy--zMyIiTgybYRUQZIm11WKGQZcixaCmIaysRmGvk
Influencer: --
Notes: A comprehensive monograph on the ethics and risks of artificial intelligence, co-written with Hilary Greaves at the University of Oxford. She believes that this monograph could be the foundation of graduate-level courses on AI ethics for computer scientists, and that it would both popularize and provide a solid theoretical foundation for work on the topic. See http://effective-altruism.com/ea/1fc/effective_altruism_grants_project_update/ for more context about the grant program.

Donee: Alliance to Feed the Earth in Disasters (Earmark: David Denkenberger)
Amount (current USD): 28,082.21
Amount rank (out of 6): 2
Donation date: 2017-09-29
Cause area: Global catastrophic risks
URL: https://docs.google.com/spreadsheets/d/1iBy--zMyIiTgybYRUQZIm11WKGQZcixaCmIaysRmGvk
Influencer: --
Notes: Writing two papers on alternative foods that could be scaled up in time to feed everyone in the case of a global catastrophe. This research is predicated on there being an ~80% chance this century of a 10% global food shortfall, and a ~10% chance of agricultural collapse. See http://effective-altruism.com/ea/1fc/effective_altruism_grants_project_update/ for more context about the grant program.

Donee: Girish Sastry
Amount (current USD): 8,279.96
Amount rank (out of 6): 6
Donation date: 2017-09-29
Cause area: Global catastrophic risks
URL: https://docs.google.com/spreadsheets/d/1iBy--zMyIiTgybYRUQZIm11WKGQZcixaCmIaysRmGvk
Influencer: --
Notes: A project-oriented approach to machine learning self-study. The goal is to learn key areas of ML relevant to technical AI safety, to work directly with AI safety researchers in the short run and to better contribute to the field formally in the long run. See http://effective-altruism.com/ea/1fc/effective_altruism_grants_project_update/ for more context about the grant program.

Donee: Future of Humanity Institute (Earmark: Gregory Lewis)
Amount (current USD): 20,097.00
Amount rank (out of 6): 4
Donation date: 2017-09-29
Cause area: Global catastrophic risks
URL: https://docs.google.com/spreadsheets/d/1iBy--zMyIiTgybYRUQZIm11WKGQZcixaCmIaysRmGvk
Influencer: --
Notes: Research into biological risk mitigation with the Future of Humanity Institute. See http://effective-altruism.com/ea/1fc/effective_altruism_grants_project_update/ for more context about the grant program.

Donee: Jade Leung
Amount (current USD): 10,718.40
Amount rank (out of 6): 5
Donation date: 2017-09-29
Cause area: Global catastrophic risks
URL: https://docs.google.com/spreadsheets/d/1iBy--zMyIiTgybYRUQZIm11WKGQZcixaCmIaysRmGvk
Influencer: --
Notes: Accelerated learning experiences to fill personal knowledge gaps, specifically in global governance, international cooperation theory, and the necessary interactive expertise in AI safety research. See http://effective-altruism.com/ea/1fc/effective_altruism_grants_project_update/ for more context about the grant program.

Donee: Global Catastrophic Risk Institute (Earmark: Matthew Gentzel)
Amount (current USD): 23,647.47
Amount rank (out of 6): 3
Donation date: 2017-09-29
Cause area: Global catastrophic risks
URL: https://docs.google.com/spreadsheets/d/1iBy--zMyIiTgybYRUQZIm11WKGQZcixaCmIaysRmGvk
Influencer: --
Notes: Research modeling risks and writing policy/strategy posts. His work will be based at the Global Catastrophic Risk Institute. He will also do self-study based on recommendations from the likes of the Center for Strategic and International Studies and 80,000 Hours' syllabi. See http://effective-altruism.com/ea/1fc/effective_altruism_grants_project_update/ for more context about the grant program.

Similarity to other donors

Sorry, we couldn't find any similar donors.