This is an online portal with information on publicly announced donations (or donations shared with permission) that are of interest to Vipul Naik. The git repository with the code for this portal, as well as all the underlying data, is available on GitHub. All payment amounts are in current United States dollars (USD). The repository of donations is being seeded with an initial collation by Issa Rice, who also continues to contribute (see his commits and the contract work page listing all financially compensated contributions to the site), but all responsibility for errors and inaccuracies belongs to Vipul Naik. Current data is preliminary and has not been completely vetted and normalized; if sharing a link to this site or any page on it, please include the caveat that the data is preliminary (if you want to share without caveats, please check with Vipul Naik). We expect to complete the first round of development by the end of December 2019. See the about page for more details. Also of interest: pageview data on analytics.vipulnaik.com, the tutorial in the README, and the request for feedback on the EA Forum.
|Affiliated organizations (current or former; restricted to potential donees or others relevant to donation decisions)||Centre for Effective Altruism|
|Regularity with which donor updates donations data||irregular|
|Regularity with which Donations List Website updates donations data (after donor update)||irregular|
|Lag with which donor updates donations data||months|
|Lag with which Donations List Website updates donations data (after donor update)||days|
|Data entry method on Donations List Website||Manual (no scripts used)|
Brief history: The funds are a program of the Centre for Effective Altruism (CEA). The creation of the funds was inspired by the success of the EA Giving Group donor-advised fund run by Nick Beckstead, and also by the donor lottery run in December 2016 by Paul Christiano and Carl Shulman (see http://effective-altruism.com/ea/14d/donor_lotteries_demonstration_and_faq/ for more). EA Funds were introduced on 2017-02-09 in the post http://effective-altruism.com/ea/174/introducing_the_ea_funds/ and launched on 2017-02-28 in the post http://effective-altruism.com/ea/17v/ea_funds_beta_launch/. The first round of allocations was announced on 2017-04-20 at http://effective-altruism.com/ea/19d/update_on_effective_altruism_funds/. The funds allocation information appears to have next been updated in November 2017; see https://www.facebook.com/groups/effective.altruists/permalink/1606722932717391/ for more.
Brief notes on broad donor philosophy and major focus areas: There are four EA Funds, each with its own focus area and its own fund managers: Global Health and Development (Elie Hassenfeld of GiveWell); Animal Welfare (Lewis Bollard of the Open Philanthropy Project as chair, Toni Adleberg of Animal Charity Evaluators, Natalie Cargill of Effective Giving, and Jamie Spurgeon of Animal Charity Evaluators); Long Term Future (Matt Fallshaw as chair, Oliver Habryka, Helen Toner, Matt Wage, and Alex Zhu, with Nick Beckstead and Jonas Vollmer as advisors); and EA Community, also known as EA Meta (Luke Ding as chair, Alex Foster, Tara Mac Aulay, Denise Melchin, and Matt Wage, with Nick Beckstead as advisor).
Notes on grant decision logistics: Grants are decided separately within each of the four funds, by the managers of that fund. Allocation of the money may take about a month after the grant decision. Fund managers generally make multiple grants at once, using money accumulated over the preceding few months. For all funds except the Global Health and Development Fund, the target months for making grant decisions are November, February, and June; for the Global Health and Development Fund, the target months are December, March, and July. Actual grant decision months may be one or two months later than the target months.
Notes on grant publication logistics: Grant details are published on the EA Funds website and linked to from the page for the specific fund. Grants allocated together are generally published together on a single page. Grants from the Global Health and Development Fund (managed by Elie Hassenfeld of GiveWell) are usually of two types: (1) grants that are also GiveWell Incubation Grants, which are cross-posted to the GiveWell Incubation Grants page on GiveWell's site (but are listed only under the donor Effective Altruism Funds on the Donations List Website), and (2) grants that are decided along with, and similarly to, GiveWell discretionary regranting.
Notes on grant financing: Finances for each of the funds are maintained separately: individual donors can donate to a specific fund, or split a donation across the funds in proportions they specify. Only money explicitly donated to a fund can be granted out from that fund; other money of the Centre for Effective Altruism (CEA) is not granted out through the funds.
This entity is also a donee.
Full donor page for donor Effective Altruism Funds
We do not have any donee information for the donee AI Safety Camp in our system.
Full donee page for donee AI Safety Camp
|Cause area||Count||Median||Mean||Minimum||10th percentile||20th percentile||30th percentile||40th percentile||50th percentile||60th percentile||70th percentile||80th percentile||90th percentile||Maximum|
If you hover over a cell for a given cause area and year, you will get a tooltip with the number of donees and the number of donations.
Note: Cause area classification used here may not match that used by donor for all cases.
|Cause area||Number of donations||Total||2019|
|AI safety (filter this donor)||1||25,000.00||25,000.00|
Skipping spending graph as there is less than one year's worth of donations.
|Amount (current USD)||Amount rank (out of 1)||Donation date||Cause area||URL||Influencer||Notes|
|25,000.00||1||AI safety||https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl||Oliver Habryka, Alex Zhu, Matt Wage, Helen Toner, Matt Fallshaw||Donation process: Donee submitted a grant application through the application form for the April 2019 round of grants from the Long Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted).
Intended use of funds (category): Organizational general support
Intended use of funds: Grant to fund an upcoming camp in Madrid being organized by AI Safety Camp in April 2019. The camp consists of several weeks of online collaboration on concrete research questions, culminating in a 9-day intensive in-person research camp. The goal is to support aspiring researchers of AI alignment to boost themselves into productivity.
Donor reason for selecting the donee: The grant investigator and main influencer Oliver Habryka mentions that: (1) he has a positive impression of the organizers and has received positive feedback from participants in the first two AI Safety Camps, and (2) he sees a greater need to improve access to opportunities in AI alignment for people in Europe. Habryka also mentions an associated risk: making the AI Safety Camp the focal point of the AI safety community in Europe could cause problems if the quality of the people involved isn't high. He mentions two more specific concerns: (a) organizing long in-person events is hard and can lead to conflict, as the last two camps did; and (b) people who don't get along with the organizers may find themselves shut out of the AI safety network.
Donor reason for donating that amount (rather than a bigger or smaller amount): Likely the amount requested by the donee in the application (this is not stated explicitly by either the donor or the donee).
Percentage of total donor spend in the corresponding batch of donations: 2.71%
Donor reason for donating at this time (rather than earlier or later): Timing determined by the timing of the camp (which is scheduled for April 2019; the grant is being made around the same time) as well as the timing of the grant round
Intended funding timeframe in months: 1
Donor thoughts on making further donations to the donee: Grant investigator and main influencer Habryka writes: "I would want to engage with the organizers a fair bit more before recommending a renewal of this grant"
Other notes: The grantee in the grant document is listed as Johannes Heidecke, but the grant is for the AI Safety Camp. The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions. The grant decision was coordinated with Effective Altruism Grants (specifically, Nicole Ross of CEA), which had also considered making a grant to the camp. Effective Altruism Grants ultimately decided against making the grant, and the Long Term Future Fund made it instead. In her evaluation for EA Grants, Nicole Ross mentions the same concerns that Habryka does: interpersonal conflict, and people being shut out of the AI safety community if they don't get along with the camp organizers.
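As a rough consistency check (a hypothetical back-calculation, not a figure from the source), the stated 2.71% share lets one estimate the total size of the corresponding batch of donations from this grant's amount; the result is approximate because the percentage is rounded:

```python
# Back-calculate the implied batch total from this grant's stated share.
# Both input figures come from this page; the implied total is an
# estimate, since 2.71% is a rounded percentage.
grant_amount = 25_000.00   # this donation, current USD
batch_share = 0.0271       # 2.71% of the corresponding batch

implied_batch_total = grant_amount / batch_share
print(f"Implied batch total: ${implied_batch_total:,.2f}")
# → Implied batch total: $922,509.23
```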