This is an online portal with information on donations of interest to Vipul Naik that were announced publicly (or have been shared with permission). The git repository with the code for this portal, as well as all the underlying data, is available on GitHub. All payment amounts are in current United States dollars (USD). The repository of donations is being seeded with an initial collation by Issa Rice, as well as continued contributions from him (see his commits and the contract work page listing all financially compensated contributions to the site), but all responsibility for errors and inaccuracies belongs to Vipul Naik. Current data is preliminary and has not been completely vetted and normalized; if sharing a link to this site or any page on this site, please include the caveat that the data is preliminary (if you want to share without including caveats, please check with Vipul Naik). We expect to have completed the first round of development by the end of December 2019. See the about page for more details. Also of interest: pageview data on analytics.vipulnaik.com, the tutorial in the README, and the request for feedback on the EA Forum.
|Affiliated organizations (current or former; restricted to potential donees or others relevant to donation decisions)||Centre for Effective Altruism|
|Regularity with which donor updates donations data||irregular|
|Regularity with which Donations List Website updates donations data (after donor update)||irregular|
|Lag with which donor updates donations data||months|
|Lag with which Donations List Website updates donations data (after donor update)||days|
|Data entry method on Donations List Website||Manual (no scripts used)|
Brief history: The funds are a program of the Centre for Effective Altruism (CEA). The creation of the funds was inspired by the success of the EA Giving Group donor-advised fund run by Nick Beckstead, and also by the donor lottery run in December 2016 by Paul Christiano and Carl Shulman (see http://effective-altruism.com/ea/14d/donor_lotteries_demonstration_and_faq/ for more). EA Funds were introduced on 2017-02-09 in the post http://effective-altruism.com/ea/174/introducing_the_ea_funds/ and launched on 2017-02-28 in the post http://effective-altruism.com/ea/17v/ea_funds_beta_launch/. The first round of allocations was announced on 2017-04-20 at http://effective-altruism.com/ea/19d/update_on_effective_altruism_funds/. The funds allocation information appears to have next been updated in November 2017; see https://www.facebook.com/groups/effective.altruists/permalink/1606722932717391/ for more
Brief notes on broad donor philosophy and major focus areas: There are four EA Funds, each with its own focus area and its own fund managers: Global Health and Development (Elie Hassenfeld of GiveWell); Animal Welfare (Lewis Bollard of the Open Philanthropy Project as Chair, Toni Adleberg of Animal Charity Evaluators, Natalie Cargill of Effective Giving, and Jamie Spurgeon of Animal Charity Evaluators); Long Term Future (Matt Fallshaw as Chair, Oliver Habryka, Helen Toner, Matt Wage, and Alex Zhu; Nick Beckstead and Jonas Vollmer serve as advisors); and EA Community (also known as EA Meta) (Luke Ding as Chair, Alex Foster, Tara Mac Aulay, Denise Melchin, and Matt Wage; Nick Beckstead serves as advisor)
Notes on grant decision logistics: Grants are decided separately within each of the four funds, by the managers of that fund. Allocation of the money may take about a month after the grant decision. Fund managers generally allocate multiple grants together, using money collected over the preceding few months. For all funds except the Global Health and Development Fund, the target months for making grant decisions are November, February, and June. For the Global Health and Development Fund, the target months are December, March, and July. Actual grant decision months may be one or two months later than the target months
Notes on grant publication logistics: Grant details are published on the EA Funds website, and linked to from the page on the specific Fund. Grants allocated together are generally published together on a single page
Notes on grant financing: Finances for each of the funds are maintained separately: individual donors can donate to a specific fund, or to all funds in a specific proportion specified by them. Only money explicitly donated to a fund can be granted out from that fund. Other money of the Centre for Effective Altruism (CEA) is not granted out through the Funds
This entity is also a donee.
Full donor page for donor Effective Altruism Funds
|Donors list page||https://ought.org/about|
|Open Philanthropy Project grant review||https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/ought-general-support|
|Org Watch page||https://orgwatch.issarice.com/?organization=Ought|
|Key people||Andreas Stuhlmüller|
Full donee page for donee Ought
|Cause area||Count||Median||Mean||Minimum||10th percentile||20th percentile||30th percentile||40th percentile||50th percentile||60th percentile||70th percentile||80th percentile||90th percentile||Maximum|
If you hover over a cell for a given cause area and year, you will get a tooltip with the number of donees and the number of donations.
Note: Cause area classification used here may not match that used by donor for all cases.
|Cause area||Number of donations||Total||2019||2018|
|AI safety (filter this donor)||2||60,000.00||50,000.00||10,000.00|
Graph of spending by cause area and year (incremental, not cumulative)
Graph of spending by cause area and year (cumulative)
|Amount (current USD)||Amount rank (out of 2)||Donation date||Cause area||URL||Influencer||Notes|
|50,000.00||1||AI safety||https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl||Matt Wage, Helen Toner, Matt Fallshaw, Alex Zhu, Oliver Habryka||Donation process: Donee submitted grant application through the application form for the April 2019 round of grants from the Long Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted)
Intended use of funds (category): Organization financial buffer
Intended use of funds: No specific information is shared on how the funds will be used at the margin, but the general description gives an idea: "Ought is a nonprofit aiming to implement AI alignment concepts in real-world applications"
Donor reason for selecting the donee: Donor is explicitly interested in diversifying the funder base for the donee, which currently receives almost all its funding from only two sources and is trying to change that. Otherwise, same reason as with the last round of funds https://app.effectivealtruism.org/funds/far-future/payouts/3JnNTzhJQsu4yQAYcKceSi namely "We believe that Ought’s approach is interesting and worth trying, and that they have a strong team. [...] Part of the aim of the grant is to show Ought as an example of the type of organization we are likely to fund in the future."
Donor reason for donating that amount (rather than a bigger or smaller amount): In the write-up for the previous grant of $10,000 at https://app.effectivealtruism.org/funds/far-future/payouts/3JnNTzhJQsu4yQAYcKceSi the donor says: "Our understanding is that hiring is currently more of a bottleneck for them than funding, so we are only making a small grant." The amount this time is bigger ($50,000), but the general principle likely continues to apply
Percentage of total donor spend in the corresponding batch of donations: 5.42%
Donor reason for donating at this time (rather than earlier or later): In the previous grant round, the donor had said "Part of the aim of the grant is to show Ought as an example of the type of organization we are likely to fund in the future." Thus, it makes sense to donate again in this round
Other notes: The grant reasoning is written up by Matt Wage and is also included in the cross-post of the grant decision to the Effective Altruism Forum at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions but the comments on the post do not discuss this specific grant.
|10,000.00||2||AI safety||https://app.effectivealtruism.org/funds/far-future/payouts/3JnNTzhJQsu4yQAYcKceSi||Alex Zhu, Helen Toner, Matt Fallshaw, Matt Wage, Oliver Habryka||Grant made to implement AI alignment concepts in real-world applications. Donee seems more hiring-constrained than fundraising-constrained, hence only a small amount, but donor does believe that donee has a promising approach. Percentage of total donor spend in the corresponding batch of donations: 100.00%.|