This is an online portal with information on donations of interest to Vipul Naik that were announced publicly (or have been shared with permission). The git repository with the code for this portal, as well as all the underlying data, is available on GitHub. All payment amounts are in current United States dollars (USD).

The repository of donations is being seeded with an initial collation by Issa Rice, along with his continued contributions (see his commits and the contract work page listing all financially compensated contributions to the site), but all responsibility for errors and inaccuracies belongs to Vipul Naik. Current data is preliminary and has not been completely vetted and normalized; if sharing a link to this site or any page on it, please include the caveat that the data is preliminary (if you want to share without caveats, please check with Vipul Naik). We expect to complete the first round of development by the end of December 2019. See the about page for more details.

Also of interest: pageview data on analytics.vipulnaik.com, the tutorial in the README, and the request for feedback on the EA Forum.
We do not have any donor information for the donor EA Giving Group in our system.
This entity is also a donee.
Full donor page for donor EA Giving Group
|Timelines wiki page||https://timelines.issarice.com/wiki/Timeline_of_Berkeley_Existential_Risk_Initiative|
|Org Watch page||https://orgwatch.issarice.com/?organization=Berkeley+Existential+Risk+Initiative|
|Key people||Andrew Critch, Gina Stuessy, Michael Keenan|
|Notes||Launched to provide fast-moving support to existing existential risk organizations. Works closely with the Machine Intelligence Research Institute, the Center for Human-Compatible AI, the Centre for the Study of Existential Risk, and the Future of Humanity Institute. Its staff are closely involved with MIRI and the Center for Applied Rationality.|
This entity is also a donor.
Full donee page for donee Berkeley Existential Risk Initiative
If you hover over a cell for a given cause area and year, you will get a tooltip with the number of donees and the number of donations.
Note: The cause area classification used here may not always match the classification used by the donor.
|Cause area||Number of donations||Total||2017|
|AI safety (filter this donor)||1||35,161.98||35,161.98|
Skipping the spending graph as there is less than one year’s worth of donations.
|Amount (current USD)||Donation date||Cause area||URL||Influencer||Notes|
|35,161.98||AI safety/other global catastrophic risks||https://app.effectivealtruism.org/funds/far-future/payouts/OzIQqsVacUKw0kEuaUGgI||Nick Beckstead||Grant discussed, along with the reasoning, at http://effective-altruism.com/ea/19d/update_on_effective_altruism_funds/. The grantee approached Nick Beckstead with a grant proposal asking for 50,000 USD. Beckstead first provided all the money already donated to the Far Future Fund of Effective Altruism Funds, and made up the remainder via the EA Giving Group and some personal funds. It is not clear how much came from personal funds, so for simplicity we attribute the entire remainder to the EA Giving Group (creating some inaccuracy).|