This is an online portal with information on donations that were announced publicly (or have been shared with permission) and that were of interest to Vipul Naik. The git repository with the code for this portal, as well as all the underlying data, is available on GitHub. All payment amounts are in current United States dollars (USD).

The repository of donations is being seeded with an initial collation by Issa Rice, along with his continued contributions (see his commits and the contract work page listing all financially compensated contributions to the site), but all responsibility for errors and inaccuracies belongs to Vipul Naik.

Current data is preliminary and has not been completely vetted and normalized; if sharing a link to this site or any page on it, please include the caveat that the data is preliminary (if you want to share without caveats, please check with Vipul Naik). We expect to have completed the first round of development by the end of December 2019. See the about page for more details.

Also of interest: pageview data on analytics.vipulnaik.com, the tutorial in the README, and the request for feedback on the EA Forum.
| Field | Value |
|---|---|
| Affiliated organizations (current or former; restricted to potential donees or others relevant to donation decisions) | Centre for the Study of Existential Risk |
| Data entry method on Donations List Website | Manual (no scripts used) |
| Org Watch page | https://orgwatch.issarice.com/?person=Jaan+Tallinn |
Miscellaneous notes: Tallinn is a co-founder of Skype and Kazaa and, along with Peter Thiel, one of the earlier wealthy supporters of organizations working on AI risk. In 2011, he had a conversation with Holden Karnofsky in which he shared his thoughts on AI risk and in particular on the work of the Singularity Institute (SI), the former name of the Machine Intelligence Research Institute. See https://groups.yahoo.com/neo/groups/givewell/conversations/topics/287 and http://lesswrong.com/lw/cbs/thoughts_on_the_singularity_institute_si/
Note: The cause area classification used here may not match the classification used by the donor in all cases.
| Cause area | Number of donations | Number of donees | Total | 2017 | 2016 | 2014 | 2013 | 2012 |
|---|---|---|---|---|---|---|---|---|
| AI safety (filter this donor) | 8 | 2 | 7,604,500.00 | 7,060,500.00 | 80,000.00 | 100,000.00 | 100,000.00 | 264,000.00 |
Graph of spending by cause area and year (incremental, not cumulative)
Graph of spending by cause area and year (cumulative)
For the meaning of “classified” and “unclassified”, see the page clarifying this.
Graph of spending by subcause area and year (incremental, not cumulative)
Graph of spending by subcause area and year (cumulative)
| Donee | Cause area | Links | Total | 2017 | 2016 | 2014 | 2013 | 2012 |
|---|---|---|---|---|---|---|---|---|
| Berkeley Existential Risk Initiative (filter this donor) | AI safety/other global catastrophic risks | Site TW | 7,000,000.00 | 7,000,000.00 | 0.00 | 0.00 | 0.00 | 0.00 |
| Machine Intelligence Research Institute (filter this donor) | AI safety | FB Tw WP Site CN GS TW | 604,500.00 | 60,500.00 | 80,000.00 | 100,000.00 | 100,000.00 | 264,000.00 |
Graph of spending by donee and year (incremental, not cumulative)
Graph of spending by donee and year (cumulative)
Donation dates are shown at year granularity, per the year-by-year totals above.

| Donee | Amount (current USD) | Amount rank (out of 8) | Donation date | Cause area | URL | Influencer | Notes |
|---|---|---|---|---|---|---|---|
| Berkeley Existential Risk Initiative | 5,000,000.00 | 8 | 2017 | AI safety | http://existence.org/2018/01/11/activity-update-december-2017.html | -- | Donation amount approximate. |
| Machine Intelligence Research Institute | 60,500.00 | 1 | 2017 | AI safety | https://web.archive.org/web/20170204024838/https://intelligence.org/topdonors/ | -- | -- |
| Berkeley Existential Risk Initiative | 2,000,000.00 | 7 | 2017 | AI safety | http://existence.org/grants | -- | -- |
| Machine Intelligence Research Institute | 80,000.00 | 2 | 2016 | AI safety | https://web.archive.org/web/20160115172820/https://intelligence.org/donortools/topdonors.php | -- | -- |
| Machine Intelligence Research Institute | 100,000.00 | 3 | 2014 | AI safety | http://archive.today/2014.10.10-021359/http://intelligence.org/topdonors/ | -- | -- |
| Machine Intelligence Research Institute | 100,000.00 | 3 | 2013 | AI safety | http://archive.today/2013.10.21-235551/http://intelligence.org/topdonors/ | -- | -- |
| Machine Intelligence Research Institute | 109,000.00 | 5 | 2012 | AI safety | https://web.archive.org/web/20120918094656/http://singularity.org:80/topdonors/ | -- | -- |
| Machine Intelligence Research Institute | 155,000.00 | 6 | 2012 | AI safety | https://web.archive.org/web/20120719220051/http://singularity.org:80/topdonors/ | -- | -- |
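As a sanity check on the arithmetic, the overall and per-donee totals in the summary tables above can be recomputed from the individual donations in the full list. This is a minimal sketch; the donation list below is copied directly from the table above:

```python
# Individual donations to this donor's donees, in current USD,
# copied from the full list of donations above.
donations = [
    ("Berkeley Existential Risk Initiative", 5_000_000.00),
    ("Machine Intelligence Research Institute", 60_500.00),
    ("Berkeley Existential Risk Initiative", 2_000_000.00),
    ("Machine Intelligence Research Institute", 80_000.00),
    ("Machine Intelligence Research Institute", 100_000.00),
    ("Machine Intelligence Research Institute", 100_000.00),
    ("Machine Intelligence Research Institute", 109_000.00),
    ("Machine Intelligence Research Institute", 155_000.00),
]

# Overall total across all 8 donations.
total = sum(amount for _, amount in donations)

# Per-donee subtotals, matching the "Total" column of the donee table.
by_donee: dict[str, float] = {}
for donee, amount in donations:
    by_donee[donee] = by_donee.get(donee, 0.0) + amount

print(f"Total: {total:,.2f}")  # Total: 7,604,500.00
for donee, subtotal in by_donee.items():
    print(f"{donee}: {subtotal:,.2f}")
```

The subtotals come out to 7,000,000.00 for the Berkeley Existential Risk Initiative and 604,500.00 for the Machine Intelligence Research Institute, agreeing with the donee table.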