This is an online portal with information on donations of interest to Vipul Naik that were announced publicly (or shared with permission). The git repository with the code for this portal, as well as all the underlying data, is available on GitHub. All payment amounts are in current United States dollars (USD). The repository of donations is being seeded with an initial collation by Issa Rice (see his commits), but all responsibility for errors and inaccuracies belongs to Vipul Naik. Current data is preliminary and has not been completely vetted and normalized; if sharing a link to this site or any page on it, please include the caveat that the data is preliminary (to share without caveats, please check with Vipul Naik). We expect to complete the first round of development by the end of December 2018. See the about page for more details.
Data entry method on Donations List Website: Manual (no scripts used)
Miscellaneous notes: Loren Merritt is famous as the creator of x264 (https://en.wikipedia.org/wiki/X264), a free software library for encoding video streams into the H.264/MPEG-4 AVC format, which appears to have been the main source of his wealth (see http://www.x264.nl/developers/Dark_Shikari/loren.html and https://news.ycombinator.com/item?id=320102 for related discussion). See the comment http://lesswrong.com/lw/ftg/2012_winter_fundraiser_for_the_singularity/807l for the link between his LessWrong handle and his real name; also see https://mailman.videolan.org/pipermail/x264-devel/2008-March/004227.html for other evidence of his use of the pengvado handle.
If you hover over a cell for a given cause area and year, you will get a tooltip with the number of donees and the number of donations.
Note: Cause area classification used here may not match that used by donor for all cases.
| Cause area | Number of donations | Number of donees | Total | 2017 | 2016 | 2014 | 2013 | 2012 | 2011 |
|---|---|---|---|---|---|---|---|---|---|
| AI risk (filter this donor) | 6 | 1 | 525,000.00 | 25,000.00 | 115,000.00 | 0.00 | 245,000.00 | 130,000.00 | 10,000.00 |
| Rationality improvement (filter this donor) | 1 | 1 | 40,000.00 | 0.00 | 0.00 | 40,000.00 | 0.00 | 0.00 | 0.00 |
Graph of spending by cause area and year (incremental, not cumulative)
Graph of spending by cause area and year (cumulative)
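The cumulative graph can be derived from the incremental one by taking a running sum of each cause area's yearly totals. A minimal sketch in Python, using the AI risk figures from the cause area table above:

```python
from itertools import accumulate

# Yearly AI risk donation totals from the cause area table above (current USD).
yearly = {2011: 10_000, 2012: 130_000, 2013: 245_000,
          2014: 0, 2016: 115_000, 2017: 25_000}

years = sorted(yearly)
incremental = [yearly[y] for y in years]   # values plotted in the incremental graph
cumulative = list(accumulate(incremental))  # values plotted in the cumulative graph
# The final cumulative value is 525,000, matching the AI risk total in the table.
```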
If you hover over a cell for a given subcause area and year, you will get a tooltip with the number of donees and the number of donations.
For the meaning of “classified” and “unclassified”, see the page clarifying this.
| Subcause area | Number of donations | Number of donees | Total | 2017 | 2016 | 2014 | 2013 | 2012 | 2011 |
|---|---|---|---|---|---|---|---|---|---|
Graph of spending by subcause area and year (incremental, not cumulative)
Graph of spending by subcause area and year (cumulative)
| Donee | Cause area | Links | Total | 2017 | 2016 | 2014 | 2013 | 2012 | 2011 |
|---|---|---|---|---|---|---|---|---|---|
| Machine Intelligence Research Institute (filter this donor) | AI risk | FB Tw WP Site CN GS TW | 525,000.00 | 25,000.00 | 115,000.00 | 0.00 | 245,000.00 | 130,000.00 | 10,000.00 |
| Center for Applied Rationality (filter this donor) | Rationality | FB Tw WP Site TW | 40,000.00 | 0.00 | 0.00 | 40,000.00 | 0.00 | 0.00 | 0.00 |
Graph of spending by donee and year (incremental, not cumulative)
Graph of spending by donee and year (cumulative)
Sorry, we couldn't find any influencer information.
Sorry, we couldn't find any disclosures information.
Sorry, we couldn't find any country information.
| Donee | Amount (current USD) | Donation date | Cause area | URL | Influencer | Notes |
|---|---|---|---|---|---|---|
| Machine Intelligence Research Institute | 25,000.00 | | AI risk | http://web.archive.org/web/20171223071315/https://intelligence.org/topcontributors/ | -- | Total amount donated by Loren Merritt to MIRI as of 2017-12-23 is $525,000. The amount listed as of October 2017 (see http://web.archive.org/web/20171003083300/https://intelligence.org/topcontributors/) was $500,000, so the extra $25,000 was donated between those months. |
| Machine Intelligence Research Institute | 115,000.00 | | AI risk | http://web.archive.org/web/20170204024838/https://intelligence.org/topdonors/ | -- | Total amount donated up to this point is listed as $500,000. The amount listed as of November 2016 at http://web.archive.org/web/20161118163935/https://intelligence.org/topdonors/ is $385,000. The additional $115,000 was likely donated at the end of 2016. |
| Center for Applied Rationality | 40,000.00 | | Rationality improvement | http://lesswrong.com/lw/jej/why_cfar/ab0o | -- | Donation is announced in response to the post http://lesswrong.com/lw/jej/why_cfar/ by Anna Salamon of CFAR, making the case for donating to it. |
| Machine Intelligence Research Institute | 245,000.00 | | AI risk | http://web.archive.org/web/20140403110808/http://intelligence.org/topdonors/ | -- | Total amount donated up to this point is listed as $385,000. Of this, $140,000 is accounted for by explicitly disclosed donations; the remainder is approximately attributed to 2013. |
| Machine Intelligence Research Institute | 20,000.00 | | AI risk | http://lesswrong.com/lw/ftg/2012_winter_fundraiser_for_the_singularity/7zt4 | -- | Donation is announced in response to the post http://lesswrong.com/lw/ftg/2012_winter_fundraiser_for_the_singularity/ for the MIRI 2012 winter fundraiser. |
| Machine Intelligence Research Institute | 110,000.00 | | AI risk | http://lesswrong.com/lw/ftg/2012_winter_fundraiser_for_the_singularity/7zt4 | -- | Donation is announced in response to the post http://lesswrong.com/lw/ftg/2012_winter_fundraiser_for_the_singularity/ for the MIRI 2012 winter fundraiser. |
| Machine Intelligence Research Institute | 10,000.00 | | AI risk | http://lesswrong.com/lw/78s/help_fund_lukeprog_at_siai/4p1x | -- | Donation is announced in response to the post http://lesswrong.com/lw/78s/help_fund_lukeprog_at_siai/ by Eliezer Yudkowsky asking for help to fund Luke Muehlhauser at MIRI (then called SIAI, the Singularity Institute for Artificial Intelligence). |
The following table uses the Jaccard index and cosine similarity to compare donors by the overlap of their donee sets. It shows up to 30 donors, ranked by Jaccard index, and includes only donors with at least one donee in common with this donor.
| Donor | Number of distinct donees | Number of donees in common (intersection) | Union size | Jaccard similarity | Cosine similarity | Weighted cosine similarity |
|---|---|---|---|---|---|---|
| Berkeley Existential Risk Initiative | 1 | 1 | 2 | 0.5 | 0.7071 | 0.9971 |
| Cliff & Stephanie Hyra | 1 | 1 | 2 | 0.5 | 0.7071 | 0.9971 |
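A minimal sketch of how the Jaccard and cosine figures can be computed, treating each donor as a set of donees (equivalently, a 0/1 membership vector over donees). The donee sets below are illustrative, reconstructed from the table rows above:

```python
import math

def jaccard(a: set, b: set) -> float:
    """Jaccard index: |intersection| / |union|."""
    return len(a & b) / len(a | b)

def cosine(a: set, b: set) -> float:
    """Cosine similarity of the 0/1 membership vectors of two sets."""
    return len(a & b) / math.sqrt(len(a) * len(b))

# This donor (Loren Merritt) has 2 distinct donees; Berkeley Existential
# Risk Initiative is shown with 1 distinct donee, 1 in common.
loren = {"Machine Intelligence Research Institute",
         "Center for Applied Rationality"}
beri = {"Machine Intelligence Research Institute"}

j = jaccard(loren, beri)  # intersection 1 / union 2 = 0.5
c = cosine(loren, beri)   # 1 / sqrt(2 * 1) ≈ 0.7071
```

The weighted cosine column presumably weights each donee by donation amounts rather than 0/1 membership; since the exact weighting scheme is not documented on this page, it is not reproduced here.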