AI Safety Unconference donations received

This is an online portal with information on donations of interest to Vipul Naik that were announced publicly or have been shared with permission. The git repository with the code for this portal, as well as all the underlying data, is available on GitHub. All payment amounts are in current United States dollars (USD). The repository of donations is being seeded with an initial collation by Issa Rice, along with his continued contributions (see his commits and the contract work page listing all financially compensated contributions to the site), but all responsibility for errors and inaccuracies belongs to Vipul Naik. Current data is preliminary and has not been completely vetted and normalized; if sharing a link to this site or any page on it, please include the caveat that the data is preliminary (to share without caveats, please check with Vipul Naik first). We expect to complete the first round of development by the end of December 2019. See the about page for more details. Also of interest: pageview data on analytics.vipulnaik.com, the tutorial in the README, and the request for feedback on the EA Forum.

Table of contents

Basic donee information

We do not have any donee information for the donee AI Safety Unconference in our system.

Donee donation statistics

Cause area | Count | Median | Mean | Minimum | 10th percentile | 20th percentile | 30th percentile | 40th percentile | 50th percentile | 60th percentile | 70th percentile | 80th percentile | 90th percentile | Maximum
Overall | 1 | 4,500 | 4,500 | 4,500 | 4,500 | 4,500 | 4,500 | 4,500 | 4,500 | 4,500 | 4,500 | 4,500 | 4,500 | 4,500
AI safety | 1 | 4,500 | 4,500 | 4,500 | 4,500 | 4,500 | 4,500 | 4,500 | 4,500 | 4,500 | 4,500 | 4,500 | 4,500 | 4,500

Donation amounts by donor and year for donee AI Safety Unconference

Donor | Total | 2018
Effective Altruism Funds (filter this donee) | 4,500.00 | 4,500.00
Total | 4,500.00 | 4,500.00

Full list of donations in reverse chronological order (1 donation)

Donor: Effective Altruism Funds
Amount (current USD): 4,500.00
Amount rank (out of 1): 1
Donation date: 2018-11-29
Cause area: AI safety
URL: https://app.effectivealtruism.org/funds/far-future/payouts/3JnNTzhJQsu4yQAYcKceSi
Influencer: Alex Zhu, Helen Toner, Matt Fallshaw, Matt Wage, Oliver Habryka
Notes: Orpheus Lummis and Vaughn DiMarco are organizing an unconference on AI Alignment on the last day of the NeurIPS conference, with the goal of facilitating networking and research on AI Alignment among a diverse audience of AI researchers with and without safety backgrounds. Based on interaction with the organizers and some participants, the donor feels this project is worth funding. However, the donee is still not sure whether the unconference will be held, so the grant is conditional on the donee deciding to proceed. The grant would fully fund the request. Percentage of total donor spend in the corresponding batch of donations: 100.00%.