This is an online portal with information on donations that were announced publicly (or have been shared with permission) and that were of interest to Vipul Naik. The git repository with the code for this portal, as well as all the underlying data, is available on GitHub. All payment amounts are in current United States dollars (USD). The repository of donations is being seeded with an initial collation by Issa Rice as well as continued contributions from him (see his commits and the contract work page listing all financially compensated contributions to the site), but all responsibility for errors and inaccuracies belongs to Vipul Naik. Current data is preliminary and has not been completely vetted and normalized; if sharing a link to this site or any page on this site, please include the caveat that the data is preliminary (if you want to share without including caveats, please check with Vipul Naik). We expect to have completed the first round of development by the end of March 2022. See the about page for more details. Also of interest: pageview data on analytics.vipulnaik.com, the tutorial in the README, and the request for feedback on the EA Forum.
We do not have any donee information for Nikhil Kunapuli in our system.
|Cause area|Count|Median|Mean|Minimum|10th percentile|20th percentile|30th percentile|40th percentile|50th percentile|60th percentile|70th percentile|80th percentile|90th percentile|Maximum|
|Effective Altruism Funds: Long-Term Future Fund|1|30,000.00|30,000.00|30,000.00|30,000.00|30,000.00|30,000.00|30,000.00|30,000.00|30,000.00|30,000.00|30,000.00|30,000.00|30,000.00|
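Because there is exactly one recorded donation to this donee, every statistic in the row above equals the single donation amount of 30,000.00. Below is a minimal sketch of how such a summary row can be computed from the list of donation amounts; the numpy-based calculation is an illustrative assumption, not necessarily how the portal generates its tables.

```python
import numpy as np

# Donation amounts (current USD) recorded for this donee; only one in this case.
amounts = [30_000.00]

summary = {
    "Count": len(amounts),
    "Median": float(np.median(amounts)),
    "Mean": float(np.mean(amounts)),
    "Minimum": float(np.min(amounts)),
    # 10th through 90th percentiles, mirroring the table columns above.
    **{f"{p}th percentile": float(np.percentile(amounts, p)) for p in range(10, 100, 10)},
    "Maximum": float(np.max(amounts)),
}

print(summary)  # with a single donation, every statistic is 30000.0
```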
|Donor|Amount (current USD)|Amount rank (out of 1)|Donation date|Cause area|URL|Influencer|Notes|
|Effective Altruism Funds: Long-Term Future Fund|30,000.00|1||AI safety/deconfusion research|https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl|Alex Zhu, Matt Wage, Helen Toner, Matt Fallshaw, Oliver Habryka|See notes below|
Donation process: Donee submitted a grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund and was selected as a grant recipient (23 out of almost 100 applications were accepted)
Intended use of funds (category): Living expenses during research project
Intended use of funds: Grantee is doing independent deconfusion research for AI safety. His approach is to develop a better foundational understanding of AI safety concepts such as safe exploration and robustness to distributional shift by exploring these concepts in complex systems science and theoretical biology, domains outside machine learning where they also apply.
Donor reason for selecting the donee: Fund manager Alex Zhu says: "I recommended that we fund Nikhil because I think Nikhil’s research directions are promising, and because I personally learn a lot about AI safety every time I talk with him."
Donor reason for donating that amount (rather than a bigger or smaller amount): Likely to be the amount requested by the donee in the application (this is not stated explicitly by either the donor or the donee)
Percentage of total donor spend in the corresponding batch of donations: 3.25% (see the arithmetic sketch after these notes for a consistency check)
Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round. No specific timing-related considerations are discussed
Donor thoughts on making further donations to the donee: Alex Zhu, in his grant write-up, says that the quality of the work will be assessed by researchers at MIRI. Although it is not explicitly stated, it is likely that this evaluation will influence the decision of whether to make further grants
Other notes: The grant reasoning is written up by Alex Zhu and is also included in the cross-post of the grant decision to the Effective Altruism Forum at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR), but the comments on the post do not discuss this specific grant.
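The 3.25% batch-share figure in the notes above can be checked directly against the donation amount. The sketch below is plain arithmetic based only on the numbers stated in this entry (it is not taken from the portal's code) and recovers the implied total size of the April 2019 grant batch:

```python
donation = 30_000.00   # this grant, in current USD
batch_share = 0.0325   # 3.25% of the corresponding batch, as stated in the notes

# Implied total spend for the corresponding batch of Long-Term Future Fund grants.
implied_batch_total = donation / batch_share
print(f"Implied batch total: ${implied_batch_total:,.0f}")  # about $923,000
```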