Effective Altruism Funds: Long-Term Future Fund donations made to Alex Turner

This is an online portal with information on donations that were announced publicly (or have been shared with permission) and that were of interest to Vipul Naik. The git repository with the code for this portal, as well as all the underlying data, is available on GitHub. All payment amounts are in current United States dollars (USD).

The repository of donations is being seeded with an initial collation by Issa Rice, along with continued contributions from him (see his commits and the contract work page listing all financially compensated contributions to the site), but all responsibility for errors and inaccuracies belongs to Vipul Naik.

Current data is preliminary and has not been completely vetted and normalized; if sharing a link to this site or any page on it, please include the caveat that the data is preliminary (if you want to share without caveats, please check with Vipul Naik). We expect to complete the first round of development by the end of March 2022. See the about page for more details. Also of interest: pageview data on analytics.vipulnaik.com, the tutorial in the README, and the request for feedback on the EA Forum.

Basic donor information

We do not have any donor information for the donor Effective Altruism Funds: Long-Term Future Fund in our system.

This entity is also a donee.

Full donor page for donor Effective Altruism Funds: Long-Term Future Fund

Basic donee information

We do not have any donee information for the donee Alex Turner in our system.

Full donee page for donee Alex Turner

Donor–donee relationship

Item | Value

Donor–donee donation statistics

Cause area | Count | Median | Mean | Minimum | 10th percentile | 20th percentile | 30th percentile | 40th percentile | 50th percentile | 60th percentile | 70th percentile | 80th percentile | 90th percentile | Maximum
Overall | 1 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000
AI safety | 1 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000
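
Since this donor–donee pair has exactly one donation, every statistic in the table above other than the count collapses to the amount of that single donation. The following is a minimal Python sketch, illustrative only and not the portal's actual code, of how such a summary row could be computed; the nearest-rank percentile function is an assumption, not necessarily what the site uses.

    # Illustrative sketch only -- not the portal's actual code.
    import statistics

    donations = [30_000]  # the single AI safety donation in this relationship

    def percentile(amounts, p):
        # Nearest-rank percentile; with one data point it always returns that point.
        s = sorted(amounts)
        k = round(p / 100 * (len(s) - 1))
        return s[min(max(k, 0), len(s) - 1)]

    row = {
        "Count": len(donations),
        "Median": statistics.median(donations),
        "Mean": statistics.mean(donations),
        "Minimum": min(donations),
        **{f"{p}th percentile": percentile(donations, p) for p in range(10, 100, 10)},
        "Maximum": max(donations),
    }
    print(row)  # every statistic other than Count is 30000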

Donation amounts by cause area and year

Note: The cause area classification used here may not match the donor's own classification in all cases.

Cause area | Number of donations | Total | 2019
AI safety | 1 | 30,000.00 | 30,000.00
Total | 1 | 30,000.00 | 30,000.00

Skipping the spending graph, as there is less than one year's worth of donations.

Full list of donations in reverse chronological order (1 donation)

Amount (current USD): 30,000.00
Amount rank (out of 1): 1
Donation date: 2019-04-07
Cause area: AI safety/agent foundations
URL: https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl
Influencers: Oliver Habryka, Alex Zhu, Matt Wage, Helen Toner, Matt Fallshaw

Notes:

Donation process: The donee submitted a grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund and was selected as a grant recipient (23 out of almost 100 applications were accepted).

Intended use of funds (category): Living expenses during research project

Intended use of funds: Grant for building towards a “Limited Agent Foundations” thesis on mild optimization and corrigibility. The grantee is a third-year computer science PhD student funded by a graduate teaching assistantship; to dedicate more attention to alignment research, he applied for one or more trimesters of funding (the spring term starts April 1).

Donor reason for selecting the donee: In the grant write-up, Oliver Habryka explains that he is excited by (a) Turner's posts to LessWrong reviewing many math textbooks useful for thinking about the alignment problem, (b) Turner not being intimidated by the complexity of the problem, and (c) Turner writing up his thoughts and hypotheses in a clear way, seeking feedback on them early, and making a set of novel contributions to an interesting sub-field of AI Alignment quite quickly (in the form of his work on impact measures, on which he recently collaborated with the DeepMind AI Safety team).

Donor reason for donating that amount (rather than a bigger or smaller amount): Likely the amount requested by the donee in the application (this is not stated explicitly by either the donor or the donee).
Percentage of total donor spend in the corresponding batch of donations: 3.25%
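Implied batch size: a 3.25% share corresponds to a batch total of roughly 30,000 / 0.0325 ≈ 923,000 USD. (This is an inference from the stated percentage; the batch total is not given on this page.)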

Donor reason for donating at this time (rather than earlier or later): Timing was determined by the timing of the grant round; no specific timing-related considerations are discussed.
Intended funding timeframe in months: 4

Other notes: The grant reasoning was written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions. The comments on the post do not discuss this specific grant.