Effective Altruism Funds: Long-Term Future Fund donations made to 80,000 Hours

This is an online portal with information on donations that were announced publicly (or shared with permission) and that were of interest to Vipul Naik. The git repository with the code for this portal, as well as all the underlying data, is available on GitHub. All payment amounts are in current United States dollars (USD). The repository of donations was seeded with an initial collation by Issa Rice, who continues to contribute (see his commits and the contract work page listing all financially compensated contributions to the site), but all responsibility for errors and inaccuracies belongs to Vipul Naik. Current data is preliminary and has not been completely vetted and normalized; if sharing a link to this site or any page on this site, please include the caveat that the data is preliminary (if you want to share without including caveats, please check with Vipul Naik). We expect to have completed the first round of development by the end of July 2024. See the about page for more details. Also of interest: pageview data on analytics.vipulnaik.com, the tutorial in the README, and the request for feedback on the EA Forum.

Basic donor information

Item | Value
Country | United Kingdom
Affiliated organizations (current or former; restricted to potential donees or others relevant to donation decisions) | Centre for Effective Altruism
Website | https://app.effectivealtruism.org/funds/far-future
Donations URL | https://app.effectivealtruism.org/
Regularity with which donor updates donations data | irregular
Regularity with which Donations List Website updates donations data (after donor update) | irregular
Lag with which donor updates donations data | months
Lag with which Donations List Website updates donations data (after donor update) | days
Data entry method on Donations List Website | Manual (no scripts used)

Brief history: This is one of four Effective Altruism Funds that are a program of the Centre for Effective Altruism (CEA). The creation of the funds was inspired by the success of the EA Giving Group donor-advised fund run by Nick Beckstead, and also by the donor lottery run in December 2016 by Paul Christiano and Carl Shulman (see https://forum.effectivealtruism.org/posts/WvPEitTCM8ueYPeeH/donor-lotteries-demonstration-and-faq (GW, IR) for more). EA Funds were introduced on 2017-02-09 in the post https://forum.effectivealtruism.org/posts/a8eng4PbME85vdoep/introducing-the-ea-funds (GW, IR) and launched on 2017-02-28 in the post https://forum.effectivealtruism.org/posts/iYoSAXhodpxJFwdQz/ea-funds-beta-launch (GW, IR). The first round of allocations was announced on 2017-04-20 at https://forum.effectivealtruism.org/posts/MsaS8JKrR8nnxyPkK/update-on-effective-altruism-funds (GW, IR). The fund allocation information appears to have next been updated in November 2017; see https://www.facebook.com/groups/effective.altruists/permalink/1606722932717391/ for more. This particular fund was previously called the Far Future Fund; it was renamed to the Long-Term Future Fund to more accurately reflect its focus.

Brief notes on broad donor philosophy and major focus areas: As the name suggests, the Fund's focus area is activities that could significantly affect the long-term future. Historically, the fund has focused on areas such as AI safety and epistemic institutions, though it has also made grants related to biosecurity and other global catastrophic risks. At inception, the Fund had Nick Beckstead of Open Philanthropy as its sole manager. Beckstead stepped down in August 2018, and in October 2018 the post https://forum.effectivealtruism.org/posts/yYHKRgLk9ufjJZn23/announcing-new-ea-funds-management-teams (GW, IR) announced a new management team for the Fund, comprising Matt Fallshaw (chair), Helen Toner, Oliver Habryka, Matt Wage, and Alex Zhu, with Nick Beckstead and Jonas Vollmer as advisors.

Notes on grant decision logistics: Money from the fund is supposed to be granted about thrice a year, with the target months being November, February, and June. Actual grant months may differ from the target months. The amount of money granted with each decision cycle depends on the amount of money available in the Fund as well as on the available donation opportunities. Grant applications can be submitted any time; any submitted applications will be considered prior to the next grant round (each grant round has a deadline by which applications must be submitted to be considered).

Notes on grant publication logistics: Grant details are published on the EA Funds website, and linked to from the Fund page. Each grant is accompanied by a brief description of the grantee's work (and hence, the intended use of funds) as well as reasons the grantee was considered impressive. In April 2019, the write-up for each grant at https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl had just one author (rather than group authorship), likely the management team member who did the most work on that particular grant. Grant write-ups vary greatly in length; in April 2019, the write-ups by Oliver Habryka were the most thorough.

Notes on grant financing: Money in the Long-Term Future Fund only includes funds explicitly donated for that Fund. In each grant round, the amount of money that can be allocated is limited by the balance available in the fund at that time.

This entity is also a donee.

Full donor page for donor Effective Altruism Funds: Long-Term Future Fund

Basic donee information

Item | Value
Country | United Kingdom
Facebook page | 80000Hours
Website | https://80000hours.org/
Donate page | https://80000hours.org/support-us/donate/
Donors list page | https://80000hours.org/about/donors/
Transparency and financials page | https://80000hours.org/about/credibility/evaluations/
Donation case page | http://effective-altruism.com/ea/15d/why_donate_to_80000_hours/
Twitter username | 80000hours
Wikipedia page | https://en.wikipedia.org/wiki/80%2C000_Hours
Org Watch page | https://orgwatch.issarice.com/?organization=80%2C000+Hours
Key people | William MacAskill, Benjamin Todd, Robert Wiblin
Launch date | 2011-11

Full donee page for donee 80,000 Hours

Donor–donee relationship

Item | Value
(no entries)

Donor–donee donation statistics

Cause area | Count | Median | Mean | Minimum | 10th percentile | 20th percentile | 30th percentile | 40th percentile | 50th percentile | 60th percentile | 70th percentile | 80th percentile | 90th percentile | Maximum
Overall | 2 | 91,450 | 95,725 | 91,450 | 91,450 | 91,450 | 91,450 | 91,450 | 91,450 | 100,000 | 100,000 | 100,000 | 100,000 | 100,000
Effective altruism | 2 | 91,450 | 95,725 | 91,450 | 91,450 | 91,450 | 91,450 | 91,450 | 91,450 | 100,000 | 100,000 | 100,000 | 100,000 | 100,000
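The percentile columns are consistent with a rank-based convention: percentile p maps to the value at 1-indexed rank ceil(p/100 × n) in the sorted donation list. This convention is inferred from the row values, not documented on the site, but the sketch below reproduces the row exactly for the two donations of $91,450 and $100,000:

```python
import math

def rank_percentile(values, p):
    """Inferred convention: value at 1-indexed rank ceil(p/100 * n) of the
    sorted list; p = 0 returns the minimum. Not documented by the site."""
    s = sorted(values)
    if p == 0:
        return s[0]
    return s[math.ceil(p / 100 * len(s)) - 1]

donations = [91_450, 100_000]
mean = sum(donations) / len(donations)  # 95,725, matching the Mean column
row = [rank_percentile(donations, p) for p in range(0, 101, 10)]
# row: minimum through 50th percentile are 91,450; 60th through maximum are 100,000
```

Note that under this convention the "Median" column shows the lower of the two values (91,450) rather than their average, which is why it differs from the mean.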

Donation amounts by cause area and year


Note: Cause area classification used here may not match that used by donor for all cases.

Cause area | Number of donations | Total | 2020 | 2018
Effective altruism | 2 | 191,450.00 | 100,000.00 | 91,450.00
Total | 2 | 191,450.00 | 100,000.00 | 91,450.00

Graph of spending by cause area and year (incremental, not cumulative)


Graph of spending by cause area and year (cumulative)


Full list of documents in reverse chronological order (2 documents)

Document 1
Title (URL linked): 2021 AI Alignment Literature Review and Charity Comparison (GW, IR)
Publication date: 2021-12-23
Author: Larks
Publisher: Effective Altruism Forum
Affected donors: Larks; Effective Altruism Funds: Long-Term Future Fund; Survival and Flourishing Fund; FTX Future Fund
Affected donees: Future of Humanity Institute; Centre for the Governance of AI; Center for Human-Compatible AI; Machine Intelligence Research Institute; Global Catastrophic Risk Institute; Centre for the Study of Existential Risk; OpenAI; Google Deepmind; Anthropic; Alignment Research Center; Redwood Research; Ought; AI Impacts; Global Priorities Institute; Center on Long-Term Risk; Centre for Long-Term Resilience; Rethink Priorities; Convergence Analysis; Stanford Existential Risk Initiative; Effective Altruism Funds: Long-Term Future Fund; Berkeley Existential Risk Initiative; 80,000 Hours; Survival and Flourishing Fund
Document scope: Review of current state of cause area
Cause area: AI safety
Notes: Cross-posted to LessWrong at https://www.lesswrong.com/posts/C4tR3BEpuWviT7Sje/2021-ai-alignment-literature-review-and-charity-comparison (GW, IR). This is the sixth post in a tradition of annual blog posts on the state of AI safety and the work of various organizations in the space over the course of the year; the post is structured similarly to the previous year's post https://forum.effectivealtruism.org/posts/K7Z87me338BQT3Mcv/2020-ai-alignment-literature-review-and-charity-comparison (GW, IR) but has a few new features. The author mentions that he has several conflicts of interest that he cannot individually disclose. He also starts collecting "second preferences" data this year for all the organizations he talks to, i.e., where each organization would like to see funds go other than to itself; the Long-Term Future Fund is the clear winner here. He also announces that he is looking for a research assistant to help with next year's post, given the increasing time demands and his reduced time availability. His final rot13'ed donation decision is to donate to the Long-Term Future Fund so that sufficiently skilled AI safety researchers can make a career with LTFF funding; his second preference for donations is BERI. Many other organizations that he considers likely to be doing excellent work are either already well-funded or do not provide sufficient disclosure.

Document 2
Title (URL linked): 2020 AI Alignment Literature Review and Charity Comparison (GW, IR)
Publication date: 2020-12-21
Author: Larks
Publisher: Effective Altruism Forum
Affected donors: Larks; Effective Altruism Funds: Long-Term Future Fund; Open Philanthropy; Survival and Flourishing Fund
Affected donees: Future of Humanity Institute; Center for Human-Compatible AI; Machine Intelligence Research Institute; Global Catastrophic Risk Institute; Centre for the Study of Existential Risk; OpenAI; Berkeley Existential Risk Initiative; Ought; Global Priorities Institute; Center on Long-Term Risk; Center for Security and Emerging Technology; AI Impacts; Leverhulme Centre for the Future of Intelligence; AI Safety Camp; Future of Life Institute; Convergence Analysis; Median Group; AI Pulse; 80,000 Hours; Survival and Flourishing Fund
Document scope: Review of current state of cause area
Cause area: AI safety
Notes: Cross-posted to LessWrong at https://www.lesswrong.com/posts/pTYDdcag9pTzFQ7vw/2020-ai-alignment-literature-review-and-charity-comparison (GW, IR). This is the fifth post in a tradition of annual blog posts on the state of AI safety and the work of various organizations in the space over the course of the year; the previous year's post is at https://forum.effectivealtruism.org/posts/dpBB24QsnsRnkq5JT/2019-ai-alignment-literature-review-and-charity-comparison (GW, IR). The post is structured very similarly to the previous year's post. It has sections on "Research" and "Finance" for a number of organizations working in the AI safety space, many of whom accept donations. A "Capital Allocators" section discusses major players who allocate funds in the space. A lengthy "Methodological Thoughts" section explains how the author approaches some underlying questions that influence his thoughts on all the organizations. To make selective reading of the document easier, the author ends each paragraph with a hashtag, and lists the hashtags at the beginning of the document. See https://www.lesswrong.com/posts/uEo4Xhp7ziTKhR6jq/reflections-on-larks-2020-ai-alignment-literature-review (GW, IR) for discussion of some aspects of the post by Alex Flint.

Full list of donations in reverse chronological order (2 donations)

Graph of all donations (with known year of donation), showing the timeframe of donations

Donation 1
Amount (current USD): 100,000.00 (rank 1 of 2)
Donation date: 2020-04-14
Cause area: Effective altruism/movement growth/career counseling
URL: https://funds.effectivealtruism.org/funds/payouts/april-2020-long-term-future-fund-grants-and-recommendations
Influencers: Matt Wage, Helen Toner, Oliver Habryka, Adam Gleave

Intended use of funds (category): Organizational general support

Other notes: Percentage of total donor spend in the corresponding batch of donations: 20.48%.

Donation 2
Amount (current USD): 91,450.00 (rank 2 of 2)
Donation date: 2018-08-14
Cause area: Effective altruism/movement growth/career counseling
URL: https://funds.effectivealtruism.org/funds/payouts/july-2018-long-term-future-fund-grants
Influencer: Nick Beckstead

Donation process: The grant from the EA Long-Term Future Fund is part of a final set of grant decisions made by Nick Beckstead (granting $526,000 from the EA Meta Fund and $917,000 from the EA Long-Term Future Fund) as he transitioned out of managing both funds. Due to time constraints, Beckstead relied primarily on the Open Philanthropy Project's investigation of the organization for its 2017 grant https://www.openphilanthropy.org/giving/grants/80000-hours-general-support and 2018 renewal https://www.openphilanthropy.org/giving/grants/80000-hours-general-support-2018.

Intended use of funds (category): Organizational general support

Intended use of funds: Beckstead writes "I recommended these grants with the suggestion that these grantees look for ways to use funding to trade money for saving the time or increasing the productivity of their employees (e.g. subsidizing electronics upgrades or childcare), due to a sense that (i) their work is otherwise much less funding constrained than it used to be, and (ii) spending like this would better reflect the value of staff time and increase staff satisfaction. However, I also told them that I was open to them using these funds to accomplish this objective indirectly (e.g. through salary increases) or using the funds for another purpose if that seemed better to them."

Donor reason for selecting the donee: The grant page references https://www.openphilanthropy.org/giving/grants/80000-hours-general-support-2018 for Beckstead's opinion of the donee. This grant page is short, and in turn links to https://www.openphilanthropy.org/giving/grants/80000-hours-general-support which has a detailed "Case for the grant" section https://www.openphilanthropy.org/giving/grants/80000-hours-general-support#Case_for_the_grant that praises 80,000 Hours' track record in terms of impact-adjusted significant plan changes (IASPCs).

Donor reason for donating that amount (rather than a bigger or smaller amount): Beckstead is also recommending funding from the EA Meta Fund of $75,818 for 80,000 Hours. The grant page says "The amounts I’m granting out to different organizations are roughly proportional to the number of staff they have, with some skew towards MIRI that reflects greater EA Funds donor interest in the Long-Term Future Fund." Also: "I think a number of these organizations could qualify for the criteria of either the Long-Term Future Fund or the EA Community Fund because of their dual focus on EA and longtermism, which is part of the reason that 80,000 Hours is receiving a grant from each fund."
Percentage of total donor spend in the corresponding batch of donations: 9.97%
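The stated batch percentage can be checked against the $917,000 that Beckstead granted from the EA Long-Term Future Fund in this round (per the donation process note above); a quick arithmetic sanity check:

```python
# Check the stated batch percentage: $91,450 granted to 80,000 Hours
# out of the $917,000 granted from the EA Long-Term Future Fund in this round.
grant = 91_450
batch_total = 917_000
pct = round(grant / batch_total * 100, 2)
print(pct)  # 9.97, matching the stated "9.97%"
```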

Donor reason for donating at this time (rather than earlier or later): Timing determined by the timing of this round of grants, which was in turn determined by the need for Beckstead to grant out the money before handing over management of the fund.

Donor retrospective of the donation: Even after fund management moved to new teams, the EA Meta Fund continued making grants to 80,000 Hours; in fact, 80,000 Hours received grant money in each of the three subsequent grant rounds. However, the EA Long-Term Future Fund made no further grants to 80,000 Hours, which suggests that the new management team did not continue to endorse 80,000 Hours as a Long-Term Future Fund grantee.