Effective Altruism Funds: Long-Term Future Fund donations made to Tegan McCaslin

This is an online portal with information on donations that were announced publicly (or shared with permission) and that were of interest to Vipul Naik. The git repository with the code for this portal, as well as all the underlying data, is available on GitHub. All payment amounts are in current United States dollars (USD). The repository of donations is being seeded with an initial collation by Issa Rice, along with continued contributions from him (see his commits and the contract work page listing all financially compensated contributions to the site), but all responsibility for errors and inaccuracies belongs to Vipul Naik. Current data is preliminary and has not been completely vetted and normalized; if sharing a link to this site or any page on it, please include the caveat that the data is preliminary (if you want to share without caveats, please check with Vipul Naik). We expect to have completed the first round of development by the end of July 2024. See the about page for more details. Also of interest: pageview data on analytics.vipulnaik.com, the tutorial in the README, and the request for feedback on the EA Forum.

Basic donor information

Item | Value
Country | United Kingdom
Affiliated organizations (current or former; restricted to potential donees or others relevant to donation decisions) | Centre for Effective Altruism
Website | https://app.effectivealtruism.org/funds/far-future
Donations URL | https://app.effectivealtruism.org/
Regularity with which donor updates donations data | irregular
Regularity with which Donations List Website updates donations data (after donor update) | irregular
Lag with which donor updates donations data | months
Lag with which Donations List Website updates donations data (after donor update) | days
Data entry method on Donations List Website | Manual (no scripts used)

Brief history: This is one of four Effective Altruism Funds, a program of the Centre for Effective Altruism (CEA). The creation of the funds was inspired by the success of the EA Giving Group donor-advised fund run by Nick Beckstead, and also by the donor lottery run in December 2016 by Paul Christiano and Carl Shulman (see https://forum.effectivealtruism.org/posts/WvPEitTCM8ueYPeeH/donor-lotteries-demonstration-and-faq (GW, IR) for more). EA Funds were introduced on 2017-02-09 in the post https://forum.effectivealtruism.org/posts/a8eng4PbME85vdoep/introducing-the-ea-funds (GW, IR) and launched on 2017-02-28 in the post https://forum.effectivealtruism.org/posts/iYoSAXhodpxJFwdQz/ea-funds-beta-launch (GW, IR). The first round of allocations was announced on 2017-04-20 at https://forum.effectivealtruism.org/posts/MsaS8JKrR8nnxyPkK/update-on-effective-altruism-funds (GW, IR). The fund allocation information appears to have next been updated in November 2017; see https://www.facebook.com/groups/effective.altruists/permalink/1606722932717391/ for more. This particular fund was previously called the Far Future Fund; it was renamed the Long-Term Future Fund to more accurately reflect its focus.

Brief notes on broad donor philosophy and major focus areas: As the name suggests, the Fund's focus area is activities that could significantly affect the long-term future. Historically, the Fund has focused on areas such as AI safety and epistemic institutions, though it has also made grants related to biosecurity and other global catastrophic risks. At inception, the Fund had Nick Beckstead of Open Philanthropy as its sole manager. Beckstead stepped down in August 2018, and in October 2018 the post https://forum.effectivealtruism.org/posts/yYHKRgLk9ufjJZn23/announcing-new-ea-funds-management-teams (GW, IR) announced a new management team for the Fund, comprising chair Matt Fallshaw and team members Helen Toner, Oliver Habryka, Matt Wage, and Alex Zhu, with advisors Nick Beckstead and Jonas Vollmer.

Notes on grant decision logistics: Money from the fund is supposed to be granted about thrice a year, with the target months being November, February, and June. Actual grant months may differ from the target months. The amount of money granted with each decision cycle depends on the amount of money available in the Fund as well as on the available donation opportunities. Grant applications can be submitted any time; any submitted applications will be considered prior to the next grant round (each grant round has a deadline by which applications must be submitted to be considered).

Notes on grant publication logistics: Grant details are published on the EA Funds website, and linked to from the Fund page. Each grant is accompanied by a brief description of the grantee's work (and hence, the intended use of funds) as well as reasons the grantee was considered impressive. In April 2019, the write-up for each grant at https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl had just one author (rather than group authorship), likely the management team member who did the most work on that particular grant. Grant write-ups vary greatly in length; in April 2019, the write-ups by Oliver Habryka were the most thorough.

Notes on grant financing: Money in the Long-Term Future Fund only includes funds explicitly donated for that Fund. In each grant round, the amount of money that can be allocated is limited by the balance available in the fund at that time.

This entity is also a donee.

Full donor page for donor Effective Altruism Funds: Long-Term Future Fund

Basic donee information

We do not have any donee information for the donee Tegan McCaslin in our system.

Full donee page for donee Tegan McCaslin

Donor–donee relationship

Item | Value

Donor–donee donation statistics

Cause area | Count | Median | Mean | Minimum | 10th percentile | 20th percentile | 30th percentile | 40th percentile | 50th percentile | 60th percentile | 70th percentile | 80th percentile | 90th percentile | Maximum
Overall | 1 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000
AI safety | 1 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000 | 30,000

Donation amounts by cause area and year


Note: Cause area classification used here may not match that used by donor for all cases.

Cause area | Number of donations | Total | 2019
AI safety | 1 | 30,000.00 | 30,000.00
Total | 1 | 30,000.00 | 30,000.00

Skipping spending graph as there is at most one year’s worth of donations.

Full list of documents in reverse chronological order (0 documents)

There are no documents associated with this combination of donor and donee.

Full list of donations in reverse chronological order (1 donation)

[Graph: all donations with a known year of donation, showing the timeframe of each donation]
Amount (current USD): 30,000.00 (amount rank: 1 out of 1)
Donation date: 2019-03-20
Cause area: AI safety/forecasting
URL: https://funds.effectivealtruism.org/funds/payouts/april-2019-long-term-future-fund-grants-and-recommendations
Influencers: Oliver Habryka, Alex Zhu, Matt Wage, Helen Toner, Matt Fallshaw

Donation process: Donee submitted a grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund and was selected as a grant recipient (23 out of almost 100 applications were accepted)

Intended use of funds (category): Living expenses during project

Intended use of funds: Grant for independent research projects relevant to AI forecasting and strategy, including (but not necessarily limited to) some of the following: (1) Does the trajectory of AI capability development match that of biological evolution? (2) How tractable is long-term forecasting? (3) How much compute did evolution use to produce intelligence? (4) Benchmarking AI capabilities against insects. A short doc on (1) and (2) is at https://docs.google.com/document/d/1hTLrLXewF-_iJiefyZPF6L677bLrUTo2ziy6BQbxqjs/edit

Donor reason for selecting the donee: Reasons for the grant from Oliver Habryka, the main influencer, include: (1) it is easier to relocate someone who has already demonstrated trust and skills than to find someone completely new; (2) it is important to give good researchers runway while they find the right place. Habryka notes: "my brief assessment of Tegan’s work was not the reason why I recommended this grant, and if Tegan asks for a new grant in 6 months to focus on solo research, I will want to spend significantly more time reading her output and talking with her, to understand how these questions were chosen and what precise relation they have to forecasting technological progress in AI."

Donor reason for donating that amount (rather than a bigger or smaller amount): Likely the amount requested by the donee in the application (this is not stated explicitly by either the donor or the donee). Habryka also mentions that he is interested only in providing limited runway, and would need to assess much more carefully for a more long-term grant.
Percentage of total donor spend in the corresponding batch of donations: 3.25%
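(As a rough consistency check, derived from figures on this page rather than stated by the donor: 30,000 / 0.0325 ≈ 923,000, so the April 2019 grant round appears to have disbursed roughly 923,000 USD in total.)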

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round. However, it is also related to the grantee's situation (she has just quit her job at AI Impacts, and needs financial runway to continue pursuing promising research projects)
Intended funding timeframe in months: 6

Donor thoughts on making further donations to the donee: The grant investigator Oliver Habryka notes: "if Tegan asks for a new grant in 6 months to focus on solo research, I will want to spend significantly more time reading her output and talking with her, to understand how these questions were chosen and what precise relation they have to forecasting technological progress in AI."

Other notes: The grant reasoning was written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR). The comments on the post do not discuss this specific grant, but a grant to Lauren Lee that includes somewhat similar reasoning (providing people runway after they leave their jobs, so that they can explore better) attracts some criticism.