Effective Altruism Funds: Long-Term Future Fund donations made (filtered to cause areas matching Forecasting)

This is an online portal with information on donations that were announced publicly (or shared with permission) and that were of interest to Vipul Naik. The git repository with the code for this portal, as well as all the underlying data, is available on GitHub. All payment amounts are in current United States dollars (USD).

The repository of donations is being seeded with an initial collation by Issa Rice, along with continued contributions from him (see his commits and the contract work page listing all financially compensated contributions to the site), but all responsibility for errors and inaccuracies belongs to Vipul Naik. Current data is preliminary and has not been completely vetted and normalized; if sharing a link to this site or any page on it, please include the caveat that the data is preliminary (to share without caveats, please check with Vipul Naik). We expect to complete the first round of development by the end of March 2022. See the about page for more details. Also of interest: pageview data on analytics.vipulnaik.com, the tutorial in the README, and the request for feedback on the EA Forum.

Basic donor information

We do not have any donor information for the donor Effective Altruism Funds: Long-Term Future Fund in our system.

This entity is also a donee.

Donor donation statistics

Cause area | Count | Median | Mean | Minimum | 10th percentile | 20th percentile | 30th percentile | 40th percentile | 50th percentile | 60th percentile | 70th percentile | 80th percentile | 90th percentile | Maximum
Overall | 6 | 27,000 | 39,500 | 20,000 | 20,000 | 20,000 | 20,000 | 27,000 | 27,000 | 30,000 | 70,000 | 70,000 | 70,000 | 70,000
Forecasting | 3 | 70,000 | 53,333 | 20,000 | 20,000 | 20,000 | 20,000 | 70,000 | 70,000 | 70,000 | 70,000 | 70,000 | 70,000 | 70,000
AI safety | 3 | 27,000 | 25,667 | 20,000 | 20,000 | 20,000 | 20,000 | 27,000 | 27,000 | 27,000 | 30,000 | 30,000 | 30,000 | 30,000
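
The percentile columns above are consistent with the nearest-rank convention: the value at 1-based rank ceil(p * n) in the sorted list, with no interpolation between data points. A minimal sketch assuming that convention, which reproduces the table's values (the portal's actual implementation may differ):

```python
import math

def nearest_rank_percentile(values, p):
    """Nearest-rank percentile: the value at 1-based rank ceil(p * n)
    in the sorted list; no interpolation between data points."""
    ordered = sorted(values)
    rank = max(1, math.ceil(p * len(ordered)))
    return ordered[rank - 1]

# The six donation amounts listed on this page (current USD).
amounts = [70000, 30000, 70000, 27000, 20000, 20000]

print(sum(amounts) / len(amounts))            # mean: 39500.0
print(nearest_rank_percentile(amounts, 0.5))  # median (50th percentile): 27000
print(nearest_rank_percentile(amounts, 0.7))  # 70th percentile: 70000
```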

Donation amounts by cause area and year

If you hover over a cell for a given cause area and year, you will get a tooltip with the number of donees and the number of donations.

Note: The cause area classification used here may not match the donor's own classification in all cases.

Cause area | Number of donations | Number of donees | Total | 2019 | 2018
Forecasting | 3 | 2 | 160,000.00 | 140,000.00 | 20,000.00
AI safety | 3 | 3 | 77,000.00 | 77,000.00 | 0.00
Total | 6 | 5 | 237,000.00 | 217,000.00 | 20,000.00
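
The totals in this table follow from a straightforward group-and-sum over the six donations listed at the bottom of this page. A minimal sketch of that aggregation (donation data copied from this page; this is not the portal's actual code):

```python
from collections import defaultdict

# (donee, cause area, year, amount in current USD) for the six donations on this page.
donations = [
    ("Foretold",       "Forecasting", 2019, 70000),
    ("Metaculus",      "Forecasting", 2019, 70000),
    ("Tegan McCaslin", "AI safety",   2019, 30000),
    ("Jacob Lagerros", "AI safety",   2019, 27000),
    ("Connor Flexman", "AI safety",   2019, 20000),
    ("Foretold",       "Forecasting", 2018, 20000),
]

totals = defaultdict(float)   # (cause area, year) -> total amount
donees = defaultdict(set)     # cause area -> distinct donees
for donee, cause, year, amount in donations:
    totals[(cause, year)] += amount
    donees[cause].add(donee)

print(totals[("Forecasting", 2019)])  # 140000.0
print(totals[("AI safety", 2019)])    # 77000.0
print(len(donees["Forecasting"]))     # 2 distinct donees (Foretold, Metaculus)
```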

Graph of spending by cause area and year (incremental, not cumulative)

[Graph omitted.]

Graph of spending by cause area and year (cumulative)

[Graph omitted.]

Donation amounts by subcause area and year

If you hover over a cell for a given subcause area and year, you will get a tooltip with the number of donees and the number of donations.

For the meaning of “classified” and “unclassified”, see the page explaining these terms.

Subcause area | Number of donations | Number of donees | Total | 2019 | 2018
Forecasting | 3 | 2 | 160,000.00 | 140,000.00 | 20,000.00
AI safety/forecasting | 3 | 3 | 77,000.00 | 77,000.00 | 0.00
Classified total | 6 | 5 | 237,000.00 | 217,000.00 | 20,000.00
Unclassified total | 0 | 0 | 0.00 | 0.00 | 0.00
Total | 6 | 5 | 237,000.00 | 217,000.00 | 20,000.00

Graph of spending by subcause area and year (incremental, not cumulative)

[Graph omitted.]

Graph of spending by subcause area and year (cumulative)

[Graph omitted.]

Donation amounts by donee and year

Donee | Cause area | Metadata | Total | 2019 | 2018
Foretold | -- | -- | 90,000.00 | 70,000.00 | 20,000.00
Metaculus | -- | -- | 70,000.00 | 70,000.00 | 0.00
Tegan McCaslin | -- | -- | 30,000.00 | 30,000.00 | 0.00
Jacob Lagerros | -- | -- | 27,000.00 | 27,000.00 | 0.00
Connor Flexman | -- | -- | 20,000.00 | 20,000.00 | 0.00
Total | -- | -- | 237,000.00 | 217,000.00 | 20,000.00

Graph of spending by donee and year (incremental, not cumulative)

[Graph omitted.]

Graph of spending by donee and year (cumulative)

[Graph omitted.]

Donation amounts by influencer and year

If you hover over a cell for a given influencer and year, you will get a tooltip with the number of donees and the number of donations.

For the meaning of “classified” and “unclassified”, see the page explaining these terms.

Influencer | Number of donations | Number of donees | Total | 2019 | 2018
"Oliver Habryka|Alex Zhu|Matt Wage|Helen Toner|Matt Fallshaw" | 5 | 5 | 217,000.00 | 217,000.00 | 0.00
"Alex Zhu|Helen Toner|Matt Fallshaw|Matt Wage|Oliver Habryka" | 1 | 1 | 20,000.00 | 0.00 | 20,000.00
Classified total | 6 | 5 | 237,000.00 | 217,000.00 | 20,000.00
Unclassified total | 0 | 0 | 0.00 | 0.00 | 0.00
Total | 6 | 5 | 237,000.00 | 217,000.00 | 20,000.00
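
The two influencer rows above contain the same five people and differ only in the order of the pipe-delimited string, so they are counted as distinct grouping keys. A minimal sketch of a normalization that would merge them (a hypothetical helper, not the portal's actual code):

```python
def normalize_influencers(raw):
    """Canonicalize a pipe-delimited influencer string by sorting names,
    so that order differences do not produce distinct grouping keys."""
    return "|".join(sorted(name.strip() for name in raw.split("|")))

a = "Oliver Habryka|Alex Zhu|Matt Wage|Helen Toner|Matt Fallshaw"
b = "Alex Zhu|Helen Toner|Matt Fallshaw|Matt Wage|Oliver Habryka"
assert normalize_influencers(a) == normalize_influencers(b)  # same set of five names
```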

Graph of spending by influencer and year (incremental, not cumulative)

[Graph omitted.]

Graph of spending by influencer and year (cumulative)

[Graph omitted.]

Donation amounts by disclosures and year

Sorry, we couldn't find any disclosures information.

Donation amounts by country and year

Sorry, we couldn't find any country information.

Full list of donations in reverse chronological order (6 donations)

Each entry below lists: donee; amount (current USD); amount rank (out of 6 donations); donation date; cause area; URL; influencers; and notes.
Donee: Foretold (Earmark: Ozzie Gooen) | Amount: 70,000.00 | Amount rank: 1 | Donation date: 2019-04-07 | Cause area: Forecasting | URL: https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl | Influencers: Oliver Habryka, Alex Zhu, Matt Wage, Helen Toner, Matt Fallshaw

Donation process: Donee submitted grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted).

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant will be mainly used by Ozzie Gooen to pay programmers to work on Foretold (http://www.foretold.io/), a forecasting application that handles full probability distributions. This includes work on Ken.js, a private version of Wikidata that Gooen has started integrating with Foretold.

Donor reason for selecting the donee: Grant investigator and main influencer Oliver Habryka gives these reasons for the grant, as well as for the other forecasting-related grants made to Anthony Aguirre (Metaculus) and Jacob Lagerros: (1) confusion about what constitutes progress and what problems need solving, (2) the need for many people to collaborate and document their work, (3) low-hanging fruit in designing better online platforms for making intellectual progress (Habryka works on LessWrong 2.0 for that reason, and Gooen has past experience in the space from building Guesstimate), (4) the promise and tractability of forecasting platforms in particular (for instance, work by Philip Tetlock and by Robin Hanson), and (5) even though some platforms, such as PredictionBook and Guesstimate, did not get the traction they expected, others like the Good Judgment Project have been successful, so one should not overgeneralize from a few failures. In addition, Habryka has a positive impression of Gooen from both in-person interaction and online writing.

Donor reason for donating that amount (rather than a bigger or smaller amount): Likely to be the amount requested by the donee in the application (this is not stated explicitly by either the donor or the donee)
Percentage of total donor spend in the corresponding batch of donations: 7.58%

Donor reason for donating at this time (rather than earlier or later): Timing determined partly by timing of grant round. Gooen was a recipient of a previous $20,000 grant from the same fund (the EA Long-Term Future Fund) and found the money very helpful. He applied for more money in this round to scale the project up further

Other notes: The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR). The comments discuss this and the other forecasting grants, and include the question "why are you acting as grant-givers here rather than as special interest investors?" The donation is also included in a list of potentially concerning grants in a portfolio evaluation comment by Evan Gaensbauer: https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions#d4YHzSJnNWmyxf6HM (GW, IR).
Donee: Tegan McCaslin | Amount: 30,000.00 | Amount rank: 3 | Donation date: 2019-04-07 | Cause area: AI safety/forecasting | URL: https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl | Influencers: Oliver Habryka, Alex Zhu, Matt Wage, Helen Toner, Matt Fallshaw

Donation process: Donee submitted grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted).

Intended use of funds (category): Living expenses during research project

Intended use of funds: Grant for independent research projects relevant to AI forecasting and strategy, including (but not necessarily limited to) some of the following: (1) Does the trajectory of AI capability development match that of biological evolution? (2) How tractable is long-term forecasting? (3) How much compute did evolution use to produce intelligence? (4) Benchmarking AI capabilities against insects. A short doc on (1) and (2) is at https://docs.google.com/document/d/1hTLrLXewF-_iJiefyZPF6L677bLrUTo2ziy6BQbxqjs/edit

Donor reason for selecting the donee: Reasons for the grant from Oliver Habryka, the main influencer, include: (1) it is easier to relocate someone who has already demonstrated trust and skills than to find someone completely new, and (2) it is important to give good researchers runway while they find the right place. Habryka notes: "my brief assessment of Tegan’s work was not the reason why I recommended this grant, and if Tegan asks for a new grant in 6 months to focus on solo research, I will want to spend significantly more time reading her output and talking with her, to understand how these questions were chosen and what precise relation they have to forecasting technological progress in AI."

Donor reason for donating that amount (rather than a bigger or smaller amount): Likely to be the amount requested by the donee in the application (this is not stated explicitly by either the donor or the donee). Habryka also mentions that he is interested only in providing limited runway, and would need to assess much more carefully for a more long-term grant
Percentage of total donor spend in the corresponding batch of donations: 3.25%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round. However, it is also related to the grantee's situation (she has just quit her job at AI Impacts, and needs financial runway to continue pursuing promising research projects)
Intended funding timeframe in months: 6

Donor thoughts on making further donations to the donee: The grant investigator Oliver Habryka notes: "if Tegan asks for a new grant in 6 months to focus on solo research, I will want to spend significantly more time reading her output and talking with her, to understand how these questions were chosen and what precise relation they have to forecasting technological progress in AI."

Other notes: The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR). The comments on the post do not discuss this specific grant, but a grant to Lauren Lee that includes somewhat similar reasoning (providing people runway after they leave their jobs, so they can explore better) attracts some criticism.
Donee: Metaculus (Earmark: Anthony Aguirre) | Amount: 70,000.00 | Amount rank: 1 | Donation date: 2019-04-07 | Cause area: Forecasting | URL: https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl | Influencers: Oliver Habryka, Alex Zhu, Matt Wage, Helen Toner, Matt Fallshaw

Donation process: Donee submitted grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted).

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to Anthony Aguirre to expand the Metaculus prediction platform along with its community. Metaculus.com is a fully functional prediction platform with ~10,000 registered users and more than 120,000 predictions made to date on more than 1,000 questions. The two major high-priority expansions are: (1) an integrated set of extensions to improve user interaction and information-sharing, including private messaging and notifications, private groups, a prediction “following” system to create micro-teams within individual questions, and various incentives and systems for information-sharing; (2) linking questions into a network, where users express links between questions, from very simple (“notify me regarding question Y when P(X) changes substantially”) to more complex (“Y happens only if X happens, but not conversely”, etc.). Information can also be gleaned from what users actually do.

Donor reason for selecting the donee: The grant investigator and main influencer, Oliver Habryka, refers to the reasoning given for the grant to Ozzie Gooen for Foretold, made in the same batch of grants and described at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR). He also lists these reasons for liking Metaculus: (1) valuable service in the past few years, (2) cooperation with the X-risk space to get answers to important questions.

Donor reason for donating that amount (rather than a bigger or smaller amount): The grantee requested $150,000, but Oliver Habryka, the grant investigator, was not confident enough in the grant to recommend the full amount. Concerns mentioned: (1) the lack of a dedicated full-time resource, (2) overlap with the Good Judgment Project, which reduces Metaculus's access to resources and people.
Percentage of total donor spend in the corresponding batch of donations: 7.58%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round

Other notes: The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR). The comments discuss this and the other forecasting grants, and include the question "why are you acting as grant-givers here rather than as special interest investors?" The donation is also included in a list of potentially concerning grants in a portfolio evaluation comment by Evan Gaensbauer: https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions#d4YHzSJnNWmyxf6HM (GW, IR).
Donee: Jacob Lagerros | Amount: 27,000.00 | Amount rank: 4 | Donation date: 2019-04-07 | Cause area: AI safety/forecasting | URL: https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl | Influencers: Oliver Habryka, Alex Zhu, Matt Wage, Helen Toner, Matt Fallshaw

Donation process: Donee submitted grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted).

Intended use of funds (category): Living expenses during research project|Direct project expenses

Intended use of funds: Grant to build a private platform where AI safety and policy researchers have direct access to a base of superforecaster-equivalents. Lagerros previously received two grants to work on the project: a half-time salary from Effective Altruism Grants, and a grant for direct project expenses from Berkeley Existential Risk Initiative.

Donor reason for selecting the donee: Grant investigator and main influencer Oliver Habryka notes the same high-level reasons for the grant as for the similar grants to Anthony Aguirre (Metaculus) and Ozzie Gooen (Foretold); the general reasons are explained in the grant writeup for Gooen. Habryka also mentions that Lagerros has been around the community for 3 years, has done useful work, and has received other funding. Habryka mentions he did not assess the grant in detail; the main reason for granting from the Long-Term Future Fund was logistical complications with other grantmakers (FHI and BERI), who had already vouched for the value of the project.

Donor reason for donating that amount (rather than a bigger or smaller amount): Likely to be the amount requested by the donee in the application (this is not stated explicitly by either the donor or the donee)
Percentage of total donor spend in the corresponding batch of donations: 2.92%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round

Other notes: The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR). The comments discuss this and the other forecasting grants, and include the question "why are you acting as grant-givers here rather than as special interest investors?" The donation is also included in a list of potentially concerning grants in a portfolio evaluation comment by Evan Gaensbauer: https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions#d4YHzSJnNWmyxf6HM (GW, IR).
Donee: Connor Flexman | Amount: 20,000.00 | Amount rank: 5 | Donation date: 2019-04-07 | Cause area: AI safety/forecasting | URL: https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl | Influencers: Oliver Habryka, Alex Zhu, Matt Wage, Helen Toner, Matt Fallshaw

Donation process: Donee submitted grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted).

Intended use of funds (category): Living expenses during research project

Intended use of funds: Grant to perform independent research in collaboration with John Salvatier

Donor reason for selecting the donee: The grant was originally requested by John Salvatier (who is already funded by an EA Grant), as a grant to Salvatier to hire Flexman to help him. But Oliver Habryka (the primary person on whose recommendation the grant was made) ultimately decided to give the money to Flexman directly, giving him more flexibility to switch if the work with Salvatier does not go well. Habryka has two reservations: a potential conflict of interest, because he lives in the same house as the recipient, and the lack of concrete, externally verifiable evidence of competence. Despite these reservations, Habryka considers significant negative consequences unlikely, and says: "I assign some significant probability that this grant can help Connor develop into an excellent generalist researcher of a type that I feel like EA is currently quite bottlenecked on."

Donor reason for donating that amount (rather than a bigger or smaller amount): Likely to be the amount requested by the donee in the application (this is not stated explicitly by either the donor or the donee)
Percentage of total donor spend in the corresponding batch of donations: 2.17%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round

Other notes: The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR). Habryka was the primary person on whose recommendation the grant was made. In the comments, Habryka replies to a comment giving ideas on what independent research Flexman might produce if he stops working with Salvatier.
Donee: Foretold (Earmark: Ozzie Gooen) | Amount: 20,000.00 | Amount rank: 5 | Donation date: 2018-11-29 | Cause area: Forecasting | URL: https://app.effectivealtruism.org/funds/far-future/payouts/3JnNTzhJQsu4yQAYcKceSi | Influencers: Alex Zhu, Helen Toner, Matt Fallshaw, Matt Wage, Oliver Habryka

Donation process: Donee submitted grant application through the application form for the November 2018 round of grants from the Long-Term Future Fund, and was selected as a grant recipient.

Intended use of funds (category): Organizational general support

Intended use of funds: Ozzie Gooen plans to build an online community of EA forecasters, researchers, and data scientists to predict variables of interest to the EA community. Gooen proposed using the platform to answer a range of questions, including examples like “How many Google searches will there be for reinforcement learning in 2020?” or “How many plan changes will 80,000 Hours cause in 2020?”, and using the results to help EA organizations and individuals prioritize. The grant funds the project's basic setup and initial testing. The community and tool were later created under the name Foretold, available at https://www.foretold.io/

Donor reason for selecting the donee: The grant decision was made based on past success by Ozzie Gooen with Guesstimate https://www.getguesstimate.com/ as well as belief both in the broad value of the project and the specifics of the project plan.

Donor reason for donating that amount (rather than a bigger or smaller amount): Amount likely determined by the specifics of the project plan and the scope of this round of funding, namely, the project's basic setup and initial testing.
Percentage of total donor spend in the corresponding batch of donations: 20.94%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round, and also by the donee's desire to start the project

Donor retrospective of the donation: The Long-Term Future Fund would make a followup grant of $70,000 to Foretold in the April 2019 grant round (https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl); see also https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR) for more detail.
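
As a consistency check, the "Percentage of total donor spend in the corresponding batch of donations" figures in the entries above let one back out the approximate size of each grant round. The batch totals below are inferred from this page's numbers, not stated on it:

```python
# Each (amount, stated share of batch) pair implies a batch total of
# amount / share. The four April 2019 grants agree to within rounding
# of the stated percentages.
april_2019 = [(70000, 0.0758), (30000, 0.0325), (27000, 0.0292), (20000, 0.0217)]
for amount, share in april_2019:
    print(round(amount / share))  # each roughly 921,000-925,000 USD

# The November 2018 Foretold grant was 20.94% of its batch.
print(round(20000 / 0.2094))      # roughly 95,500 USD
```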

Similarity to other donors

Sorry, we couldn't find any similar donors.