Effective Altruism Funds: Long-Term Future Fund donations made (filtered to cause areas matching Rationality improvement)

This is an online portal with information on donations that were announced publicly (or shared with permission) and that are of interest to Vipul Naik. The git repository with the code for this portal, as well as all the underlying data, is available on GitHub. All payment amounts are in current United States dollars (USD). The repository of donations is being seeded with an initial collation by Issa Rice, along with continued contributions from him (see his commits and the contract work page listing all financially compensated contributions to the site), but all responsibility for errors and inaccuracies belongs to Vipul Naik. Current data is preliminary and has not been completely vetted and normalized; if sharing a link to this site or any page on this site, please include the caveat that the data is preliminary (if you want to share without including caveats, please check with Vipul Naik). We expect to have completed the first round of development by the end of March 2022. See the about page for more details. Also of interest: pageview data on analytics.vipulnaik.com, the tutorial in the README, and the request for feedback on the EA Forum.

Table of contents

Basic donor information

We do not have any donor information for Effective Altruism Funds: Long-Term Future Fund in our system.

This entity is also a donee.

Donor donation statistics

Cause area | Count | Median | Mean | Minimum | 10th percentile | 20th percentile | 30th percentile | 40th percentile | 50th percentile | 60th percentile | 70th percentile | 80th percentile | 90th percentile | Maximum
Overall | 5 | 30,000 | 78,404 | 10,000 | 10,000 | 10,000 | 28,000 | 28,000 | 30,000 | 30,000 | 150,000 | 150,000 | 174,021 | 174,021
Rationality improvement | 5 | 30,000 | 78,404 | 10,000 | 10,000 | 10,000 | 28,000 | 28,000 | 30,000 | 30,000 | 150,000 | 150,000 | 174,021 | 174,021
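
The percentile columns above are consistent with a nearest-rank convention: the p-th percentile is the smallest donation amount such that at least p% of the five donations are at or below it. This is an inference from the table values rather than something the portal documents. The following Python sketch, under that assumption, reproduces every statistic in the table from the five underlying donation amounts:

```python
import math

# The five donation amounts in this filtered view (current USD), sorted ascending.
amounts = sorted([10_000, 28_000, 30_000, 150_000, 174_021])

def nearest_rank_percentile(sorted_values, p):
    """Smallest value with at least p% of the data at or below it (nearest-rank method)."""
    rank = math.ceil(p / 100 * len(sorted_values))  # 1-based rank
    return sorted_values[max(rank, 1) - 1]

print("count:", len(amounts))                           # 5
print("mean:", round(sum(amounts) / len(amounts)))      # 78404
print("median:", nearest_rank_percentile(amounts, 50))  # 30000
print("min:", amounts[0], "max:", amounts[-1])          # 10000 174021
for p in range(10, 100, 10):
    # Prints 10000, 10000, 28000, 28000, 30000, 30000, 150000, 150000, 174021
    print(f"{p}th percentile:", nearest_rank_percentile(amounts, p))
```

The Overall and Rationality improvement rows are identical because all five donations in this filtered view fall under the single cause area Rationality improvement.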

Donation amounts by cause area and year

If you hover over a cell for a given cause area and year, you will get a tooltip with the number of donees and the number of donations.

Note: The cause area classification used here may not match the donor's own classification in all cases.

Cause area | Number of donations | Number of donees | Total | 2019 | 2018
Rationality improvement | 5 | 4 | 392,021.00 | 218,000.00 | 174,021.00
Total | 5 | 4 | 392,021.00 | 218,000.00 | 174,021.00

Graph of spending by cause area and year (incremental, not cumulative)


Graph of spending by cause area and year (cumulative)


Donation amounts by subcause area and year

If you hover over a cell for a given subcause area and year, you will get a tooltip with the number of donees and the number of donations.

For the meaning of “classified” and “unclassified”, see the page clarifying this.

Subcause area | Number of donations | Number of donees | Total | 2019 | 2018
Rationality improvement | 5 | 4 | 392,021.00 | 218,000.00 | 174,021.00
Classified total | 5 | 4 | 392,021.00 | 218,000.00 | 174,021.00
Unclassified total | 0 | 0 | 0.00 | 0.00 | 0.00
Total | 5 | 4 | 392,021.00 | 218,000.00 | 174,021.00

Graph of spending by subcause area and year (incremental, not cumulative)


Graph of spending by subcause area and year (cumulative)


Donation amounts by donee and year

Donee | Cause area | Metadata | Total | 2019 | 2018
Center for Applied Rationality | Rationality | FB Tw WP Site TW | 324,021.00 | 150,000.00 | 174,021.00
Eli Tyre | -- | -- | 30,000.00 | 30,000.00 | 0.00
Effective Altruism Russia | -- | -- | 28,000.00 | 28,000.00 | 0.00
Roam Research | -- | -- | 10,000.00 | 10,000.00 | 0.00
Total | -- | -- | 392,021.00 | 218,000.00 | 174,021.00

Graph of spending by donee and year (incremental, not cumulative)


Graph of spending by donee and year (cumulative)


Donation amounts by influencer and year

If you hover over a cell for a given influencer and year, you will get a tooltip with the number of donees and the number of donations.

For the meaning of “classified” and “unclassified”, see the page clarifying this.

Influencer | Number of donations | Number of donees | Total | 2019 | 2018
Oliver Habryka, Alex Zhu, Matt Wage, Helen Toner, Matt Fallshaw | 3 | 3 | 208,000.00 | 208,000.00 | 0.00
Nick Beckstead | 1 | 1 | 174,021.00 | 0.00 | 174,021.00
Alex Zhu, Helen Toner, Matt Wage, Oliver Habryka | 1 | 1 | 10,000.00 | 10,000.00 | 0.00
Classified total | 5 | 4 | 392,021.00 | 218,000.00 | 174,021.00
Unclassified total | 0 | 0 | 0.00 | 0.00 | 0.00
Total | 5 | 4 | 392,021.00 | 218,000.00 | 174,021.00

Graph of spending by influencer and year (incremental, not cumulative)


Graph of spending by influencer and year (cumulative)


Donation amounts by disclosures and year

Sorry, we couldn't find any disclosures information.

Donation amounts by country and year

Sorry, we couldn't find any country information.

Full list of donations in reverse chronological order (5 donations)

Each entry below lists the donee, amount (current USD), amount rank (out of 5), donation date, cause area, URL, influencer, and notes.
Donee: Roam Research
Amount (current USD): 10,000.00
Amount rank (out of 5): 5
Donation date: 2019-08-30
Cause area: Rationality improvement
URL: https://app.effectivealtruism.org/funds/far-future/payouts/4UBI3Q0TBGbWcIZWCh4EQV
Influencer: Alex Zhu, Helen Toner, Matt Wage, Oliver Habryka

Donation process: Grantee applied through the online application process, and was selected based on review by the fund managers. Alex Zhu was the fund manager most excited about the grant, and was responsible for the public write-up.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to support the continued development of Roam, a web application from Conor White-Sullivan that fills a niche similar to Workflowy. Roam automates the Zettelkasten method, "a note-taking / document-drafting process based on physical index cards." The grant write-up says: "This funding will support Roam’s general operating costs, including expenses for Conor, one employee, and several contractors."

Donor reason for selecting the donee: Fund manager Alex Zhu writes: "On my inside view, if Roam succeeds, an experienced user of the note-taking app Workflowy will get at least as much value switching to Roam as they got from using Workflowy in the first place. (Many EAs, myself included, see Workflowy as an integral part of our intellectual process, and I think Roam might become even more integral than Workflowy.)" He links to Sarah Constantin's posts on Roam: https://www.facebook.com/sarah.constantin.543/posts/242611079943317 and https://srconstantin.posthaven.com/how-to-make-a-memex
Percentage of total donor spend in the corresponding batch of donations: 100.00%
Donee: Center for Applied Rationality
Amount (current USD): 150,000.00
Amount rank (out of 5): 2
Donation date: 2019-04-07
Cause area: Rationality improvement
URL: https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl
Influencer: Oliver Habryka, Alex Zhu, Matt Wage, Helen Toner, Matt Fallshaw

Donation process: Donee submitted a grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted).

Intended use of funds (category): Organizational general support

Intended use of funds: The grant is to help the Center for Applied Rationality (CFAR) survive as an organization for the next few months (i.e., until the next grant round, about 3 months later) without having to scale down operations. CFAR is low on funds because it did not run a 2018 fundraiser: it felt that running a fundraiser would be in bad taste after what it considered a mess-up on its part in the Brent Dill situation.

Donor reason for selecting the donee: Grant investigator and main influencer Oliver Habryka thinks CFAR intro workshops have had positive impact in 3 ways: (1) establishing epistemic norms, (2) training, and (3) recruitment into the X-risk network (especially AI safety). He also thinks CFAR faces many challenges, including the departure of many key employees, the difficulty of attracting top talent, and a dilution of its truth-seeking focus. However, he is enthusiastic about joint CFAR/MIRI workshops for programmers, where CFAR provides instructors. His final reason for the grant is to prevent CFAR from having to scale down operations due to the funding shortfall caused by not running the 2018 fundraiser.

Donor reason for donating that amount (rather than a bigger or smaller amount): The grant amount, the largest in this grant round from the EA Long-Term Future Fund, is chosen to be sufficient for CFAR to continue operating as usual until the next grant round (in about 3 months). Habryka elaborates further in https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-recommendations#uhH4ioNbdaFrwGt4e (GW, IR), in reply to Milan Griffes, explaining why the grant is large and unrestricted.
Percentage of total donor spend in the corresponding batch of donations: 16.25%

Donor reason for donating at this time (rather than earlier or later): Timing determined by the timing of the grant round, as well as by CFAR's time-sensitive financial situation; the grant round comes a few months after the end of 2018, so the shortfall from not conducting the 2018 fundraiser is starting to hit CFAR's finances.
Intended funding timeframe in months: 3

Donor thoughts on making further donations to the donee: Grant investigator and main influencer Oliver Habryka writes: "I didn’t have enough time this grant round to understand how the future of CFAR will play out; the current grant amount seems sufficient to ensure that CFAR does not have to take any drastic action until our next grant round. By the next grant round, I plan to have spent more time learning and thinking about CFAR’s trajectory and future, and to have a more confident opinion about what the correct funding level for CFAR is."

Other notes: The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR). In the comments, Milan Griffes asks why such a large, unrestricted grant is being made to CFAR despite these concerns, and also what Habryka hopes to learn about CFAR before the next grant round. There are replies from Peter McCluskey and Habryka, with some further back-and-forth.
Donee: Effective Altruism Russia (Earmark: Mikhail Yugadin)
Amount (current USD): 28,000.00
Amount rank (out of 5): 4
Donation date: 2019-04-07
Cause area: Rationality improvement
URL: https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl
Influencer: Oliver Habryka, Alex Zhu, Matt Wage, Helen Toner, Matt Fallshaw

Donation process: Donee submitted a grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted).

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to Mikhail Yugadin for Effective Altruism Russia to give copies of Harry Potter and the Methods of Rationality to the winners of EGMO 2019 and IMO 2020.

Donor reason for selecting the donee: In the grant write-up, Oliver Habryka explains his evaluation of the grant as based on three questions: (1) What effects does reading HPMOR have on people? (2) How good of a target group are Math Olympiad winners for these effects? (3) Is the team competent enough to execute on their plan?

Donor reason for donating that amount (rather than a bigger or smaller amount): Likely the amount requested by the donee in the application (this is not stated explicitly by either the donor or the donee). The comments include more discussion of the unit economics of the grant, and of whether the effective cost of $43 per copy is reasonable.
Percentage of total donor spend in the corresponding batch of donations: 3.03%
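
(For scale: at the stated effective cost of $43 per copy, the $28,000 grant corresponds to roughly 28,000 / 43 ≈ 650 copies. This is a back-of-the-envelope inference from the figures above, not a number stated by the donor.)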

Donor reason for donating at this time (rather than earlier or later): Timing determined by the timing of the grant round; no specific timing-related considerations are discussed. The need to secure money in advance of the events for which it will be used likely affected the timing of the application.

Other notes: The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR). There is a lot of criticism and discussion of the grant in the comments.
Donee: Eli Tyre
Amount (current USD): 30,000.00
Amount rank (out of 5): 3
Donation date: 2019-04-07
Cause area: Rationality improvement
URL: https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl
Influencer: Oliver Habryka, Alex Zhu, Matt Wage, Helen Toner, Matt Fallshaw

Donation process: Donee submitted a grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted).

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to support projects on rationality and community-building interventions. Example projects: facilitating conversations between top people in AI alignment, organizing advanced workshops on double crux, doing independent research projects such as https://www.lesswrong.com/posts/tj8QP2EFdP8p54z6i/historical-mathematicians-exhibit-a-birth-order-effect-too (GW, IR) (evaluating birth order effects in mathematicians), providing new EAs and rationalists with advice and guidance on how to get traction on working on important problems, and helping John Salvatier develop techniques around skill transfer. Grant investigator and main influencer Oliver Habryka writes: "the goal of this grant is to allow [Eli Tyre] to take actions with greater leverage by hiring contractors, paying other community members for services, and paying for other varied expenses associated with his projects."

Donor reason for selecting the donee: Grant investigator and main influencer Oliver Habryka is excited about the projects Tyre is interested in working on, and writes: "Eli has worked on a large variety of interesting and valuable projects over the last few years, many of them too small to have much payment infrastructure, resulting in him doing a lot of work without appropriate compensation. I think his work has been a prime example of picking low-hanging fruit by using local information and solving problems that aren’t worth solving at scale, and I want him to have resources to continue working in this space."

Donor reason for donating that amount (rather than a bigger or smaller amount): Likely the amount requested by the donee in the application (this is not stated explicitly by either the donor or the donee).
Percentage of total donor spend in the corresponding batch of donations: 3.25%

Donor reason for donating at this time (rather than earlier or later): Timing determined by the timing of the grant round.

Other notes: The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decision (GW, IR).
Donee: Center for Applied Rationality
Amount (current USD): 174,021.00
Amount rank (out of 5): 1
Donation date: 2018-08-14
Cause area: Rationality improvement
URL: https://app.effectivealtruism.org/funds/far-future/payouts/6g4f7iae5Ok6K6YOaAiyK0
Influencer: Nick Beckstead

Donation process: The grant from the EA Long-Term Future Fund is part of a final set of grant decisions made by Nick Beckstead (granting $526,000 from the EA Meta Fund and $917,000 from the EA Long-Term Future Fund) as he transitions out of managing both funds. Due to time constraints, Beckstead relies primarily on the investigation of the organization that the Open Philanthropy Project did for its 2018 grant: https://www.openphilanthropy.org/giving/grants/center-applied-rationality-general-support-2018

Intended use of funds (category): Organizational general support

Intended use of funds: Beckstead writes "I recommended these grants with the suggestion that these grantees look for ways to use funding to trade money for saving the time or increasing the productivity of their employees (e.g. subsidizing electronics upgrades or childcare), due to a sense that (i) their work is otherwise much less funding constrained than it used to be, and (ii) spending like this would better reflect the value of staff time and increase staff satisfaction. However, I also told them that I was open to them using these funds to accomplish this objective indirectly (e.g. through salary increases) or using the funds for another purpose if that seemed better to them."

Donor reason for selecting the donee: The grant page references https://www.openphilanthropy.org/giving/grants/center-applied-rationality-general-support-2018 for Beckstead's opinion of the donee.

Donor reason for donating that amount (rather than a bigger or smaller amount): The grant page says "The amounts I’m granting out to different organizations are roughly proportional to the number of staff they have, with some skew towards MIRI that reflects greater EA Funds donor interest in the Long-Term Future Fund." Also: "I think a number of these organizations could qualify for the criteria of either the Long-Term Future Fund or the EA Community Fund because of their dual focus on EA and longtermism, which is part of the reason that 80,000 Hours is receiving a grant from each fund."
Percentage of total donor spend in the corresponding batch of donations: 18.98%
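
(As a cross-check on the percentage above: the donation process note says Beckstead granted $917,000 in total from the EA Long-Term Future Fund in this batch, and 174,021 / 917,000 ≈ 18.98%, matching the stated figure.)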

Donor reason for donating at this time (rather than earlier or later): Timing determined by the timing of this round of grants, which is in turn determined by the need for Beckstead to grant out the money before handing over management of the fund.

Donor retrospective of the donation: Even after fund management moved to a new team, the EA Long-Term Future Fund continued to make grants to CFAR.

Similarity to other donors

Sorry, we couldn't find any similar donors.