This is an online portal with information on donations of interest to Vipul Naik that were announced publicly (or shared with permission). The git repository with the code for this portal, as well as all the underlying data, is available on GitHub. All payment amounts are in current United States dollars (USD). The repository of donations is being seeded with an initial collation by Issa Rice, along with his continued contributions (see his commits and the contract work page listing all financially compensated contributions to the site), but all responsibility for errors and inaccuracies belongs to Vipul Naik. The current data is preliminary and has not been completely vetted and normalized; if sharing a link to this site or any page on it, please include the caveat that the data is preliminary (to share without caveats, please check with Vipul Naik). We expect to complete the first round of development by the end of March 2022. See the about page for more details. Also of interest: pageview data on analytics.vipulnaik.com, the tutorial in the README, and the request for feedback on the EA Forum.
We do not have any donor information for the donor Effective Altruism Funds: Long-Term Future Fund in our system.
This entity is also a donee.
Full donor page for donor Effective Altruism Funds: Long-Term Future Fund
| Donors list page | http://rationality.org/about/top-donors |
| Transparency and financials page | http://rationality.org/about/official-records |
| Donation case page | http://lesswrong.com/lw/n39/why_cfar_the_view_from_2015/ |
| Open Philanthropy Project grant review | http://www.openphilanthropy.org/giving/grants/center-applied-rationality-general-support |
| Timelines wiki page | https://timelines.issarice.com/wiki/Timeline_of_Center_for_Applied_Rationality |
| Org Watch page | https://orgwatch.issarice.com/?organization=Center+for+Applied+Rationality |
| Key people | Julia Galef, Anna Salamon |
Full donee page for donee Center for Applied Rationality
Note: The cause area classification used here may not match the donor's own classification in all cases.
| Cause area | Number of donations | Total | 2019 | 2018 |
| Rationality improvement | 2 | 324,021.00 | 150,000.00 | 174,021.00 |
Graph of spending by cause area and year (incremental, not cumulative)
Graph of spending by cause area and year (cumulative)
| Amount (current USD) | Amount rank (out of 2) | Donation date | Cause area | URL | Influencer | Notes |
| 150,000.00 | 2 | Rationality improvement | https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl | Oliver Habryka, Alex Zhu, Matt Wage, Helen Toner, Matt Fallshaw | Donation process: Donee submitted a grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund and was selected as a grant recipient (23 out of almost 100 applications were accepted).
Intended use of funds (category): Organizational general support
Intended use of funds: The grant is to help the Center for Applied Rationality (CFAR) survive as an organization for the next few months (i.e., until the next grant round, 3 months later) without having to scale down operations. CFAR is low on funds because it did not run a 2018 fundraiser; it felt that running a fundraiser would be in bad taste after what it considered a misstep on its part in the Brent Dill situation.
Donor reason for selecting the donee: Grant investigator and main influencer Oliver Habryka thinks CFAR intro workshops have had positive impact in 3 ways: (1) establishing epistemic norms, (2) training, and (3) recruitment into the X-risk network (especially AI safety). He also thinks CFAR faces many challenges, including the departure of many key employees, difficulty attracting top talent, and a dilution of its truth-seeking focus. However, he is enthusiastic about joint CFAR/MIRI workshops for programmers, for which CFAR provides instructors. His final reason for donating is to keep CFAR from having to scale down due to the funding shortfall caused by not running the 2018 fundraiser.
Donor reason for donating that amount (rather than a bigger or smaller amount): The grant amount, the largest in this grant round from the EA Long-Term Future Fund, is chosen to be sufficient for CFAR to continue operating as usual until the next grant round (in about 3 months). Habryka elaborates further at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-recommendations#uhH4ioNbdaFrwGt4e in reply to Milan Griffes, explaining why the grant is large and unrestricted.
Percentage of total donor spend in the corresponding batch of donations: 16.25%
Donor reason for donating at this time (rather than earlier or later): Timing determined by the timing of the grant round, as well as by CFAR's time-sensitive financial situation; the grant round falls a few months after the end of 2018, so the shortfall from not conducting the 2018 fundraiser is starting to affect CFAR's finances.
Intended funding timeframe in months: 3
Donor thoughts on making further donations to the donee: Grant investigator and main influencer Oliver Habryka writes: "I didn’t have enough time this grant round to understand how the future of CFAR will play out; the current grant amount seems sufficient to ensure that CFAR does not have to take any drastic action until our next grant round. By the next grant round, I plan to have spent more time learning and thinking about CFAR’s trajectory and future, and to have a more confident opinion about what the correct funding level for CFAR is."
Other notes: The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions. In the comments, Milan Griffes asks why such a large, unrestricted grant is being made to CFAR despite these concerns, and what Habryka hopes to learn about CFAR before the next grant round. Peter McCluskey and Habryka reply, with some further back-and-forth in the comments.
| 174,021.00 | 1 | Rationality improvement | https://app.effectivealtruism.org/funds/far-future/payouts/6g4f7iae5Ok6K6YOaAiyK0 | Nick Beckstead | Donation process: The grant from the EA Long-Term Future Fund is part of a final set of grant decisions made by Nick Beckstead (granting $526,000 from the EA Meta Fund and $917,000 from the EA Long-Term Future Fund) as he transitions out of managing both funds. Due to time constraints, Beckstead relies primarily on the investigation of the organization done by the Open Philanthropy Project for its 2018 grant https://www.openphilanthropy.org/giving/grants/center-applied-rationality-general-support-2018
Intended use of funds (category): Organizational general support
Intended use of funds: Beckstead writes "I recommended these grants with the suggestion that these grantees look for ways to use funding to trade money for saving the time or increasing the productivity of their employees (e.g. subsidizing electronics upgrades or childcare), due to a sense that (i) their work is otherwise much less funding constrained than it used to be, and (ii) spending like this would better reflect the value of staff time and increase staff satisfaction. However, I also told them that I was open to them using these funds to accomplish this objective indirectly (e.g. through salary increases) or using the funds for another purpose if that seemed better to them."
Donor reason for selecting the donee: The grant page references https://www.openphilanthropy.org/giving/grants/center-applied-rationality-general-support-2018 for Beckstead's opinion of the donee.
Donor reason for donating that amount (rather than a bigger or smaller amount): The grant page says "The amounts I’m granting out to different organizations are roughly proportional to the number of staff they have, with some skew towards MIRI that reflects greater EA Funds donor interest in the Long-Term Future Fund." Also: "I think a number of these organizations could qualify for the criteria of either the Long-Term Future Fund or the EA Community Fund because of their dual focus on EA and longtermism, which is part of the reason that 80,000 Hours is receiving a grant from each fund."
Percentage of total donor spend in the corresponding batch of donations: 18.98%
Donor reason for donating at this time (rather than earlier or later): Timing determined by the timing of this round of grants, which is in turn determined by Beckstead's need to grant out the money before handing over management of the fund.
Donor retrospective of the donation: Even after fund management moved to a new team, the EA Long-Term Future Fund continued making grants to CFAR.
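As a quick sanity check on the arithmetic stated above (a sketch only; all figures are taken from this page, and the dictionary labels are illustrative, not identifiers from the underlying data files):

```python
# Figures copied from the records on this page.
grants = {
    "April 2019 round (Habryka et al.)": 150_000.00,
    "Beckstead final batch": 174_021.00,
}

# Summary table: total across both donations should match the "Total" column.
total = sum(grants.values())
assert total == 324_021.00

# Second record: $174,021 out of the $917,000 Beckstead granted from the
# Long-Term Future Fund in his final batch, stated as 18.98%.
share_pct = round(grants["Beckstead final batch"] / 917_000 * 100, 2)
assert share_pct == 18.98
```

The analogous check for the first record's 16.25% figure is not possible from this page alone, since the total size of the April 2019 grant batch is not stated here.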