Nick Beckstead money moved

This is an online portal with information on donations that were announced publicly (or shared with permission) and that are of interest to Vipul Naik. The git repository with the code for this portal, as well as all the underlying data, is available on GitHub. All payment amounts are in current United States dollars (USD).

The repository of donations is being seeded with an initial collation by Issa Rice, along with continued contributions from him (see his commits and the contract work page listing all financially compensated contributions to the site), but all responsibility for errors and inaccuracies belongs to Vipul Naik. Current data is preliminary and has not been completely vetted and normalized; if sharing a link to this site or any page on it, please include the caveat that the data is preliminary (if you want to share without caveats, please check with Vipul Naik). We expect to complete the first round of development by the end of July 2024. See the about page for more details.

Also of interest: pageview data on analytics.vipulnaik.com, the tutorial in the README, and the request for feedback on the EA Forum.


This entity is also a donor.

Full list of documents in reverse chronological order (3 documents)

Each document entry lists: title (URL linked), publication date, author, publisher, affected donors, affected donees, affected influencers, document scope, cause area, and notes.
Fireside chat | Nick Beckstead | EA Global: SF 22 (2022-08-22). Author: Nick Beckstead. Publisher: EA Global. Affected donors: FTX Future Fund. Document scope: Broad donor strategy. Notes: In this fireside chat, Nick Beckstead talks about the work of the FTX Future Fund. He reiterates points made in various blog posts by FTX earlier in the year. He talks about how FTX has tried to scale up giving rapidly, and says he is happy with how much FTX has been able to do relative to its team size, but that money moved per employee is not the metric to optimize for; what the money achieves is what is important. He talks about the regranting program and the open call for applications. Later in the talk, he reminisces about his Ph.D. thesis and how his thinking has evolved since then.
Some clarifications on the Future Fund's approach to grantmaking (GW, IR) (2022-05-09). Author: Nick Beckstead. Publisher: Effective Altruism Forum. Affected donors: FTX Future Fund. Document scope: Broad donor strategy. Cause area: Longtermism. Notes: This blog post is written partly in response to concerns raised in https://forum.effectivealtruism.org/posts/HWaH8tNdsgEwNZu8B/free-spending-ea-might-be-a-big-problem-for-optics-and (GW, IR). The post clarifies the processes and safeguards the FTX Future Fund has in place for reviewing grants, and explains how FTX is able to manage a large grant volume with a small team: mainly by relying on regranting. The post also clarifies that FTX has not granted much for community building, so some of the concerns specific to community-building grants do not apply quite yet.
Update on Investigating Neglected Goals in Biological Research (2017-11-30). Author: Nick Beckstead. Publisher: Open Philanthropy. Affected donors: Open Philanthropy, Good Ventures/not recommended by GiveWell or Open Philanthropy Project. Affected donees: Target Malaria. Document scope: Broad donor strategy. Cause areas: Scientific research, Global health, Biosecurity and pandemic preparedness, Agriculture. Notes: The blog post describes how the Open Philanthropy Project identifies neglected goals in biological research. Previously, the hope was to investigate sub-areas deeply and produce write-ups. Now the approach is more "opportunistic": rather than doing public write-ups, staff look out for good opportunities for shovel-ready or highly promising grants in the specific topics identified as having strong potential.

Full list of donations in reverse chronological order (59 donations)

Each donation entry lists: donor, donee, amount (current USD), donation date, cause area, URL, and notes.
Donor: Open Philanthropy. Donee: Redwood Research. Amount: 9,420,000.00 (current USD). Donation date: 2021-11. Cause area: AI safety/technical research. URL: https://www.openphilanthropy.org/grants/redwood-research-general-support/

Intended use of funds (category): Organizational general support

Intended use of funds: Grant "for general support. Redwood Research is a new research institution that conducts research to better understand and make progress on AI alignment in order to improve the long-run future."

Donor retrospective of the donation: The followup grant https://www.openphilanthropy.org/grants/redwood-research-general-support-2/ of a comparable amount ($10.7 million) suggests continued satisfaction with the grantee.

Other notes: This is a total across four grants.
Donor: Open Philanthropy. Donee: University of Southern California. Amount: 320,000.00 (current USD). Donation date: 2021-08. Cause area: AI safety/technical research. URL: https://www.openphilanthropy.org/grants/university-of-southern-california-adversarial-robustness-research/

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant "to support early-career research by Robin Jia on adversarial robustness and out-of-distribution generalization as a means to improve AI safety."

Donor reason for donating that amount (rather than a bigger or smaller amount): No explicit reasons for the amount are given, but the amount is similar to the amounts for other grants from Open Philanthropy to early-stage researchers in adversarial robustness research. This includes the two other grants https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/stanford-adversarial-robustness-research-tsipras and https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/stanford-adversarial-robustness-research-santurkar made around the same time, as well as grants earlier in the year to researchers at Carnegie Mellon University, University of Tübingen, and UC Berkeley.

Donor reason for donating at this time (rather than earlier or later): At around the same time as this grant, Open Philanthropy made two other grants https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/stanford-adversarial-robustness-research-tsipras and https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/stanford-adversarial-robustness-research-santurkar to early-stage researchers in adversarial robustness research.
Intended funding timeframe in months: 36
Donor: Open Philanthropy. Donee: Daniel Dewey. Amount: 175,000.00 (current USD). Donation date: 2021-05. Cause area: AI safety. URL: https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/daniel-dewey-ai-alignment-project

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to support "work on an AI alignment project and related field-building efforts. Daniel plans to use this funding to produce writing and reports summarizing existing research and investigating potentially valuable projects relevant to AI alignment, with the goal of helping junior researchers and others understand how they can contribute to the field."

Donor retrospective of the donation: The followup grant https://www.openphilanthropy.org/grants/daniel-dewey-ai-alignment-projects-2022/ suggests continued satisfaction with the grant outcome.
Donor: Open Philanthropy. Donee: Brian Christian. Amount: 66,000.00 (current USD). Donation date: 2021-03. Cause area: AI safety/movement growth. URL: https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/brian-christian-alignment-book-promotion

Intended use of funds (category): Direct project expenses

Intended use of funds: Contractor agreement "with Brian Christian to support the promotion of his book The Alignment Problem: Machine Learning and Human Values."

Donor reason for selecting the donee: The grant page says: "Our potential risks from advanced artificial intelligence team hopes that the book will generate interest in AI alignment among academics and others."
Donor: Open Philanthropy. Donee: University of Tübingen. Amount: 590,000.00 (current USD). Donation date: 2021-02. Cause area: AI safety/technical research. URL: https://www.openphilanthropy.org/grants/university-of-tubingen-robustness-research-wieland-brendel/

Intended use of funds (category): Direct project expenses

Intended use of funds: The grant page says the grant is "to support early-career research by Wieland Brendel on robustness as a means to improve AI safety."

Donor reason for selecting the donee: Open Phil made five grants https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/university-of-tuebingen-adversarial-robustness-hein https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-santa-cruz-xie-adversarial-robustness https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-berkeley-adversarial-robustness-wagner https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-berkeley-adversarial-robustness-song https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/mit-adversarial-robustness-research for "adversarial robustness research" in January and February 2021, around the time of this grant. It looks like the donor became interested in funding this research topic at this time.

Donor reason for donating at this time (rather than earlier or later): Open Phil made five grants https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/university-of-tuebingen-adversarial-robustness-hein https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-santa-cruz-xie-adversarial-robustness https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-berkeley-adversarial-robustness-wagner https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-berkeley-adversarial-robustness-song https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/mit-adversarial-robustness-research for "adversarial robustness research" in January and February 2021, around the time of this grant. It looks like the donor became interested in funding this research topic at this time.
Intended funding timeframe in months: 36
Donor: Open Philanthropy. Donee: University of Tübingen. Amount: 300,000.00 (current USD). Donation date: 2021-02. Cause area: AI safety/technical research. URL: https://www.openphilanthropy.org/grants/university-of-tubingen-adversarial-robustness-research-matthias-hein/

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant "to support research by Professor Matthias Hein on adversarial robustness as a means to improve AI safety."

Donor reason for selecting the donee: This is one of five grants made by the donor for "adversarial robustness research" in January and February 2021, all with the same grant investigators (Catherine Olsson and Daniel Dewey) except the Santa Cruz grant that had Olsson and Nick Beckstead. https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-santa-cruz-xie-adversarial-robustness https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/mit-adversarial-robustness-research https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-berkeley-adversarial-robustness-wagner and https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-berkeley-adversarial-robustness-song are the four other grants. It looks like the donor became interested in funding this research topic at this time.

Donor reason for donating that amount (rather than a bigger or smaller amount): No explicit reasons for the amount are given, but the amount is similar to the amounts for other grants from Open Philanthropy to early-stage researchers in adversarial robustness research. This includes three other grants https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-santa-cruz-xie-adversarial-robustness https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-berkeley-adversarial-robustness-wagner and https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-berkeley-adversarial-robustness-song made at the same time as well as grants later in the year to early-stage researchers at Carnegie Mellon University, Stanford University, and University of Southern California.

Donor reason for donating at this time (rather than earlier or later): This is one of five grants made by the donor for "adversarial robustness research" in January and February 2021, all with the same grant investigators (Catherine Olsson and Daniel Dewey) except the Santa Cruz grant that had Olsson and Nick Beckstead. https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-santa-cruz-xie-adversarial-robustness https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/mit-adversarial-robustness-research https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-berkeley-adversarial-robustness-wagner and https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-berkeley-adversarial-robustness-song are the four other grants. It looks like the donor became interested in funding this research topic at this time.
Intended funding timeframe in months: 36
Donor: Open Philanthropy. Donee: Center for Human-Compatible AI. Amount: 11,355,246.00 (current USD). Donation date: 2021-01. Cause area: AI safety/technical research. URL: https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-berkeley-center-human-compatible-ai-2021

Intended use of funds (category): Organizational general support

Intended use of funds: The grant page says "The multi-year commitment and increased funding will enable CHAI to expand its research and student training related to potential risks from advanced artificial intelligence."

Other notes: This is a renewal of the original founding grant https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-berkeley-center-human-compatible-ai made in August 2016. Intended funding timeframe in months: 60.
Donor: Open Philanthropy. Donee: University of California, Santa Cruz. Amount: 265,000.00 (current USD). Donation date: 2021-01. Cause area: AI safety/technical research. URL: https://www.openphilanthropy.org/grants/uc-santa-cruz-adversarial-robustness-research/

Intended use of funds (category): Direct project expenses

Intended use of funds: The grant page says the grant is "to support early-career research by Cihang Xie on adversarial robustness as a means to improve AI safety."

Donor reason for selecting the donee: This is one of five grants made by the donor for "adversarial robustness research" in January and February 2021, all with the same grant investigators (Catherine Olsson and Daniel Dewey) except the Santa Cruz grant that had Olsson and Nick Beckstead. https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/university-of-tuebingen-adversarial-robustness-hein https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/mit-adversarial-robustness-research https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-berkeley-adversarial-robustness-wagner and https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-berkeley-adversarial-robustness-song are the four other grants. It looks like the donor became interested in funding this research topic at this time.

Donor reason for donating that amount (rather than a bigger or smaller amount): No explicit reasons for the amount are given, but the amount is similar to the amounts for other grants from Open Philanthropy to early-stage researchers in adversarial robustness research. This includes three other grants https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/university-of-tuebingen-adversarial-robustness-hein https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-berkeley-adversarial-robustness-wagner and https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-berkeley-adversarial-robustness-song made at the same time as well as grants later in the year to early-stage researchers at Carnegie Mellon University, Stanford University, and University of Southern California.

Donor reason for donating at this time (rather than earlier or later): This is one of five grants made by the donor for "adversarial robustness research" in January and February 2021, all with the same grant investigators (Catherine Olsson and Daniel Dewey) except the Santa Cruz grant that had Olsson and Nick Beckstead. https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/university-of-tuebingen-adversarial-robustness-hein https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/mit-adversarial-robustness-research https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-berkeley-adversarial-robustness-wagner and https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-berkeley-adversarial-robustness-song are the four other grants. It looks like the donor became interested in funding this research topic at this time.
Intended funding timeframe in months: 36

Donor retrospective of the donation: The followup grant https://www.openphilanthropy.org/grants/university-of-california-santa-cruz-adversarial-robustness-research-2023/ to support the same research leader and research agenda suggests satisfaction with the grant outcome.
Donor: Open Philanthropy. Donee: Effective Altruism Foundation. Amount: 1,000,000.00 (current USD). Donation date: 2019-07. Cause area: Effective altruism. URL: https://www.openphilanthropy.org/giving/grants/effective-altruism-foundation-research-operations

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant "to support research and operations"

Donor reason for selecting the donee: The grant page says "A major purpose of this grant is to encourage and support EAF and our other grantees in the space in taking approaches to longtermism with greater emphasis on shared objectives between different value systems. We conceive of this grant as falling under our work aimed at growing and supporting the EA community." Earlier in the document, past reservations that Open Phil has had about EAF are described: "EAF is an organization whose values put a particular emphasis on trying to reduce the risks of future suffering. While preventing suffering is a value we share, we also believe that the speculative and suffering-focused nature of this work means that it needs to be communicated about carefully, and could be counterproductive otherwise. As a result, we have felt ambivalent about EAF’s work to date (despite feeling unambiguously positively about some of their projects)."

Other notes: The grant would be discussed further by Simon Knutsson in his critical post https://www.simonknutsson.com/problems-in-effective-altruism-and-existential-risk-and-what-to-do-about-them/ which also discusses guidelines that Nick Beckstead of the Open Philanthropy Project developed, and that EAF was now adopting and encouraging others to adopt. Knutsson sees the adoption of the guidelines as linked to the grant money, based both on the timing and on the language of the grant page. On separate pages, Knutsson publishes correspondence in which he tried to get more specific information from the two organizations: https://www.simonknutsson.com/e-mail-exchange-with-the-open-philanthropy-project and https://www.simonknutsson.com/message-exchange-with-eaf/. Intended funding timeframe in months: 24; announced: 2019-07-30.
Donor: Effective Altruism Funds: Long-Term Future Fund. Donee: 80,000 Hours. Amount: 91,450.00 (current USD). Donation date: 2018-08-14. Cause area: Effective altruism/movement growth/career counseling. URL: https://funds.effectivealtruism.org/funds/payouts/july-2018-long-term-future-fund-grants

Donation process: The grant from the EA Long-Term Future Fund is part of a final set of grant decisions made by Nick Beckstead (granting $526,000 from the EA Meta Fund and $917,000 from the EA Long-Term Future Fund) as he transitions out of managing both funds. Due to time constraints, Beckstead relies primarily on the investigation of the organization done by the Open Philanthropy Project for its 2017 grant https://www.openphilanthropy.org/giving/grants/80000-hours-general-support and its 2018 renewal https://www.openphilanthropy.org/giving/grants/80000-hours-general-support-2018

Intended use of funds (category): Organizational general support

Intended use of funds: Beckstead writes "I recommended these grants with the suggestion that these grantees look for ways to use funding to trade money for saving the time or increasing the productivity of their employees (e.g. subsidizing electronics upgrades or childcare), due to a sense that (i) their work is otherwise much less funding constrained than it used to be, and (ii) spending like this would better reflect the value of staff time and increase staff satisfaction. However, I also told them that I was open to them using these funds to accomplish this objective indirectly (e.g. through salary increases) or using the funds for another purpose if that seemed better to them."

Donor reason for selecting the donee: The grant page references https://www.openphilanthropy.org/giving/grants/80000-hours-general-support-2018 for Beckstead's opinion of the donee. That grant page is short, and in turn links to https://www.openphilanthropy.org/giving/grants/80000-hours-general-support which has a detailed "Case for the grant" section https://www.openphilanthropy.org/giving/grants/80000-hours-general-support#Case_for_the_grant praising 80,000 Hours' track record in terms of impact-adjusted significant plan changes (IASPCs).

Donor reason for donating that amount (rather than a bigger or smaller amount): Beckstead is also recommending funding from the EA Meta Fund of $75,818 for 80,000 Hours. The grant page says "The amounts I’m granting out to different organizations are roughly proportional to the number of staff they have, with some skew towards MIRI that reflects greater EA Funds donor interest in the Long-Term Future Fund." Also: "I think a number of these organizations could qualify for the criteria of either the Long-Term Future Fund or the EA Community Fund because of their dual focus on EA and longtermism, which is part of the reason that 80,000 Hours is receiving a grant from each fund."
Percentage of total donor spend in the corresponding batch of donations: 9.97%

Donor reason for donating at this time (rather than earlier or later): The timing was determined by the timing of this round of grants, which was in turn determined by Beckstead's need to grant out the money before handing over management of the fund.

Donor retrospective of the donation: Even after fund management was moved to a new team, the EA Meta Fund would continue making grants to 80,000 Hours; in fact, 80,000 Hours would receive grant money in each of the three subsequent grant rounds. However, the EA Long-Term Future Fund would make no further grants to 80,000 Hours, suggesting that the new management team did not continue to endorse the selection of 80,000 Hours as a Long-Term Future Fund grantee.
Donor: Effective Altruism Funds: Long-Term Future Fund. Donee: Machine Intelligence Research Institute. Amount: 488,994.00 (current USD). Donation date: 2018-08-14. Cause area: AI safety. URL: https://funds.effectivealtruism.org/funds/payouts/july-2018-long-term-future-fund-grants

Donation process: The grant from the EA Long-Term Future Fund is part of a final set of grant decisions made by Nick Beckstead (granting $526,000 from the EA Meta Fund and $917,000 from the EA Long-Term Future Fund) as he transitions out of managing both funds. Due to time constraints, Beckstead relies primarily on the investigation of the organization done by the Open Philanthropy Project for its 2017 grant https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support-2017

Intended use of funds (category): Organizational general support

Intended use of funds: Beckstead writes "I recommended these grants with the suggestion that these grantees look for ways to use funding to trade money for saving the time or increasing the productivity of their employees (e.g. subsidizing electronics upgrades or childcare), due to a sense that (i) their work is otherwise much less funding constrained than it used to be, and (ii) spending like this would better reflect the value of staff time and increase staff satisfaction. However, I also told them that I was open to them using these funds to accomplish this objective indirectly (e.g. through salary increases) or using the funds for another purpose if that seemed better to them."

Donor reason for selecting the donee: The grant page references https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support-2017 for Beckstead's opinion of the donee.

Donor reason for donating that amount (rather than a bigger or smaller amount): The grant page says "The amounts I’m granting out to different organizations are roughly proportional to the number of staff they have, with some skew towards MIRI that reflects greater EA Funds donor interest in the Long-Term Future Fund." Also: "I think a number of these organizations could qualify for the criteria of either the Long-Term Future Fund or the EA Community Fund because of their dual focus on EA and longtermism, which is part of the reason that 80,000 Hours is receiving a grant from each fund."
Percentage of total donor spend in the corresponding batch of donations: 53.32%

Donor reason for donating at this time (rather than earlier or later): The timing was determined by the timing of this round of grants, which was in turn determined by Beckstead's need to grant out the money before handing over management of the fund.

Donor retrospective of the donation: Even after fund management was moved to a new team, the EA Long-Term Future Fund would continue making grants to MIRI.
Donor: Effective Altruism Funds: Long-Term Future Fund. Donee: Center for Applied Rationality. Amount: 174,021.00 (current USD). Donation date: 2018-08-14. Cause area: Epistemic institutions. URL: https://funds.effectivealtruism.org/funds/payouts/july-2018-long-term-future-fund-grants

Donation process: The grant from the EA Long-Term Future Fund is part of a final set of grant decisions made by Nick Beckstead (granting $526,000 from the EA Meta Fund and $917,000 from the EA Long-Term Future Fund) as he transitions out of managing both funds. Due to time constraints, Beckstead relies primarily on the investigation of the organization done by the Open Philanthropy Project for its 2018 grant https://www.openphilanthropy.org/giving/grants/center-applied-rationality-general-support-2018

Intended use of funds (category): Organizational general support

Intended use of funds: Beckstead writes "I recommended these grants with the suggestion that these grantees look for ways to use funding to trade money for saving the time or increasing the productivity of their employees (e.g. subsidizing electronics upgrades or childcare), due to a sense that (i) their work is otherwise much less funding constrained than it used to be, and (ii) spending like this would better reflect the value of staff time and increase staff satisfaction. However, I also told them that I was open to them using these funds to accomplish this objective indirectly (e.g. through salary increases) or using the funds for another purpose if that seemed better to them."

Donor reason for selecting the donee: The grant page references https://www.openphilanthropy.org/giving/grants/center-applied-rationality-general-support-2018 for Beckstead's opinion of the donee.

Donor reason for donating that amount (rather than a bigger or smaller amount): The grant page says "The amounts I’m granting out to different organizations are roughly proportional to the number of staff they have, with some skew towards MIRI that reflects greater EA Funds donor interest in the Long-Term Future Fund." Also: "I think a number of these organizations could qualify for the criteria of either the Long-Term Future Fund or the EA Community Fund because of their dual focus on EA and longtermism, which is part of the reason that 80,000 Hours is receiving a grant from each fund."
Percentage of total donor spend in the corresponding batch of donations: 18.98%

Donor reason for donating at this time (rather than earlier or later): The timing was determined by the timing of this round of grants, which was in turn determined by Beckstead's need to grant out the money before handing over management of the fund.

Donor retrospective of the donation: Even after fund management was moved to a new team, the EA Long-Term Future Fund would continue making grants to CFAR.
Donor: Effective Altruism Funds: Long-Term Future Fund. Donee: Centre for Effective Altruism. Amount: 162,537.00 (current USD). Donation date: 2018-08-14. Cause area: Cause prioritization. URL: https://funds.effectivealtruism.org/funds/payouts/july-2018-long-term-future-fund-grants

Donation process: The grant from the EA Long-Term Future Fund is part of a final set of grant decisions made by Nick Beckstead (granting $526,000 from the EA Meta Fund and $917,000 from the EA Long-Term Future Fund) as he transitions out of managing both funds. Due to time constraints, Beckstead relies primarily on the investigation of the organization done by the Open Philanthropy Project for its 2018 grant https://www.openphilanthropy.org/giving/grants/centre-effective-altruism-general-support-2018

Intended use of funds (category): Direct project expenses

Intended use of funds: Beckstead writes "CEA will use the LTF funding to support a new project whose objective is to expand global priorities research in academia, especially related to issues around longtermism."

Donor reason for selecting the donee: The grant page references https://www.openphilanthropy.org/giving/grants/centre-effective-altruism-general-support-2018 for Beckstead's opinion of the donee. This grant page is short, and in turn links to https://www.openphilanthropy.org/giving/grants/centre-effective-altruism-general-support for a more in-depth review of CEA.

Donor reason for donating that amount (rather than a bigger or smaller amount): Beckstead is also recommending funding from the EA Meta Fund of $56,061 for CEA, which is for organizational general support (whereas the LTF grant is to support a new project on longtermism). Beckstead writes: "The amounts I’m granting out to different organizations are roughly proportional to the number of staff they have, with some skew towards MIRI that reflects greater EA Funds donor interest in the Long-term Future Fund."
Percentage of total donor spend in the corresponding batch of donations: 17.72%

Donor reason for donating at this time (rather than earlier or later): The timing was determined by the timing of this round of grants, which was in turn determined by Beckstead's need to grant out the money before handing over management of the fund.

Donor retrospective of the donation: After the transition to the new management team, the EA Long-Term Future Fund would make no further grants to CEA. This suggests that the new management team would not continue to endorse the selection of CEA as a Long-Term Future Fund grantee
Effective Altruism Funds: Effective Altruism Infrastructure FundFounders Pledge393,939.002018-08-14Effective altruism/movement growthhttps://app.effectivealtruism.org/funds/ea-community/payouts/6M8SQFdecEm0WuAYweO2UQ Donation process: The grant from the EA Meta Fund is part of a final set of grant decisions being made by Nick Beckstead (granting $526,000 from the EA Meta Fund and $917,000 from the EA Long-Term Future Fund) as he transitions out of managing both funds. Due to time constraints, Beckstead primarily relies on the investigation of the organization conducted by the Open Philanthropy Project for its 2016 grant https://www.openphilanthropy.org/giving/grants/founders-pledge-general-support and 2018 renewal https://www.openphilanthropy.org/giving/grants/founders-pledge-general-support-2018

Intended use of funds (category): Organizational general support

Intended use of funds: Beckstead writes "I recommended these grants with the suggestion that these grantees look for ways to use funding to trade money for saving the time or increasing the productivity of their employees (e.g. subsidizing electronics upgrades or childcare), due to a sense that (i) their work is otherwise much less funding constrained than it used to be, and (ii) spending like this would better reflect the value of staff time and increase staff satisfaction. However, I also told them that I was open to them using these funds to accomplish this objective indirectly (e.g. through salary increases) or using the funds for another purpose if that seemed better to them."

Donor reason for selecting the donee: The grant page references https://www.openphilanthropy.org/giving/grants/founders-pledge-general-support-2018 for Beckstead's opinion of the donee. That page says: "We hope this funding will reduce Founders Pledge’s time spent on organizational fundraising so that it can spend more time securing pledges from entrepreneurs and helping them to donate thoughtfully and impactfully." It also links to the 2016 grant page https://www.openphilanthropy.org/giving/grants/founders-pledge-general-support

Donor reason for donating that amount (rather than a bigger or smaller amount): Beckstead writes on the grant page: "The amounts I’m granting out to different organizations are roughly proportional to the number of staff they have, with some skew towards MIRI that reflects greater EA Funds donor interest in the Long-Term Future Fund."
Percentage of total donor spend in the corresponding batch of donations: 74.89%

Donor reason for donating at this time (rather than earlier or later): Timing determined by the timing of this round of grants, which is in turn determined by the need for Beckstead to grant out the money before handing over management of the fund

Donor retrospective of the donation: Even after fund management was handed over to a new team, the EA Meta Fund would continue making grants to Founders Pledge. In fact, Founders Pledge would receive grant money in both of the next two rounds. This suggests that the grant would be considered successful
Effective Altruism Funds: Effective Altruism Infrastructure Fund80,000 Hours75,818.002018-08-14Effective altruism/movement growth/career counselinghttps://app.effectivealtruism.org/funds/ea-community/payouts/6M8SQFdecEm0WuAYweO2UQ Donation process: The grant from the EA Meta Fund is part of a final set of grant decisions being made by Nick Beckstead (granting $526,000 from the EA Meta Fund and $917,000 from the EA Long-Term Future Fund) as he transitions out of managing both funds. Due to time constraints, Beckstead primarily relies on the investigation of the organization conducted by the Open Philanthropy Project for its 2017 grant https://www.openphilanthropy.org/giving/grants/80000-hours-general-support and 2018 renewal https://www.openphilanthropy.org/giving/grants/80000-hours-general-support-2018

Intended use of funds (category): Organizational general support

Intended use of funds: Beckstead writes "I recommended these grants with the suggestion that these grantees look for ways to use funding to trade money for saving the time or increasing the productivity of their employees (e.g. subsidizing electronics upgrades or childcare), due to a sense that (i) their work is otherwise much less funding constrained than it used to be, and (ii) spending like this would better reflect the value of staff time and increase staff satisfaction. However, I also told them that I was open to them using these funds to accomplish this objective indirectly (e.g. through salary increases) or using the funds for another purpose if that seemed better to them."

Donor reason for selecting the donee: The grant page references https://www.openphilanthropy.org/giving/grants/80000-hours-general-support-2018 for Beckstead's opinion of the donee. This grant page is short, and in turn links to https://www.openphilanthropy.org/giving/grants/80000-hours-general-support which has a detailed Case for the grant section https://www.openphilanthropy.org/giving/grants/80000-hours-general-support#Case_for_the_grant that praises 80,000 Hours' track record in terms of impact-adjusted significant plan changes (IASPCs)

Donor reason for donating that amount (rather than a bigger or smaller amount): Beckstead is also recommending funding from the EA Long-Term Future Fund of $91,450 for 80,000 Hours. The grant page says "The amounts I’m granting out to different organizations are roughly proportional to the number of staff they have, with some skew towards MIRI that reflects greater EA Funds donor interest in the Long-Term Future Fund." Also: "I think a number of these organizations could qualify for the criteria of either the Long-Term Future Fund or the EA Community Fund because of their dual focus on EA and longtermism, which is part of the reason that 80,000 Hours is receiving a grant from each fund."
Percentage of total donor spend in the corresponding batch of donations: 14.41%

Donor reason for donating at this time (rather than earlier or later): Timing determined by the timing of this round of grants, which is in turn determined by the need for Beckstead to grant out the money before handing over management of the fund

Donor retrospective of the donation: Even after fund management was handed over to a new team, the EA Meta Fund would continue making grants to 80,000 Hours. In fact, 80,000 Hours would receive grant money in each of the three subsequent grant rounds. This suggests that the grant would be considered successful
Effective Altruism Funds: Effective Altruism Infrastructure FundCentre for Effective Altruism56,061.002018-08-14Effective altruism/movement growthhttps://app.effectivealtruism.org/funds/ea-community/payouts/6M8SQFdecEm0WuAYweO2UQ Funding from the Effective Altruism Community Fund; an accompanying payment from the Long-Term Future Fund of $162,537 was also made. Beckstead recommended that the grantee spend the money to save time and increase productivity of employees (for instance, by subsidizing childcare or electronics). Percentage of total donor spend in the corresponding batch of donations: 10.66%.
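The "percentage of total donor spend" figures in the 2018-08-14 entries above can be reproduced from the batch totals stated in those entries ($917,000 from the EA Long-Term Future Fund and $526,000 from the EA Meta Fund); a minimal consistency check in Python, assuming the site rounds to two decimal places:

```python
# Reproduce the "percentage of total donor spend" figures for
# Beckstead's final grant round (2018-08-14), using the batch totals
# stated in the entries: $917,000 (Long-Term Future Fund) and
# $526,000 (EA Meta Fund).
LTF_TOTAL = 917_000
META_TOTAL = 526_000

grants = [
    ("CEA (Long-Term Future Fund)", 162_537.00, LTF_TOTAL),
    ("Founders Pledge (Meta Fund)", 393_939.00, META_TOTAL),
    ("80,000 Hours (Meta Fund)", 75_818.00, META_TOTAL),
    ("CEA (Meta Fund)", 56_061.00, META_TOTAL),
]

for donee, amount, total in grants:
    print(f"{donee}: {round(100 * amount / total, 2)}%")
    # → 17.72%, 74.89%, 14.41%, 10.66%, matching the entries
```

These match the 17.72%, 74.89%, 14.41%, and 10.66% figures recorded above.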
Open PhilanthropyFuture of Humanity Institute12,066,808.932018-07Global catastrophic riskshttps://www.openphilanthropy.org/focus/global-catastrophic-risks/biosecurity/future-humanity-institute-work-on-global-catastrophic-risks Donation process: This is a series of awards totaling £13,428,434 ($16,200,062.78 USD at market rate on September 2, 2019); as of September 18, 2019, $12,066,808.93 of that amount had been allocated

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to support work on risks from advanced artificial intelligence, biosecurity and pandemic preparedness, and macrostrategy. The grant page says: "The largest pieces of the omnibus award package will allow FHI to recruit and hire for an education and training program led by Owen Cotton­Barratt, and retain and attract talent in biosecurity research and FHI’s Governance of AI program."

Other notes: Intended funding timeframe in months: 36; announced: 2018-09-01.
Open PhilanthropyUniversity of Oxford429,770.002018-07AI safetyhttps://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/oxford-university-global-politics-of-ai-dafoe Grant to support research on the global politics of advanced artificial intelligence. The work will be led by Professor Allan Dafoe at the Future of Humanity Institute in Oxford, United Kingdom. The Open Philanthropy Project recommended additional funds to support this work in 2017, while Professor Dafoe was at Yale. Continuation of grant https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/yale-university-global-politics-of-ai-dafoe. Announced: 2018-07-20.
Open PhilanthropyCentre for Effective Altruism2,688,000.002018-06Effective altruism/movement growthhttps://www.openphilanthropy.org/giving/grants/centre-effective-altruism-general-support-2018 Intended use of funds (category): Organizational general support

Intended use of funds: According to the previous grant page https://www.openphilanthropy.org/giving/grants/centre-effective-altruism-general-support-2017#Budget_and_room_for_more_funding "Our funding will be used primarily to allow CEA to hire new staff; increase staff salaries (from what we see as previously low levels); provide additional support to local EA groups; increase its budget for EA Global and EAGx events (conferences about EA); and partially fund EA Grants." Also: "our funding will in total increase CEA’s 2017 budget by $1.25 million and its 2018 budget by $1.875 million, with the remaining $1.875 million partly offsetting reduced fundraising from other donors, and partly increasing CEA’s reserves for 2019." The current grant page continues to stand by this reasoning from the previous grant page

Donor reason for selecting the donee: According to the previous grant page https://www.openphilanthropy.org/giving/grants/centre-effective-altruism-general-support-2017#Case_for_the_grant "We believe that CEA has a good track record of helping the effective altruism community grow, and its leadership appears to be fairly value-aligned with us in terms of this goal." Two key contributions highlighted are "$1.4 billion worth of pledges made to Giving What We Can (GWWC)" and "Introducing effective altruism to people who have become valuable members of the EA community." The grant renewal is based on renewal plans described in the previous grant page

Donor reason for donating that amount (rather than a bigger or smaller amount): The previous grant page https://www.openphilanthropy.org/giving/grants/centre-effective-altruism-general-support-2017#Follow-up_expectations says: "We plan to renew this grant for between $1.25 million and $2.5 million next year depending on the outcomes of the various projects CEA plans to try out this year, and at a level consistent with our funding being less than 50% of CEA’s total budget." The grant amount decided upon is at the top of that range

Donor reason for donating at this time (rather than earlier or later): The grant is made about one year after the previous grant; this is the expected timeframe for the grant renewal, according to the previous grant page https://www.openphilanthropy.org/giving/grants/centre-effective-altruism-general-support-2017#Follow-up_expectations
Intended funding timeframe in months: 24

Other notes: Grant of £2,000,000 ($2,688,000 at time of conversion). Announced: 2018-06-27.
Open PhilanthropyFuture of Life Institute250,000.002018-06Global catastrophic riskshttps://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/future-life-institute-general-support-2018 Intended use of funds (category): Organizational general support

Intended use of funds: Grant for general support. It is a renewal of the May 2017 grant https://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/future-life-institute-general-support-2017 whose primary purpose was to administer a request for proposals in AI safety similar to a request for proposals in 2015 https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/update-fli-grant

Donor retrospective of the donation: The followup grant in 2019 suggests that Open Phil would continue to stand by its assessment of the grantee.

Other notes: Announced: 2018-07-05.
Open Philanthropy80,000 Hours510,000.002018-02Effective altruism/movement growth/career counselinghttps://www.openphilanthropy.org/giving/grants/80000-hours-general-support-2018 Intended use of funds (category): Organizational general support

Intended use of funds: No explicitly listed priority uses of the funds, but likely similar to the grant https://www.openphilanthropy.org/giving/grants/80000-hours-general-support that it is renewing

Donor reason for selecting the donee: Likely the same reasons as for the 2017 grant that this is renewing; see https://www.openphilanthropy.org/giving/grants/80000-hours-general-support#Case_for_the_grant The key reason is that Open Phil finds impressive the large number of impact-adjusted significant plan changes (IASPCs) that 80,000 Hours claims to have brought about, and broadly agrees with 80,000 Hours' calculation of their IASPCs

Donor reason for donating that amount (rather than a bigger or smaller amount): Amount determined by the amount ($510,000) raised from other donors in 2017, which turned out to be the smallest of the three constraining amounts described in the previous grant page https://www.openphilanthropy.org/giving/grants/80000-hours-general-support#Case_for_the_grant "We expect to recommend another grant to 80,000 Hours at the beginning of 2018, with the amount recommended being whichever of the following is smallest: (1) $1.25 million (2) The amount 80,000 Hours raises from other donors in 2017 (3) The amount necessary for 80,000 Hours to have $3.75 million in its bank account"

Donor reason for donating at this time (rather than earlier or later): Timing as pre-committed on the previous grant page https://www.openphilanthropy.org/giving/grants/80000-hours-general-support#Case_for_the_grant "We expect to recommend another grant to 80,000 Hours at the beginning of 2018"

Donor retrospective of the donation: The followup grant https://www.openphilanthropy.org/giving/grants/80000-hours-general-support-2019 for $4,795,803 in February 2019 (amount determined by the Committee for Effective Altruism Support) suggests general satisfaction with the grantee and the grant

Other notes: Announced: 2018-02-22.
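The grant-sizing rule quoted in the entry above (the smallest of three constraining amounts) can be sketched directly. Note that the bank-balance figure used here is a hypothetical placeholder, since the source does not state 80,000 Hours' actual balance:

```python
# Open Phil's precommitted 2018 sizing rule for the 80,000 Hours grant:
# the smallest of (1) a $1.25M cap, (2) the amount 80,000 Hours raised
# from other donors in 2017, and (3) the top-up needed to reach $3.75M
# in its bank account.
def grant_amount(raised_from_others: float, bank_balance: float) -> float:
    cap = 1_250_000
    top_up = max(0, 3_750_000 - bank_balance)
    return min(cap, raised_from_others, top_up)

# Per the entry, the $510,000 raised from other donors turned out to be
# the binding constraint. The bank balance below is illustrative only.
print(grant_amount(raised_from_others=510_000, bank_balance=2_500_000))
# → 510000
```

Under these assumptions, constraint (2) binds, yielding the $510,000 grant recorded above.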
Open PhilanthropyFounders Pledge3,222,653.002018-02Effective altruism/donor pledgeshttps://www.openphilanthropy.org/giving/grants/founders-pledge-general-support-2018 Total across two grants for general support over three years, representing a renewal of the 2016 grant https://www.openphilanthropy.org/giving/grants/founders-pledge-general-support Since the previous grant, Founders Pledge has increased its pledge commitments and a large percentage of donations made by founders taking their pledge have been to organizations that prioritize an evidence based approach with respect to their interventions. However, its expansion into other parts of Europe has been more limited than anticipated. Instead, FP has shifted its primary expansion focus to the United States and Canada. Announced: 2018-04-05.
Open PhilanthropyGlobal Priorities Institute2,674,284.002018-02Cause prioritizationhttps://www.openphilanthropy.org/giving/grants/global-priorities-institute-general-support Grant of £2,051,232 over five years (estimated at $2,674,284, depending upon currency conversion rates at the time of annual installments) via Americans for Oxford for general support. GPI is an interdisciplinary research center at the University of Oxford that conducts foundational research to inform the decision-making of individuals and institutions seeking to do as much good as possible. GPI intends to use this funding to support global priorities research, specifically: to hire three early-career, non-tenured research fellows with expertise in philosophy or economics, as well as two operations staff; to secure a larger office space to accommodate them; to host visiting researchers; and to hold seminars which address global priorities research topics. Announced: 2018-05-21.
Effective Altruism Funds: Effective Altruism Infrastructure FundEffective Altruism Sweden83,264.002018-01-14Effective altruism/movement growthhttps://app.effectivealtruism.org/funds/ea-community/payouts/1EjFHdfk3GmIeIaqquWgQI The grant page says: "After the EA Grants Team made its selections, I discussed some applications that barely missed the cut with Will MacAskill. MacAskill recommended funding EA Sweden by Markus Anderljung given his track record of community-building at Cambridge and in Sweden, as well as a conversation that MacAskill had with Anderljung." Affected countries: Sweden; Percentage of total donor spend in the corresponding batch of donations: 100.00%.
Open PhilanthropyCenter for Applied Rationality560,000.002018-01Epistemic institutionshttps://www.openphilanthropy.org/giving/grants/center-applied-rationality-sparc-2018 Intended use of funds (category): Direct project expenses

Intended use of funds: Grant "to support the Summer Program on Applied Rationality and Cognition (SPARC). SPARC is a two-week summer program for top high school students to further develop skills related to applied reasoning, with a broad-ranging curriculum."

Donor reason for selecting the donee: The grant page says: "We expect that this program will expand the horizons of some students with extremely high potential and, hopefully, increase their positive impact on the world. We are especially interested in the possibility that participation in SPARC leads to greater awareness of effective altruism and issues important to the effective altruism community."

Donor reason for donating at this time (rather than earlier or later): Open Phil had previously funded SPARC for 2016 and 2017 with the grant https://www.openphilanthropy.org/giving/grants/center-applied-rationality-sparc This grant continues the funding to 2018 (and possibly to later years)

Other notes: Announced: 2018-02.
Open PhilanthropyCenter for Applied Rationality1,000,000.002018-01Epistemic institutionshttps://www.openphilanthropy.org/giving/grants/center-applied-rationality-general-support-2018 Intended use of funds (category): Organizational general support

Intended use of funds: The grant page says: "CFAR is an adult education nonprofit that seeks to find and develop cognitive tools and to deliver these tools to promising individuals, groups, organizations, and networks focused on solving large and pressing problems."

Donor reason for selecting the donee: The grant page says: "Our primary interest in [CFAR] workshops is that we believe they introduce people to and/or strengthen their connections with the effective altruism (EA) community and way of thinking, which we hope results in people with outstanding potential pursuing more impactful career trajectories. CFAR is particularly interested in growing the talent pipeline for work on potential risks from advanced artificial intelligence (AI). More on our interest in supporting work along these lines is here."

Donor thoughts on making further donations to the donee: The grant page says: "CFAR’s performance on this grant will be judged primarily in terms of whether it provides adequate evidence of its programs resulting in improved career trajectories of the sort described above."

Donor retrospective of the donation: The followup grant https://www.openphilanthropy.org/giving/grants/center-applied-rationality-general-support-2020 would be an exit grant, suggesting that Open Phil would revise downward its assessment of continued support of CFAR, but still continue to value CFAR enough to help it exit smoothly.

Other notes: Intended funding timeframe in months: 24; announced: 2018-02-28.
Open PhilanthropyMachine Intelligence Research Institute3,750,000.002017-10AI safety/technical researchhttps://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support-2017 Donation process: The donor, Open Philanthropy Project, appears to have reviewed the progress made by MIRI after the one-year timeframe for the previous grant https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support ended. The full process is not described, but the July 2017 post https://forum.effectivealtruism.org/posts/SEL9PW8jozrvLnkb4/my-current-thoughts-on-miri-s-highly-reliable-agent-design (GW, IR) suggests that work on the review had been going on well before the grant renewal date

Intended use of funds (category): Organizational general support

Intended use of funds: According to the grant page: "MIRI expects to use these funds mostly toward salaries of MIRI researchers, research engineers, and support staff."

Donor reason for selecting the donee: The reasons for donating to MIRI remain the same as the reasons for the previous grant https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support made in August 2016, but with two new developments: (1) a very positive review of MIRI’s work on “logical induction” by a machine learning researcher who (i) is interested in AI safety, (ii) is rated as an outstanding researcher by at least one of Open Phil's close advisors, and (iii) is generally regarded as outstanding by the ML community. (2) An increase in AI safety spending by Open Phil, so that Open Phil is "therefore less concerned that a larger grant will signal an outsized endorsement of MIRI’s approach." The skeptical post https://forum.effectivealtruism.org/posts/SEL9PW8jozrvLnkb4/my-current-thoughts-on-miri-s-highly-reliable-agent-design (GW, IR) by Daniel Dewey of Open Phil, from July 2017, is not discussed on the grant page

Donor reason for donating that amount (rather than a bigger or smaller amount): The grant page explains "We are now aiming to support about half of MIRI’s annual budget." In the previous grant https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support of $500,000 made in August 2016, Open Phil had expected to grant about the same amount ($500,000) after one year. The increase to $3.75 million over three years (i.e., $1.25 million/year) is due to the same two new developments described under the donor's reason for selecting the donee: the very positive review of MIRI’s work on “logical induction” and the increase in Open Phil's AI safety spending, which leaves it "less concerned that a larger grant will signal an outsized endorsement of MIRI’s approach."

Donor reason for donating at this time (rather than earlier or later): The timing is mostly determined by the end of the one-year funding timeframe of the previous grant https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support made in August 2016 (a little over a year before this grant)
Intended funding timeframe in months: 36

Donor thoughts on making further donations to the donee: The MIRI blog post https://intelligence.org/2017/11/08/major-grant-open-phil/ says: "The Open Philanthropy Project has expressed openness to potentially increasing their support if MIRI is in a position to usefully spend more than our conservative estimate, if they believe that this increase in spending is sufficiently high-value, and if we are able to secure additional outside support to ensure that the Open Philanthropy Project isn’t providing more than half of our total funding."

Other notes: MIRI, the grantee, blogs about the grant at https://intelligence.org/2017/11/08/major-grant-open-phil/ Open Phil's statement that due to its other large grants in the AI safety space, it is "therefore less concerned that a larger grant will signal an outsized endorsement of MIRI’s approach." is discussed in the comments on the Facebook post https://www.facebook.com/vipulnaik.r/posts/10213581410585529 by Vipul Naik. Announced: 2017-11-08.
Open PhilanthropyYale University299,320.002017-07AI safetyhttps://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/yale-university-global-politics-of-ai-dafoe Grant to support research into the global politics of artificial intelligence, led by Assistant Professor of Political Science, Allan Dafoe, who will conduct part of the research at the Future of Humanity Institute in Oxford, United Kingdom over the next year. Funds from the two gifts will support the hiring of two full-time research assistants, travel, conferences, and other expenses related to the research efforts, as well as salary, relocation, and health insurance expenses related to Professor Dafoe’s work in Oxford. Announced: 2017-09-28.
Open PhilanthropyFuture of Research150,000.002017-07Scientific researchhttps://www.openphilanthropy.org/focus/scientific-research/miscellaneous/future-research-exit-grant-2017 Exit grant to allow grantee to find time for alternative funding sources. Follows a $300,000 grant made in 2016 for two years. Announced: 2017-08-21.
Open PhilanthropyCenter for Applied Rationality340,000.002017-05Epistemic institutionshttps://www.openphilanthropy.org/giving/grants/center-applied-rationality-european-summer-program-rationality Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to support the "European Summer Program on Rationality (ESPR), a two-week summer workshop for about 40 mathematically gifted students aged 16-19."

Donor reason for selecting the donee: The grant page says: "We are excited about this grant because we expect that ESPR will orient participants to problems that we believe to be high impact, and may lead them to increase their positive impact on the world."

Donor reason for donating at this time (rather than earlier or later): Donation made in time to fund the event for 2017

Other notes: Announced: 2017-09-27.
Open PhilanthropyFuture of Life Institute100,000.002017-05Global catastrophic risks/AI safetyhttps://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/future-life-institute-general-support-2017 Intended use of funds (category): Organizational general support

Intended use of funds: Grant for general support. However, the primary use of the grant will be to administer a request for proposals in AI safety similar to a request for proposals in 2015 https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/update-fli-grant

Donor retrospective of the donation: The followup grants in 2018 and 2019, for similar or larger amounts, suggest that Open Phil would continue to stand by its assessment of the grantee.

Other notes: Announced: 2017-09-27.
EA Giving GroupBerkeley Existential Risk Initiative35,161.982017-04AI safety/other global catastrophic riskshttps://app.effectivealtruism.org/funds/far-future/payouts/OzIQqsVacUKw0kEuaUGgI Grant discussed at http://effective-altruism.com/ea/19d/update_on_effective_altruism_funds/ along with reasoning. Grantee approached Nick Beckstead with a grant proposal asking for $50,000. Beckstead provided all the money already donated to the Far Future Fund in Effective Altruism Funds, and made up the remainder via the EA Giving Group and some personal funds. It is not clear how much was personal funds, so for simplicity we are attributing the entirety of the remainder to EA Giving Group (creating some inaccuracy).
Effective Altruism Funds: Long-Term Future FundBerkeley Existential Risk Initiative14,838.022017-03-20AI safety/other global catastrophic riskshttps://funds.effectivealtruism.org/funds/payouts/march-2017-berkeley-existential-risk-initiative-beri Donation process: The grant page says that Nick Beckstead, the fund manager, learned that Andrew Critch was starting up BERI and needed $50,000. Beckstead determined that this would be the best use of the money in the Long-Term Future Fund.

Intended use of funds (category): Organizational general support

Intended use of funds: The grant page says: "It is a new initiative providing various forms of support to researchers working on existential risk issues (administrative, expert consultations, technical support). It works as a non-profit entity, independent of any university, so that it can help multiple organizations and to operate more swiftly than would be possible within a university context."

Donor reason for selecting the donee: Nick Beckstead gives these reasons on the grant page: the basic idea makes sense to him; he is confident in Critch's ability to make it happen; supporting people to try out reasonable ideas and learn from how they unfold seems valuable to him; and he sees himself as a natural "first funder" for such opportunities, being confident that other competing funders would have good counterfactual uses of their money.

Donor reason for donating that amount (rather than a bigger or smaller amount): The requested amount was $50,000, and at the time of grant, the fund only had $14,838.02. So, all the fund money was granted. Beckstead donated the remainder of the funding via the EA Giving Group and a personal donor-advised fund.
Percentage of total donor spend in the corresponding batch of donations: 100.00%

Donor reason for donating at this time (rather than earlier or later): The timing of BERI starting up and the launch of the Long-Term Future Fund closely matched, leading to this grant happening when it did.

Donor retrospective of the donation: BERI would become successful and get considerable funding from Jaan Tallinn in the coming months, validating the grant. The Long-Term Future Fund would not make any further grants to BERI.
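As a consistency check, the two BERI entries above are the two halves of a single $50,000 request; the split can be verified from the amounts stated in the entries (rounding to cents assumed):

```python
# BERI asked for $50,000; the Long-Term Future Fund's entire balance of
# $14,838.02 was granted, and the remainder was attributed (with some
# acknowledged inaccuracy) to the EA Giving Group in the entry above.
requested = 50_000.00
ltf_grant = 14_838.02
remainder = round(requested - ltf_grant, 2)
print(remainder)  # → 35161.98, matching the EA Giving Group entry
```

The remainder matches the $35,161.98 recorded under the EA Giving Group.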
Open PhilanthropyCentre for Effective Altruism2,500,000.002017-03Effective altruism/movement growthhttps://www.openphilanthropy.org/giving/grants/centre-effective-altruism-general-support-2017 Intended use of funds (category): Organizational general support

Intended use of funds: According to https://www.openphilanthropy.org/giving/grants/centre-effective-altruism-general-support-2017#Budget_and_room_for_more_funding "Our funding will be used primarily to allow CEA to hire new staff; increase staff salaries (from what we see as previously low levels); provide additional support to local EA groups; increase its budget for EA Global and EAGx events (conferences about EA); and partially fund EA Grants." Also: "If we renew our grant for $2.5 million next year (see below), our funding will in total increase CEA’s 2017 budget by $1.25 million and its 2018 budget by $1.875 million"

Donor reason for selecting the donee: According to https://www.openphilanthropy.org/giving/grants/centre-effective-altruism-general-support-2017#Case_for_the_grant "We believe that CEA has a good track record of helping the effective altruism community grow, and its leadership appears to be fairly value-aligned with us in terms of this goal." Two key contributions highlighted are "$1.4 billion worth of pledges made to Giving What We Can (GWWC)" and "Introducing effective altruism to people who have become valuable members of the EA community."

Donor reason for donating that amount (rather than a bigger or smaller amount): According to https://www.openphilanthropy.org/giving/grants/centre-effective-altruism-general-support-2017#Budget_and_room_for_more_funding "If we renew our grant for $2.5 million next year (see below), our funding will in total increase CEA’s 2017 budget by $1.25 million and its 2018 budget by $1.875 million, with the remaining $1.875 million partly offsetting reduced fundraising from other donors, and partly increasing CEA’s reserves for 2019."

Donor thoughts on making further donations to the donee: According to https://www.openphilanthropy.org/giving/grants/centre-effective-altruism-general-support-2017#Follow-up_expectations "We plan to renew this grant for between $1.25 million and $2.5 million next year depending on the outcomes of the various projects CEA plans to try out this year, and at a level consistent with our funding being less than 50% of CEA’s total budget. "

Donor retrospective of the donation: The June 2018 renewal https://www.openphilanthropy.org/giving/grants/centre-effective-altruism-general-support-2018 (for a similar amount), as well as subsequent grants in 2019 and 2020, suggests that Open Phil would continue to stand by its reasoning for the grant.

Other notes: The grant writeup is fairly detailed, including a list of predictions as well as sources. Intended funding timeframe in months: 24; announced: 2017-08-17.
Open Philanthropy | 80,000 Hours | 1,125,000.00 | 2017-03 | Effective altruism/movement growth/career counseling | https://www.openphilanthropy.org/giving/grants/80000-hours-general-support

Intended use of funds (category): Organizational general support

Intended use of funds: According to https://www.openphilanthropy.org/giving/grants/80000-hours-general-support#Budget_and_proposed_activities the grantee plans to use the grant to fund the following activities: (1) Hiring four new junior staff members to (a) improve career guides and career profiles, (b) do career coaching, (c) replace a departing part-time software engineer, and (d) do research, career coaching, and marketing. (2) Increasing staff salaries by 30% to be competitive in the Bay Area, to which it recently moved. (3) Marketing activities, including online retargeting advertisements, Facebook advertisements for workshops targeted at universities, and giving away books. (4) Holding funds in reserve for year 2.

Donor reason for selecting the donee: https://www.openphilanthropy.org/giving/grants/80000-hours-general-support#Case_for_the_grant explains the reasons. The key reason is that Open Phil finds impressive the large number of impact-adjusted significant plan changes (IASPCs) that 80,000 Hours claims to have brought about, and broadly agrees with 80,000 Hours' calculation of their IASPCs.

Donor thoughts on making further donations to the donee: According to https://www.openphilanthropy.org/giving/grants/80000-hours-general-support#Case_for_the_grant "We expect to recommend another grant to 80,000 Hours at the beginning of 2018, with the amount recommended being whichever of the following is smallest: (1) $1.25 million (2) The amount 80,000 Hours raises from other donors in 2017 (3) The amount necessary for 80,000 Hours to have $3.75 million in its bank account"

Donor retrospective of the donation: The renewal grant https://www.openphilanthropy.org/giving/grants/80000-hours-general-support-2018 of $510,000 is consistent with the expectations set during this grant; the constraining factor in determining the amount for the renewal is "The amount 80,000 Hours raises from other donors in 2017".

Other notes: Announced: 2017-05-16.
Ben Kuhn | EA Giving Group | 20,000.00 | 2016-12-31 | -- | https://www.benkuhn.net/ea/ See https://www.benkuhn.net/giving-2016 for more context. Employer match: Wave matched 10,000.00.
Jacob Steinhardt | Carnegie Endowment for International Peace | 500.00 | 2016-12-28 | World peace | https://jsteinhardt.wordpress.com/2016/12/28/donations-for-2016/ Earmarked for the Carnegie-Tsinghua Center. Although the donation was announced on this day, we do not know when it was made. Percentage of total donor spend in the corresponding batch of donations: 5.00%.
Open Philanthropy | Center for Applied Rationality | 1,035,000.00 | 2016-07 | Epistemic institutions/effective altruism/movement growth | https://www.openphilanthropy.org/giving/grants/center-applied-rationality-general-support

Donation process: According to https://www.openphilanthropy.org/giving/grants/center-applied-rationality-general-support#Our_process "While investigating this grant, we had several conversations with Anna Salamon, as well as with various other contacts of ours in the EA community. Nick Beckstead was the primary investigator for this grant."

Intended use of funds (category): Organizational general support

Intended use of funds: The grant page says: "$915,000 of this grant will support CFAR workshops and organizational improvements. $120,000 of this grant will fund a pilot version of EuroSPARC, an eight-day summer program in Europe run by CFAR for mathematically gifted high school students, modeled on the San Francisco-based Summer Program in Applied Rationality and Cognition (SPARC), which CFAR has helped run for the past three years."

Donor reason for selecting the donee: Stated reasons for the grant include value-alignment, success attracting and cultivating talented people to work on effective altruist causes, and funding being a substantial constraint at present.

Donor reason for donating that amount (rather than a bigger or smaller amount): Amount tied to a budget proposed by CFAR, described at https://www.openphilanthropy.org/giving/grants/center-applied-rationality-general-support#Budget_and_room_for_more_funding. First year: $360,000 for organizational improvements, $100,000 for scholarships for CFAR workshops, $120,000 for EuroSPARC 2016, and $47,500 for half the salary and benefits of a new staff member splitting time between CFAR operations and SPARC. Second year: $360,000 for organizational improvements and $47,500 for half the salary and benefits of the new staff member.

Donor thoughts on making further donations to the donee: https://www.openphilanthropy.org/giving/grants/center-applied-rationality-general-support#Key_questions_for_follow-up lists key follow-up questions

Donor retrospective of the donation: The grant page for a followup January 2018 grant https://www.openphilanthropy.org/giving/grants/center-applied-rationality-general-support-2018 would say: "Since our 2016 funding recommendation, CFAR has largely met its milestones for organizational improvement." The statement, along with the fact that the followup grant would have a comparable size of $1,000,000, suggests that Open Phil would be satisfied with the results of the grant.

Other notes: Intended funding timeframe in months: 24; announced: 2016-09-06.
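As a quick sanity check on the figures above (this check is ours, not part of the grant writeup), the budget components listed on the grant page should sum to the $1,035,000 grant amount:

```python
# Budget components from the Open Philanthropy grant page (our own
# arithmetic check; the component figures are from the writeup above).
year1 = 360_000 + 100_000 + 120_000 + 47_500  # improvements, scholarships, EuroSPARC, half-salary
year2 = 360_000 + 47_500                      # improvements, half-salary
total = year1 + year2
print(total)  # 1035000, matching the $1,035,000 grant amount
```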
Open Philanthropy | Center for Applied Rationality | 304,000.00 | 2016-05 | Epistemic institutions/effective altruism/movement growth | https://www.openphilanthropy.org/giving/grants/center-applied-rationality-sparc

Donation process: According to https://www.openphilanthropy.org/giving/grants/center-applied-rationality-sparc#Our_process "Nick Beckstead, our Program Officer for Scientific Research, spoke with members of SPARC’s staff regarding its program, finances, and future plans."

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant of $137,000 in 2016 and $167,000 in 2017 to support the Summer Program on Applied Rationality and Cognition (SPARC). "SPARC is a two-week summer program for high school students. Students selected to participate in the program typically show exceptional ability in mathematics, with many scoring highly among US participants in national or international math competitions."

Donor reason for selecting the donee: According to https://www.openphilanthropy.org/giving/grants/center-applied-rationality-sparc#Case_for_the_grant "we believe the program is strong, with the potential to have a substantial impact. [...] SPARC attracts unusually talented students. [...] we think very highly of several of the instructors who work at SPARC, some of whom also show strong interest in effective altruism."

Donor reason for donating that amount (rather than a bigger or smaller amount): According to https://www.openphilanthropy.org/giving/grants/center-applied-rationality-sparc#Budget_and_room_for_more_funding "SPARC’s total budget was approximately $90,000 in 2015. This grant will allow it to cover alumni events, travel reimbursement, unexpected contingencies, and some of the expenses associated with hiring a full-time logistics manager, as well as half of the salary and benefits for the new logistics manager, with the other half paid out of CFAR’s general budget. Our understanding is that the two years of support provided by this grant will be sufficient to enable SPARC to hire the new logistics manager and that a third year of support would not materially affect SPARC’s planning."

Donor reason for donating at this time (rather than earlier or later): Grant made shortly before SPARC 2016, with the timing likely chosen so that the grant could be used for SPARC 2016.
Intended funding timeframe in months: 24

Donor thoughts on making further donations to the donee: The grant page section https://www.openphilanthropy.org/giving/grants/center-applied-rationality-sparc#Key_questions_for_follow-up lists follow-up questions that Open Phil is interested in understanding better for the future.

Other notes: Announced: 2016-07-07.
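The two yearly disbursements stated in the intended use of funds should add up to the grant total; a one-line check (ours, not from the grant page) confirms this:

```python
# Yearly disbursements from the grant writeup above (our own check).
sparc_2016 = 137_000
sparc_2017 = 167_000
print(sparc_2016 + sparc_2017)  # 304000, matching the $304,000 grant amount
```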
EA Giving Group | Centre for Effective Altruism | -- | 2016 | Effective altruism/movement growth | https://docs.google.com/spreadsheets/d/1H2hF3SaO0_QViYq2j1E7mwoz3sDjdf9pdtuBcrq7pRU/edit Actual date range: February 2016 to December 2016. Exact date, amount, or fraction not known, but it is the donee with the highest amount donated out of three donees in this period. Money to be used at the discretion of William MacAskill.
EA Giving Group | Charity Science Health | -- | 2016 | Global health/vaccination/SMS reminders | https://docs.google.com/spreadsheets/d/1H2hF3SaO0_QViYq2j1E7mwoz3sDjdf9pdtuBcrq7pRU/edit Actual date range: February 2016 to December 2016. Exact date, amount, or fraction not known, but it is the donee with the lowest amount donated out of three donees in this period.
EA Giving Group | 80,000 Hours | -- | 2016 | Effective altruism/movement growth/career counseling | https://docs.google.com/spreadsheets/d/1H2hF3SaO0_QViYq2j1E7mwoz3sDjdf9pdtuBcrq7pRU/edit Actual date range: February 2016 to December 2016. Exact date, amount, or fraction not known, but it is the donee with the second highest amount donated out of three donees in this period.
EA Giving Group | Charity Entrepreneurship | -- | 2016 | Effective altruism/movement growth | https://docs.google.com/spreadsheets/d/1H2hF3SaO0_QViYq2j1E7mwoz3sDjdf9pdtuBcrq7pRU/edit Actual date range: December 2015 to February 2016. Exact date, amount, or fraction not known, but it is the donee with the lowest amount donated out of six donees in this period.
EA Giving Group | Founders Pledge | -- | 2016 | Effective altruism/movement growth | https://docs.google.com/spreadsheets/d/1H2hF3SaO0_QViYq2j1E7mwoz3sDjdf9pdtuBcrq7pRU/edit Actual date range: December 2015 to February 2016. Exact date, amount, or fraction not known, but it is the donee with the highest amount donated out of six donees in this period.
EA Giving Group | Center for Applied Rationality | -- | 2016 | Epistemic institutions | https://docs.google.com/spreadsheets/d/1H2hF3SaO0_QViYq2j1E7mwoz3sDjdf9pdtuBcrq7pRU/edit Actual date range: December 2015 to February 2016. Exact date, amount, or fraction not known, but it is the donee with the fifth highest amount donated out of six donees in this period.
EA Giving Group | Future of Life Institute | -- | 2016 | Global catastrophic risks | https://docs.google.com/spreadsheets/d/1H2hF3SaO0_QViYq2j1E7mwoz3sDjdf9pdtuBcrq7pRU/edit Actual date range: December 2015 to February 2016. Exact date, amount, or fraction not known, but it is the donee with the fourth highest amount donated out of six donees in this period.
EA Giving Group | 80,000 Hours | -- | 2016 | Effective altruism/movement growth/career counseling | https://docs.google.com/spreadsheets/d/1H2hF3SaO0_QViYq2j1E7mwoz3sDjdf9pdtuBcrq7pRU/edit Actual date range: December 2015 to February 2016. Exact date, amount, or fraction not known, but it is the donee with the second highest amount donated out of six donees in this period.
EA Giving Group | Centre for Effective Altruism | -- | 2016 | Effective altruism/movement growth | https://docs.google.com/spreadsheets/d/1H2hF3SaO0_QViYq2j1E7mwoz3sDjdf9pdtuBcrq7pRU/edit Actual date range: December 2015 to February 2016. Exact date, amount, or fraction not known, but it is the donee with the third highest amount donated out of six donees in this period. The original grant was to Effective Altruism Outreach, which is now a part of CEA.
EA Giving Group | Centre for Effective Altruism | -- | 2015 | Effective altruism/movement growth | https://docs.google.com/spreadsheets/d/1H2hF3SaO0_QViYq2j1E7mwoz3sDjdf9pdtuBcrq7pRU/edit Actual date range: December 2014 to December 2015. Exact date, amount, or fraction not known, but it is the donee with the highest amount donated out of eight donees in this period. The original grant was to Effective Altruism Outreach, which is now a part of the Centre for Effective Altruism.
EA Giving Group | Effective Altruism Policy Analytics | -- | 2015 | Effective altruism/movement growth | https://docs.google.com/spreadsheets/d/1H2hF3SaO0_QViYq2j1E7mwoz3sDjdf9pdtuBcrq7pRU/edit Actual date range: December 2014 to December 2015. Exact date, amount, or fraction not known, but it is the donee with the lowest amount donated out of eight donees in this period.
EA Giving Group | Future of Life Institute | -- | 2015 | Global catastrophic risks | https://docs.google.com/spreadsheets/d/1H2hF3SaO0_QViYq2j1E7mwoz3sDjdf9pdtuBcrq7pRU/edit Actual date range: December 2014 to December 2015. Exact date, amount, or fraction not known, but it is the donee with the second highest amount donated out of eight donees in this period.
EA Giving Group | Charity Science | -- | 2015 | Effective altruism/movement growth | https://docs.google.com/spreadsheets/d/1H2hF3SaO0_QViYq2j1E7mwoz3sDjdf9pdtuBcrq7pRU/edit Actual date range: December 2014 to December 2015. Exact date, amount, or fraction not known, but it is the donee with the third highest amount donated out of eight donees in this period.
EA Giving Group | Founders Pledge | -- | 2015 | Effective altruism/movement growth | https://docs.google.com/spreadsheets/d/1H2hF3SaO0_QViYq2j1E7mwoz3sDjdf9pdtuBcrq7pRU/edit Actual date range: December 2014 to December 2015. Exact date, amount, or fraction not known, but it is the donee with the fifth highest amount donated out of eight donees in this period.
EA Giving Group | Rethink Charity | -- | 2015 | Effective altruism/movement growth | https://docs.google.com/spreadsheets/d/1H2hF3SaO0_QViYq2j1E7mwoz3sDjdf9pdtuBcrq7pRU/edit Actual date range: December 2014 to December 2015. Exact date, amount, or fraction not known, but it is the donee with the sixth highest amount donated out of eight donees in this period. Grant originally to the Local Effective Altruism Network (LEAN), which is now part of Rethink Charity.
EA Giving Group | Centre for Effective Altruism | -- | 2015 | Effective altruism/movement growth | https://docs.google.com/spreadsheets/d/1H2hF3SaO0_QViYq2j1E7mwoz3sDjdf9pdtuBcrq7pRU/edit Actual date range: December 2014 to December 2015. Exact date, amount, or fraction not known, but it is the donee with the seventh highest amount donated out of eight donees in this period. Money intended for regranting to local student groups; CEA is the intermediary for financing.
EA Giving Group | Charity Science Outreach | -- | 2014 | Epistemic institutions | https://docs.google.com/spreadsheets/d/1H2hF3SaO0_QViYq2j1E7mwoz3sDjdf9pdtuBcrq7pRU/edit Actual date range: December 2013 to December 2014. Exact date, amount, or fraction not known, but it is the donee with the lowest amount donated out of five donees in this period.
EA Giving Group | Centre for the Study of Existential Risk | -- | 2014 | Epistemic institutions | https://docs.google.com/spreadsheets/d/1H2hF3SaO0_QViYq2j1E7mwoz3sDjdf9pdtuBcrq7pRU/edit Actual date range: December 2013 to December 2014. Exact date, amount, or fraction not known, but it is the donee with the fourth highest amount donated out of five donees in this period.
EA Giving Group | Center for Applied Rationality | -- | 2014 | Epistemic institutions | https://docs.google.com/spreadsheets/d/1H2hF3SaO0_QViYq2j1E7mwoz3sDjdf9pdtuBcrq7pRU/edit Actual date range: December 2013 to December 2014. Exact date, amount, or fraction not known, but it is the donee with the second highest amount donated out of five donees in this period.
EA Giving Group | 80,000 Hours | -- | 2014 | Effective altruism/movement growth/career counseling | https://docs.google.com/spreadsheets/d/1H2hF3SaO0_QViYq2j1E7mwoz3sDjdf9pdtuBcrq7pRU/edit Actual date range: December 2013 to December 2014. Exact date, amount, or fraction not known, but it is the donee with the highest amount donated out of five donees in this period.

Donation amounts by donee and year

Donee | Donors influenced | Cause area | Metadata | Total | 2021 | 2019 | 2018 | 2017 | 2016
Future of Humanity Institute | Open Philanthropy | Global catastrophic risks/AI safety/Biosecurity and pandemic preparedness | FB Tw WP Site TW | 12,066,808.93 | 0.00 | 0.00 | 12,066,808.93 | 0.00 | 0.00
Center for Human-Compatible AI | Open Philanthropy | AI safety | WP Site TW | 11,355,246.00 | 11,355,246.00 | 0.00 | 0.00 | 0.00 | 0.00
Redwood Research | Open Philanthropy | -- | -- | 9,420,000.00 | 9,420,000.00 | 0.00 | 0.00 | 0.00 | 0.00
Centre for Effective Altruism | EA Giving Group; Effective Altruism Funds: Effective Altruism Infrastructure Fund; Effective Altruism Funds: Long-Term Future Fund; Open Philanthropy | Effective altruism/movement growth | FB Site | 5,406,598.00 | 0.00 | 0.00 | 2,906,598.00 | 2,500,000.00 | 0.00
Machine Intelligence Research Institute | Effective Altruism Funds: Long-Term Future Fund; Open Philanthropy | AI safety | FB Tw WP Site CN GS TW | 4,238,994.00 | 0.00 | 0.00 | 488,994.00 | 3,750,000.00 | 0.00
Founders Pledge | EA Giving Group; Effective Altruism Funds: Effective Altruism Infrastructure Fund; Open Philanthropy | Effective altruism/donor pledges | FB Tw WP Site | 3,616,592.00 | 0.00 | 0.00 | 3,616,592.00 | 0.00 | 0.00
Center for Applied Rationality | EA Giving Group; Effective Altruism Funds: Long-Term Future Fund; Open Philanthropy | Rationality | FB Tw WP Site TW | 3,413,021.00 | 0.00 | 0.00 | 1,734,021.00 | 340,000.00 | 1,339,000.00
Global Priorities Institute | Open Philanthropy | Cause prioritization | Site | 2,674,284.00 | 0.00 | 0.00 | 2,674,284.00 | 0.00 | 0.00
80,000 Hours | EA Giving Group; Effective Altruism Funds: Effective Altruism Infrastructure Fund; Effective Altruism Funds: Long-Term Future Fund; Open Philanthropy | Career coaching/life guidance | FB Tw WP Site | 1,802,268.00 | 0.00 | 0.00 | 677,268.00 | 1,125,000.00 | 0.00
Effective Altruism Foundation | Open Philanthropy | Effective altruism/movement growth | FB Tw Site | 1,000,000.00 | 0.00 | 1,000,000.00 | 0.00 | 0.00 | 0.00
University of Tübingen | Open Philanthropy | -- | -- | 890,000.00 | 890,000.00 | 0.00 | 0.00 | 0.00 | 0.00
University of Oxford | Open Philanthropy | -- | FB Tw WP Site | 429,770.00 | 0.00 | 0.00 | 429,770.00 | 0.00 | 0.00
Future of Life Institute | EA Giving Group; Open Philanthropy | AI safety/other global catastrophic risks | FB Tw WP Site | 350,000.00 | 0.00 | 0.00 | 250,000.00 | 100,000.00 | 0.00
University of Southern California | Open Philanthropy | -- | FB Tw WP Site | 320,000.00 | 320,000.00 | 0.00 | 0.00 | 0.00 | 0.00
Yale University | Open Philanthropy | -- | FB Tw WP Site | 299,320.00 | 0.00 | 0.00 | 0.00 | 299,320.00 | 0.00
University of California, Santa Cruz | Open Philanthropy | -- | -- | 265,000.00 | 265,000.00 | 0.00 | 0.00 | 0.00 | 0.00
Daniel Dewey | Open Philanthropy | -- | -- | 175,000.00 | 175,000.00 | 0.00 | 0.00 | 0.00 | 0.00
Future of Research | Open Philanthropy | -- | -- | 150,000.00 | 0.00 | 0.00 | 0.00 | 150,000.00 | 0.00
Effective Altruism Sweden | Effective Altruism Funds: Effective Altruism Infrastructure Fund | -- | -- | 83,264.00 | 0.00 | 0.00 | 83,264.00 | 0.00 | 0.00
Brian Christian | Open Philanthropy | -- | -- | 66,000.00 | 66,000.00 | 0.00 | 0.00 | 0.00 | 0.00
Berkeley Existential Risk Initiative | EA Giving Group; Effective Altruism Funds: Long-Term Future Fund | AI safety/other global catastrophic risks | Site TW | 50,000.00 | 0.00 | 0.00 | 0.00 | 50,000.00 | 0.00
EA Giving Group | Ben Kuhn | Existential risk/far future/trajectory improvement | Site | 20,000.00 | 0.00 | 0.00 | 0.00 | 0.00 | 20,000.00
Carnegie Endowment for International Peace | Jacob Steinhardt | -- | FB Tw WP Site | 500.00 | 0.00 | 0.00 | 0.00 | 0.00 | 500.00
Centre for the Study of Existential Risk | EA Giving Group | -- | -- | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00
Charity Entrepreneurship | EA Giving Group | -- | -- | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00
Charity Science | EA Giving Group | Effective altruism/movement growth | FB Tw Site | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00
Charity Science Health | EA Giving Group | -- | -- | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00
Charity Science Outreach | EA Giving Group | -- | -- | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00
Effective Altruism Policy Analytics | EA Giving Group | -- | -- | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00
Rethink Charity | EA Giving Group | Effective altruism/Community project coordination | FB Site | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00
Total | -- | -- | -- | 58,092,665.93 | 22,491,246.00 | 1,000,000.00 | 24,927,599.93 | 8,314,320.00 | 1,359,500.00

Graph of spending by donee and year (incremental, not cumulative)

Graph of spending by donee and year (cumulative)

Donation amounts by donor and year for influencer Nick Beckstead

Donor | Donees | Total | 2021 | 2019 | 2018 | 2017 | 2016
Open Philanthropy | 80,000 Hours; Brian Christian; Center for Applied Rationality; Center for Human-Compatible AI; Centre for Effective Altruism; Daniel Dewey; Effective Altruism Foundation; Founders Pledge; Future of Humanity Institute; Future of Life Institute; Future of Research; Global Priorities Institute; Machine Intelligence Research Institute; Redwood Research; University of California, Santa Cruz; University of Oxford; University of Southern California; University of Tübingen; Yale University | 56,496,081.93 | 22,491,246.00 | 1,000,000.00 | 23,401,515.93 | 8,264,320.00 | 1,339,000.00
Effective Altruism Funds: Long-Term Future Fund | 80,000 Hours; Berkeley Existential Risk Initiative; Center for Applied Rationality; Centre for Effective Altruism; Machine Intelligence Research Institute | 931,840.02 | 0.00 | 0.00 | 917,002.00 | 14,838.02 | 0.00
Effective Altruism Funds: Effective Altruism Infrastructure Fund | 80,000 Hours; Centre for Effective Altruism; Effective Altruism Sweden; Founders Pledge | 609,082.00 | 0.00 | 0.00 | 609,082.00 | 0.00 | 0.00
EA Giving Group | 80,000 Hours; Berkeley Existential Risk Initiative; Center for Applied Rationality; Centre for Effective Altruism; Centre for the Study of Existential Risk; Charity Entrepreneurship; Charity Science; Charity Science Health; Charity Science Outreach; Effective Altruism Policy Analytics; Founders Pledge; Future of Life Institute; Rethink Charity | 35,161.98 | 0.00 | 0.00 | 0.00 | 35,161.98 | 0.00
Ben Kuhn | EA Giving Group | 20,000.00 | 0.00 | 0.00 | 0.00 | 0.00 | 20,000.00
Jacob Steinhardt | Carnegie Endowment for International Peace | 500.00 | 0.00 | 0.00 | 0.00 | 0.00 | 500.00
Total | -- | 58,092,665.93 | 22,491,246.00 | 1,000,000.00 | 24,927,599.93 | 8,314,320.00 | 1,359,500.00
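As a consistency check on the table above (the check is ours, not part of the portal's data), the per-donor totals should sum to the grand total of 58,092,665.93; Decimal avoids floating-point rounding in the currency arithmetic:

```python
from decimal import Decimal

# Per-donor totals from the table above (our own consistency check).
donor_totals = [
    Decimal("56496081.93"),  # Open Philanthropy
    Decimal("931840.02"),    # EA Funds: Long-Term Future Fund
    Decimal("609082.00"),    # EA Funds: EA Infrastructure Fund
    Decimal("35161.98"),     # EA Giving Group
    Decimal("20000.00"),     # Ben Kuhn
    Decimal("500.00"),       # Jacob Steinhardt
]
print(sum(donor_totals))  # 58092665.93, matching the grand total
```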

Graph of spending by donor and year (incremental, not cumulative)

Graph of spending by donor and year (cumulative)