FTX Future Fund donations made

This is an online portal with information on donations of interest to Vipul Naik that were announced publicly (or have been shared with permission). The git repository with the code for this portal, as well as all the underlying data, is available on GitHub. All payment amounts are in current United States dollars (USD). The repository of donations is being seeded with an initial collation by Issa Rice, along with continued contributions from him (see his commits and the contract work page listing all financially compensated contributions to the site), but all responsibility for errors and inaccuracies belongs to Vipul Naik. Current data is preliminary and has not been completely vetted and normalized; if sharing a link to this site or any page on it, please include the caveat that the data is preliminary (to share without caveats, please check with Vipul Naik). We expect to complete the first round of development by the end of July 2026. See the about page for more details. Also of interest: pageview data on analytics.vipulnaik.com, the tutorial in the README, and the request for feedback on the EA Forum.

Table of contents

Basic donor information

We do not have any donor information for the donor FTX Future Fund in our system.

Donor donation statistics

Cause area Count Median Mean Minimum 10th percentile 20th percentile 30th percentile 40th percentile 50th percentile 60th percentile 70th percentile 80th percentile 90th percentile Maximum
Overall 75 336,000 1,109,348 7,500 100,000 140,000 200,000 250,000 336,000 470,000 600,000 900,000 2,000,000 15,000,000
Global catastrophic risks|Economic growth 1 7,500 7,500 7,500 7,500 7,500 7,500 7,500 7,500 7,500 7,500 7,500 7,500 7,500
AI safety 19 250,000 566,316 30,000 40,000 95,000 100,000 155,000 250,000 290,000 380,000 600,000 1,500,000 5,000,000
Values and reflective processes 1 50,000 50,000 50,000 50,000 50,000 50,000 50,000 50,000 50,000 50,000 50,000 50,000 50,000
Global catastrophic risks|International relations|Space exploration 1 85,000 85,000 85,000 85,000 85,000 85,000 85,000 85,000 85,000 85,000 85,000 85,000 85,000
7 480,000 2,477,143 100,000 100,000 160,000 200,000 200,000 480,000 700,000 700,000 700,000 15,000,000 15,000,000
Effective altruism|AI safety|Biosecurity and pandemic preparedness|Climate change 1 135,000 135,000 135,000 135,000 135,000 135,000 135,000 135,000 135,000 135,000 135,000 135,000 135,000
Biosecurity and pandemic preparedness 16 480,000 1,385,000 135,000 140,000 250,000 350,000 470,000 480,000 500,000 1,000,000 1,200,000 3,600,000 10,000,000
Epistemic institutions 10 250,000 383,550 137,500 137,500 182,000 200,000 230,000 250,000 300,000 336,000 500,000 700,000 1,000,000
AI safety|Biosecurity and pandemic preparedness 2 190,000 255,000 190,000 190,000 190,000 190,000 190,000 190,000 320,000 320,000 320,000 320,000 320,000
Education|Talent pipeline 2 200,000 2,600,000 200,000 200,000 200,000 200,000 200,000 200,000 5,000,000 5,000,000 5,000,000 5,000,000 5,000,000
Effective altruism 7 700,000 2,505,714 250,000 250,000 350,000 400,000 400,000 700,000 900,000 900,000 1,000,000 13,940,000 13,940,000
International relations 2 250,000 319,040 250,000 250,000 250,000 250,000 250,000 250,000 388,080 388,080 388,080 388,080 388,080
Epistemic institutions|Politics 1 300,000 300,000 300,000 300,000 300,000 300,000 300,000 300,000 300,000 300,000 300,000 300,000 300,000
Talent pipeline 1 320,000 320,000 320,000 320,000 320,000 320,000 320,000 320,000 320,000 320,000 320,000 320,000 320,000
Economic growth 1 500,000 500,000 500,000 500,000 500,000 500,000 500,000 500,000 500,000 500,000 500,000 500,000 500,000
Global catastrophic risks|International relations 1 820,000 820,000 820,000 820,000 820,000 820,000 820,000 820,000 820,000 820,000 820,000 820,000 820,000
AI safety|Migration policy 1 1,000,000 1,000,000 1,000,000 1,000,000 1,000,000 1,000,000 1,000,000 1,000,000 1,000,000 1,000,000 1,000,000 1,000,000 1,000,000
Effective altruism|AI safety 1 2,000,000 2,000,000 2,000,000 2,000,000 2,000,000 2,000,000 2,000,000 2,000,000 2,000,000 2,000,000 2,000,000 2,000,000 2,000,000
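
The per-cause statistics above can be recomputed from the underlying donation list. Below is a minimal sketch using a hypothetical mini-dataset and a nearest-rank percentile method; the portal's actual dataset is larger (75 donations) and its interpolation method for percentiles may differ.

```python
from statistics import mean, median

# Hypothetical mini-dataset for illustration; the real table above
# aggregates 75 donations across many more cause areas.
donations = [
    ("AI safety", 30_000),
    ("AI safety", 250_000),
    ("AI safety", 5_000_000),
    ("Effective altruism", 700_000),
    ("Effective altruism", 13_940_000),
]

def percentile(sorted_amounts, p):
    # Nearest-rank percentile on an already-sorted list; the portal's
    # exact interpolation method may differ.
    idx = round(p / 100 * (len(sorted_amounts) - 1))
    return sorted_amounts[idx]

def stats_by_cause(donations):
    by_cause = {}
    for cause, amount in donations:
        by_cause.setdefault(cause, []).append(amount)
    rows = {}
    for cause, amounts in by_cause.items():
        amounts = sorted(amounts)
        rows[cause] = {
            "count": len(amounts),
            "median": median(amounts),
            "mean": mean(amounts),
            "min": amounts[0],
            "p90": percentile(amounts, 90),
            "max": amounts[-1],
        }
    return rows
```

On this mini-dataset, `stats_by_cause(donations)["AI safety"]` has count 3 and median 250,000, mirroring how each row of the table is built.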

Donation amounts by cause area and year

If you hover over a cell for a given cause area and year, you will get a tooltip with the number of donees and the number of donations.

Note: The cause area classification used here may not match the classification used by the donor in all cases.

Cause area Number of donations Number of donees Total 2022
Biosecurity and pandemic preparedness (filter this donor) 16 15 22,160,000.00 22,160,000.00
Effective altruism (filter this donor) 7 7 17,540,000.00 17,540,000.00
(filter this donor) 7 6 17,340,000.00 17,340,000.00
AI safety (filter this donor) 19 18 10,760,000.00 10,760,000.00
Education|Talent pipeline (filter this donor) 2 2 5,200,000.00 5,200,000.00
Epistemic institutions (filter this donor) 10 9 3,835,500.00 3,835,500.00
Effective altruism|AI safety (filter this donor) 1 1 2,000,000.00 2,000,000.00
AI safety|Migration policy (filter this donor) 1 1 1,000,000.00 1,000,000.00
Global catastrophic risks|International relations (filter this donor) 1 1 820,000.00 820,000.00
International relations (filter this donor) 2 2 638,080.00 638,080.00
AI safety|Biosecurity and pandemic preparedness (filter this donor) 2 2 510,000.00 510,000.00
Economic growth (filter this donor) 1 1 500,000.00 500,000.00
Talent pipeline (filter this donor) 1 1 320,000.00 320,000.00
Epistemic institutions|Politics (filter this donor) 1 1 300,000.00 300,000.00
Effective altruism|AI safety|Biosecurity and pandemic preparedness|Climate change (filter this donor) 1 1 135,000.00 135,000.00
Global catastrophic risks|International relations|Space exploration (filter this donor) 1 1 85,000.00 85,000.00
Values and reflective processes (filter this donor) 1 1 50,000.00 50,000.00
Global catastrophic risks|Economic growth (filter this donor) 1 1 7,500.00 7,500.00
Total 75 69 83,201,080.00 83,201,080.00

Skipping spending graph as there is at most one year’s worth of donations.

Donation amounts by subcause area and year

If you hover over a cell for a given subcause area and year, you will get a tooltip with the number of donees and the number of donations.

For the meaning of “classified” and “unclassified”, see the page clarifying this.

Subcause area Number of donations Number of donees Total 2022
Effective altruism 3 3 15,840,000.00 15,840,000.00
AI safety 15 14 10,115,000.00 10,115,000.00
Biosecurity and pandemic preparedness/COVID-19/COVID-19 vaccine 1 1 10,000,000.00 10,000,000.00
Education|Talent pipeline 2 2 5,200,000.00 5,200,000.00
Biosecurity and pandemic preparedness 7 6 4,920,000.00 4,920,000.00
Biosecurity and pandemic preparedness/vaccine development 2 2 4,600,000.00 4,600,000.00
Epistemic institutions/forecasting 7 6 3,248,000.00 3,248,000.00
Effective altruism|AI safety 1 1 2,000,000.00 2,000,000.00
Effective altruism/effective giving 2 2 1,050,000.00 1,050,000.00
AI safety|Migration policy/high-skilled migration 1 1 1,000,000.00 1,000,000.00
Biosecurity and pandemic preparedness/regulatory speedup 1 1 1,000,000.00 1,000,000.00
Global catastrophic risks|International relations 1 1 820,000.00 820,000.00
Biosecurity and pandemic preparedness/coordination efforts 2 2 750,000.00 750,000.00
Effective altruism/information corpus 2 2 650,000.00 650,000.00
International relations 2 2 638,080.00 638,080.00
Epistemic institutions 3 3 587,500.00 587,500.00
AI safety|Biosecurity and pandemic preparedness 2 2 510,000.00 510,000.00
Biosecurity and pandemic preparedness/personal protective equipment 1 1 500,000.00 500,000.00
Economic growth 1 1 500,000.00 500,000.00
AI safety/talent pipeline 3 3 395,000.00 395,000.00
Biosecurity and pandemic preparedness/disinfection 2 2 390,000.00 390,000.00
Talent pipeline 1 1 320,000.00 320,000.00
Epistemic institutions|Politics 1 1 300,000.00 300,000.00
AI safety/forecasting 1 1 250,000.00 250,000.00
Effective altruism|AI safety|Biosecurity and pandemic preparedness|Climate change 1 1 135,000.00 135,000.00
Global catastrophic risks|International relations|Space exploration/space governance 1 1 85,000.00 85,000.00
Values and reflective processes 1 1 50,000.00 50,000.00
Classified total 68 65 65,861,080.00 65,861,080.00
Unclassified total 7 6 17,340,000.00 17,340,000.00
Total 75 69 83,201,080.00 83,201,080.00
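
The classified/unclassified split is a simple partition on whether a donation carries a subcause-area tag. Below is a minimal sketch using a few amounts from this page but an illustrative record schema; the field names are assumptions, not the portal's actual data model.

```python
# Illustrative records using a few amounts from this page; a donation
# with no recorded subcause area counts toward "unclassified".
donations = [
    {"donee": "Ought", "subcause": "AI safety", "amount": 5_000_000.00},
    {"donee": "Longview Philanthropy", "subcause": None, "amount": 15_000_000.00},
    {"donee": "HelixNano", "subcause": None, "amount": 10_000_000.00},
]

classified_total = sum(d["amount"] for d in donations if d["subcause"] is not None)
unclassified_total = sum(d["amount"] for d in donations if d["subcause"] is None)
grand_total = classified_total + unclassified_total  # same logic as the Total row
```

The "Classified total" and "Unclassified total" rows above are exactly these two sums, and the "Total" row is their sum.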

Skipping spending graph as there is at most one year’s worth of donations.

Donation amounts by donee and year

Donee Cause area Metadata Total 2022
Longview Philanthropy (filter this donor) 15,000,000.00 15,000,000.00
Centre for Effective Altruism (filter this donor) Effective altruism/movement growth FB Site 13,940,000.00 13,940,000.00
HelixNano (filter this donor) 10,000,000.00 10,000,000.00
Ought (filter this donor) AI safety Site 5,000,000.00 5,000,000.00
The Atlas Fellowship (filter this donor) 5,000,000.00 5,000,000.00
AVECRIS (filter this donor) 3,600,000.00 3,600,000.00
Sherlock Biosciences (filter this donor) 2,000,000.00 2,000,000.00
Lightcone Infrastructure (filter this donor) Epistemic institutions FB WP Site 2,000,000.00 2,000,000.00
Cornell University (filter this donor) FB Tw WP Site 1,500,000.00 1,500,000.00
Manifold Markets (filter this donor) 1,500,000.00 1,500,000.00
University of California, Berkeley (filter this donor) FB Tw WP Site 1,400,000.00 1,400,000.00
Legal Priorities Project (filter this donor) 1,317,500.00 1,317,500.00
SecureBio (filter this donor) 1,200,000.00 1,200,000.00
Federation for American Scientists (filter this donor) 1,000,000.00 1,000,000.00
Confirm Solutions (filter this donor) 1,000,000.00 1,000,000.00
Piezo Therapeutics (filter this donor) 1,000,000.00 1,000,000.00
Non-trivial Pursuits (filter this donor) 1,000,000.00 1,000,000.00
Effective Ideas Blog Prize (filter this donor) 900,000.00 900,000.00
Simon Institute for Longterm Governance (filter this donor) 820,000.00 820,000.00
Giving What We Can (filter this donor) FB Tw WP Site 700,000.00 700,000.00
Sage (filter this donor) 700,000.00 700,000.00
Rethink Priorities (filter this donor) Cause prioritization Site 700,000.00 700,000.00
Institute for Progress (filter this donor) 615,000.00 615,000.00
Virginia Tech (filter this donor) 500,000.00 500,000.00
Brown University (filter this donor) FB Tw WP Site 500,000.00 500,000.00
ML Safety Scholars Program (filter this donor) 490,000.00 490,000.00
MITRE (filter this donor) 485,000.00 485,000.00
Charity Entrepreneurship (filter this donor) 470,000.00 470,000.00
Council on Strategic Risks (filter this donor) 400,000.00 400,000.00
Rational Animations (filter this donor) 400,000.00 400,000.00
Ohio State University (filter this donor) FB Tw WP Site 388,080.00 388,080.00
University of Cambridge (filter this donor) FB Tw WP Site 380,000.00 380,000.00
1Day Sooner (filter this donor) 350,000.00 350,000.00
High Impact Athletes (filter this donor) 350,000.00 350,000.00
Global Guessing (filter this donor) 336,000.00 336,000.00
Association for Long Term Existence and Resilience (filter this donor) 320,000.00 320,000.00
High Impact Professionals (filter this donor) 320,000.00 320,000.00
Brian Christian (filter this donor) 300,000.00 300,000.00
Good Judgment Project (filter this donor) 300,000.00 300,000.00
The Center for Election Science (filter this donor) 300,000.00 300,000.00
AI Safety Camp (filter this donor) 290,000.00 290,000.00
University of Utah (filter this donor) FB Tw WP Site 280,000.00 280,000.00
Berkeley Existential Risk Initiative (filter this donor) AI safety/other global catastrophic risks Site TW 255,000.00 255,000.00
AI Impacts (filter this donor) AI safety Site 250,000.00 250,000.00
University of Ottawa (filter this donor) WP Site 250,000.00 250,000.00
Nonlinear (filter this donor) 250,000.00 250,000.00
Apollo Academic Surveys (filter this donor) 250,000.00 250,000.00
Stimson South Asia Program (filter this donor) 250,000.00 250,000.00
Peter Hrosso (filter this donor) 230,000.00 230,000.00
AI Safety Support (filter this donor) 200,000.00 200,000.00
Quantified Uncertainty Research Institute (filter this donor) 200,000.00 200,000.00
Anysphere (filter this donor) 200,000.00 200,000.00
Rajalakshmi Children Foundation (filter this donor) 200,000.00 200,000.00
James Lin (filter this donor) 190,000.00 190,000.00
Nathan Young (filter this donor) 182,000.00 182,000.00
Cecil Abungu (filter this donor) 160,000.00 160,000.00
Moncef Slaoui (filter this donor) 150,000.00 150,000.00
Justin Mares (filter this donor) 140,000.00 140,000.00
EffiSciences (filter this donor) 135,000.00 135,000.00
Siddharth Hiregowdara (filter this donor) 100,000.00 100,000.00
Columbia University (filter this donor) FB Tw WP Site 100,000.00 100,000.00
Prometheus Science Bowl (filter this donor) 100,000.00 100,000.00
Apart Research (filter this donor) 95,000.00 95,000.00
Konstantinos Konstantinidis (filter this donor) 85,000.00 85,000.00
Trojan Detection Challenge at NeurIPS 2022 (filter this donor) 50,000.00 50,000.00
Pathos Labs (filter this donor) 50,000.00 50,000.00
AI Risk Public Materials Competition (filter this donor) 40,000.00 40,000.00
Evan R. Murphy (filter this donor) 30,000.00 30,000.00
Maxwell Tabarrok (filter this donor) 7,500.00 7,500.00
Total -- -- 83,201,080.00 83,201,080.00

Skipping spending graph as there is at most one year’s worth of donations.

Donation amounts by influencer and year

If you hover over a cell for a given influencer and year, you will get a tooltip with the number of donees and the number of donations.

For the meaning of “classified” and “unclassified”, see the page clarifying this.

Influencer Number of donations Number of donees Total 2022
Longview Philanthropy 1 1 900,000.00 900,000.00
Classified total 1 1 900,000.00 900,000.00
Unclassified total 74 68 82,301,080.00 82,301,080.00
Total 75 69 83,201,080.00 83,201,080.00

Skipping spending graph as there is at most one year’s worth of donations.

Donation amounts by disclosures and year

Sorry, we couldn't find any disclosures information.

Donation amounts by country and year

If you hover over a cell for a given country and year, you will get a tooltip with the number of donees and the number of donations.

For the meaning of “classified” and “unclassified”, see the page clarifying this.

Country Number of donations Number of donees Total 2022
India|Pakistan 1 1 250,000.00 250,000.00
India 1 1 200,000.00 200,000.00
France 1 1 135,000.00 135,000.00
Classified total 3 3 585,000.00 585,000.00
Unclassified total 72 66 82,616,080.00 82,616,080.00
Total 75 69 83,201,080.00 83,201,080.00

Skipping spending graph as there is at most one year’s worth of donations.

Full list of documents in reverse chronological order (9 documents)

Title (URL linked) | Publication date | Author | Publisher | Affected donors | Affected donees | Affected influencers | Document scope | Cause area | Notes
Announcing the Future Fund’s AI Worldview Prize | 2022-09-23 | Nick Beckstead, Leopold Aschenbrenner, Avital Balwit, William MacAskill, Ketan Ramakrishnan | FTX Future Fund | FTX Future Fund | Request for critiques of donor strategy | AI safety | In this post, cross-posted to the EA Forum as https://forum.effectivealtruism.org/posts/W7C5hwq7sjdpTdrQF/announcing-the-future-fund-s-ai-worldview-prize (GW, IR), the Future Fund team announces its Worldview Prize, which seeks content that would cause the FTX Future Fund to significantly update its views regarding AI timelines and its perspective on how to approach AI safety.
Fireside chat | Nick Beckstead | EA Global: SF 22 | 2022-08-22 | Nick Beckstead | EA Global | FTX Future Fund | Broad donor strategy | In this fireside chat, Nick Beckstead talks about the work of the FTX Future Fund. He reiterates points made in various blog posts by FTX earlier in the year. He talks about how FTX has tried to scale up giving rapidly. He says he's happy with how much FTX has been able to do relative to its team size, but that money moved per employee is not the metric to optimize for; what the money achieves is what's important. He talks about the regranting program and the open call for applications. Later in the talk, he reminisces on his Ph.D. thesis and how his thinking has evolved since then.
Future Fund June 2022 Update | 2022-06-30 | Nick Beckstead, Leopold Aschenbrenner, Avital Balwit, William MacAskill, Ketan Ramakrishnan | FTX Future Fund | FTX Future Fund | Manifold Markets, ML Safety Scholars Program, Andi Peng, Braden Leach, Thomas Kwa, SecureBio, Ray Amjad, Apollo Academic Surveys, Justin Mares, Longview Philanthropy, Atlas Fellowship, Effective Ideas Blog Prize, Ought, Swift Centre for Applied Forecasting, Federation for American Scientists, Public Editor Project, Quantified Uncertainty Research Institute, Moncef Slaoui, AI Impacts, EA Critiques and Red Teaming Prize | Broad donor strategy | Longtermism|AI safety|Biosecurity and pandemic preparedness|Effective altruism | This lengthy blog post, cross-posted to the Effective Altruism Forum at https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update (GW, IR), goes into detail regarding the grantmaking of the FTX Future Fund so far, and learnings from this grantmaking. The post reports having made 262 grants and investments, with $132 million in total spend. Three funding models are in use: regranting ($31 million so far), open call ($26 million so far), and staff-led grantmaking ($73 million so far).
Some clarifications on the Future Fund's approach to grantmaking (GW, IR) | 2022-05-09 | Nick Beckstead | Effective Altruism Forum | FTX Future Fund | Broad donor strategy | Longtermism | This blog post is written partly in response to the concerns raised in https://forum.effectivealtruism.org/posts/HWaH8tNdsgEwNZu8B/free-spending-ea-might-be-a-big-problem-for-optics-and (GW, IR). The post clarifies the processes and safeguards the FTX Future Fund has in place for reviewing grants, and explains how FTX is able to manage a large grant volume with a small team: mainly by relying on regranting. The post also clarifies that FTX has not granted much for community building, so some of the concerns specifically related to community-building grants don't apply quite yet.
FTX Future Fund and Longtermism | 2022-03-17 | Rhys Lindmark | Open Philanthropy, FTX Future Fund | Miscellaneous commentary | Longtermism|Global health and development | This blog post, cross-posted to the EA Forum at https://forum.effectivealtruism.org/posts/fDLmDe8HQq2ueCxk6/ftx-future-fund-and-longtermism (GW, IR), is written in a fun format, including charts and memes. The post talks about the change to the EA funding landscape with the arrival of the FTX Future Fund, including both an increase in the amount of funding and the shift toward longtermism.
Some thoughts on recent Effective Altruism funding announcements. It's been a big week in Effective Altruism | 2022-03-03 | James Ozden | Open Philanthropy, FTX Future Fund, FTX Community Fund, FTX Climate Fund | Mercy For Animals, Charity Entrepreneurship | Miscellaneous commentary | Longtermism|Animal welfare|Global health and development|AI safety|Climate change | In this blog post, cross-posted to the EA Forum at https://forum.effectivealtruism.org/posts/Wpr5ssnNW5JPDDPvd/some-thoughts-on-recent-effective-altruism-funding (GW, IR), James Ozden discusses recent increases in funding by donors aligned with effective altruism (EA) and makes forecasts for the amount of annual money moved by 2025. Highlights of the post: 1. The entry of the FTX Future Fund is expected to increase the proportion of funds allocated to longtermist causes, bringing it more in line with what EA leaders think it should be (based on the data compiled at https://80000hours.org/2021/08/effective-altruism-allocation-resources-cause-areas/). 2. Grantmaking capacity needs to be scaled up to match the increase in available funds. 3. The EA movement may need to shift from marginal thinking to coordination dynamics, as its funding amounts are no longer as marginal. 4. Entrepreneurs, founders, and incubators are needed. 6. We need to be more ambitious.
Announcing the Future Fund | 2022-02-28 | Nick Beckstead, Leopold Aschenbrenner, Avital Balwit, William MacAskill, Ketan Ramakrishnan | FTX Future Fund | FTX Future Fund | Request for proposals | Longtermism | In this blog post, cross-posted to the EA Forum at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR), the FTX Future Fund announces an open call for applications. It links to https://ftxfuturefund.org/projects/ for the list of project ideas that they are interested in funding, and to https://ftxfuturefund.org/our-2022-plans/ for more information on the 2022 plans. The version on the EA Forum at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) includes the content of both posts from the FTX website.
Announcing Our Regranting Program | 2022-02-28 | Nick Beckstead, Leopold Aschenbrenner | FTX Future Fund | FTX Future Fund | Request for proposals | Longtermism | The FTX Future Fund announces its regranting program, which "will offer discretionary budgets to independent part-time grantmakers, to be spent in the next ~6 months. Budgets will typically be in the $250k-few million range. We've already invited a first cohort of 21 regrantors to test the program." The post also invites people to apply to be regrantors or to recommend others as regrantors.
2021 AI Alignment Literature Review and Charity Comparison (GW, IR) | 2021-12-23 | Larks | Effective Altruism Forum | Larks, Effective Altruism Funds: Long-Term Future Fund, Survival and Flourishing Fund, FTX Future Fund | Future of Humanity Institute, Centre for the Governance of AI, Center for Human-Compatible AI, Machine Intelligence Research Institute, Global Catastrophic Risk Institute, Centre for the Study of Existential Risk, OpenAI, Google Deepmind, Anthropic, Alignment Research Center, Redwood Research, Ought, AI Impacts, Global Priorities Institute, Center on Long-Term Risk, Centre for Long-Term Resilience, Rethink Priorities, Convergence Analysis, Stanford Existential Risk Initiative, Effective Altruism Funds: Long-Term Future Fund, Berkeley Existential Risk Initiative, 80,000 Hours, Survival and Flourishing Fund | Review of current state of cause area | AI safety | Cross-posted to LessWrong at https://www.lesswrong.com/posts/C4tR3BEpuWviT7Sje/2021-ai-alignment-literature-review-and-charity-comparison (GW, IR). This is the sixth post in a tradition of annual blog posts on the state of AI safety and the work of various organizations in the space over the course of the year; the post is structured similarly to the previous year's post https://forum.effectivealtruism.org/posts/K7Z87me338BQT3Mcv/2020-ai-alignment-literature-review-and-charity-comparison (GW, IR) but has a few new features. The author mentions that he has several conflicts of interest that he cannot individually disclose. He also starts collecting "second preferences" data this year from all the organizations he talks to: where each organization would like to see funds go, other than itself. The Long-Term Future Fund is the clear winner here. He also announces that he's looking for a research assistant to help with next year's post, given the increasing time demands and his reduced time availability. His final rot13'ed donation decision is to donate to the Long-Term Future Fund so that sufficiently skilled AI safety researchers can make a career with LTFF funding; his second preference for donations is BERI. Many other organizations that he considers likely to be doing excellent work are either already well funded or do not provide sufficient disclosure.

Full list of donations in reverse chronological order (75 donations)

Graph of top 10 donees (for donations with known year of donation) by amount, showing the timeframe of donations

Graph of donations and their timeframes
Donee | Amount (current USD) | Amount rank (out of 75) | Donation date | Cause area | URL | Influencer | Notes
Evan R. Murphy | 30,000.00 | 74 | 2022-07 | AI safety | https://ftxfuturefund.org/our-regrants/ | -- | Donation process: The grant is made as part of the Future Fund's regranting program. See https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Regranting_program_in_more_detail (GW, IR) for more detail on the regranting program.

Intended use of funds (category): Living expenses during project

Intended use of funds: Grant to "support six months of independent research on interpretability and other AI safety topics."
Legal Priorities Project | 700,000.00 | 19 | 2022-06 | -- | https://ftxfuturefund.org/our-grants/?_funding_stream=ad-hoc | -- | Donation process: This grant is part of staff-led grantmaking by FTX Future Fund. https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Staff_led_grantmaking_in_more_detail (GW, IR) says: "Unlike the open call and regranting, these grants and investments are not a test of a particular potentially highly scalable funding model. These are projects we funded because we became aware of them and thought they were good ideas."

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support one year of operating expenses and salaries at the Legal Priorities Project, a longtermist legal research and field-building organization."

Other notes: This is the second grant from FTX Future Fund to Legal Priorities Project; the preceding grant of $480,000 was in April 2022. Intended funding timeframe in months: 12.
Nathan Young | 182,000.00 | 57 | 2022-06 | Epistemic institutions/forecasting | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support the creation of a website for collaboratively creating public forecasting questions for a range of prediction aggregators and markets."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
1Day Sooner | 350,000.00 | 36 | 2022-06 | Biosecurity and pandemic preparedness/coordination efforts | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support 1DS’ work on pandemic preparedness, including advocacy for advance market purchase commitments, collaboration with the UK Pandemic Ethics Accelerator on challenge studies, and advocacy with 1Day Africa and the West African Health Organization for a global pandemic insurance fund."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made close to the application window for the open call (2022-02-28 to 2022-03-21).
AI Safety Camp | 290,000.00 | 44 | 2022-06 | AI safety | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "partially support the salaries for AI Safety Camp’s two directors and to support logistical expenses at its physical camp."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
University of California, Berkeley (Earmark: Sergey Levine) | 600,000.00 | 23 | 2022-06 | AI safety | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support a project to study how large language models integrated with offline reinforcement learning pose a risk of machine deception and persuasion."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
AI Impacts | 250,000.00 | 46 | 2022-06 | AI safety/forecasting | https://ftxfuturefund.org/our-regrants/ | -- | Donation process: The grant is made as part of the Future Fund's regranting program. See https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Regranting_program_in_more_detail (GW, IR) for more detail on the regranting program.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support rerunning the highly-cited survey “When Will AI Exceed Human Performance? Evidence from AI Experts” from 2016, analysis, and publication of results."
Justin Mares | 140,000.00 | 61 | 2022-05 | Biosecurity and pandemic preparedness/disinfection | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support research on the feasibility of inactivating viruses via electromagnetic radiation."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Global Guessing | 336,000.00 | 38 | 2022-05 | Epistemic institutions/forecasting | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support Global Guessing’s forecasting coverage on the Russian invasion of Ukraine, which they will also use to build tools and infrastructure to support future forecasting work."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Anysphere | 200,000.00 | 52 | 2022-05 | -- | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This investment is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Organizational general support

Intended use of funds: Investment to "build a communication platform that provably leaks zero metadata."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Council on Strategic Risks: $400,000.00 (amount rank 32), 2022-05, Biosecurity and pandemic preparedness/coordination efforts. Source: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: The grant description says: "We recommended a grant to support a project which will develop and advance ideas for strengthening regional and multilateral cooperation for addressing biological risks and filling gaps in current international institutions. These efforts include promoting the creation of a center with the capacity to rapidly respond to emerging infectious disease threats to prioritize blunting the impact of such events as well as quickly saving lives, and cooperative mechanisms to enhance biosafety and biosecurity while reducing the potential risks of spaces such as high-containment laboratories."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Confirm Solutions: $1,000,000.00 (amount rank 11), 2022-05, Biosecurity and pandemic preparedness/regulatory speedup. Source: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This investment (the grantee is a public-benefit corporation) is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Investment to "support development of statistical models and software tools that can automate parts of the regulatory process for complex clinical trials."

Donor reason for selecting the donee: The grant description says: "We anticipate that this work can help to speed up approvals of new vaccines and medical treatments while enhancing their statistical rigor."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
AVECRIS: $3,600,000.00 (amount rank 6), 2022-05, Biosecurity and pandemic preparedness/vaccine development. Source: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This investment is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Investment in AVECRIS’s "Project DOOR to support the development of a next generation genetic vaccine platform that aims to allow for highly distributed vaccine production using AVECRIS’s advanced DNA vector delivery technology."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Piezo Therapeutics: $1,000,000.00 (amount rank 11), 2022-05, Biosecurity and pandemic preparedness/vaccine development. Source: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This investment is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support work on technology for delivering mRNA vaccines without lipid nanoparticles with the aim of making vaccines more safe, affordable, and scalable."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Moncef Slaoui: $150,000.00 (amount rank 60), 2022-05, Biosecurity and pandemic preparedness. Source: https://ftxfuturefund.org/our-regrants/

Donation process: The grant is made as part of the Future Fund's regranting program. See https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Regranting_program_in_more_detail (GW, IR) for more detail on the regranting program.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "fund the writing of Slaoui's memoir, especially including his experience directing Operation Warp Speed."
Maxwell Tabarrok: $7,500.00 (amount rank 75), 2022-05, Global catastrophic risks|Economic growth. Source: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Living expenses during project

Intended use of funds: Grant to "support [Maxwell Tabarrok] to spend a summer at the Future of Humanity Institute at Oxford University researching differential tech development and the connection between existential risks to humanity and economic growth."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Intended funding timeframe in months: 3
Konstantinos Konstantinidis: $85,000.00 (amount rank 70), 2022-05, Global catastrophic risks|International relations|Space exploration/space governance. Source: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support two years of research on the impacts of disruptive space technologies, nuclear risk, and mitigating risks from future space-based weapons."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Intended funding timeframe in months: 24
Rational Animations: $400,000.00 (amount rank 32), 2022-05, Effective altruism/information corpus. Source: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Organizational general support

Intended use of funds: Grant to "support the creation of animated videos on topics related to rationality and effective altruism to explain these topics for a broader audience."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Giving What We Can: $700,000.00 (amount rank 19), 2022-05, Effective altruism/effective giving. Source: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Organizational general support

Intended use of funds: Grant to "support Giving What We Can’s mission to create a world in which giving effectively and significantly is a cultural norm."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Non-trivial Pursuits: $1,000,000.00 (amount rank 11), 2022-05, Effective altruism. Source: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Organizational general support

Intended use of funds: Grant to "support outreach to help students to learn about career options, develop their skills, and plan their careers to work on the world’s most pressing problems."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Quantified Uncertainty Research Institute: $200,000.00 (amount rank 52), 2022-05, Epistemic institutions. Source: https://ftxfuturefund.org/our-regrants/

Donation process: The grant is made as part of the Future Fund's regranting program. See https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Regranting_program_in_more_detail (GW, IR) for more detail on the regranting program.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support QURI to develop a programming language called 'Squiggle' as a tool for probabilistic estimation."

Donor reason for selecting the donee: The grant description says: "The hope is this will be a useful tool for forecasting and fermi estimates."
Legal Priorities Project: $137,500.00 (amount rank 62), 2022-05, Epistemic institutions. Source: https://ftxfuturefund.org/our-regrants/

Donation process: The grant is made as part of the Future Fund's regranting program. See https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Regranting_program_in_more_detail (GW, IR) for more detail on the regranting program.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support an essay contest on “Accounting for Existential Risks in US Cost-Benefit Analysis,” with the aim of contributing to the revision of OMB Circular-A4, a document which guides US government cost-benefit analysis. The Legal Priorities Project is administering the contest."

Other notes: The Future Fund makes two other grants to support Legal Priorities Project around the same time, but these grants are staff-led.
Sage: $700,000.00 (amount rank 19), 2022-05, Epistemic institutions/forecasting. Source: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support the creation of a pilot version of a forecasting platform, and a paid forecasting team, to make predictions about questions relevant to high-impact research."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Good Judgment Project: $300,000.00 (amount rank 41), 2022-05, Epistemic institutions/forecasting. Source: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support a Good Judgment initiative to produce forecasts on 10 Our World in Data data sets/charts."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Peter Hrosso: $230,000.00 (amount rank 51), 2022-05, Epistemic institutions/forecasting. Source: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support a project aimed at training large language models to represent the probability distribution over question answers in a prediction market."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Manifold Markets: $500,000.00 (amount rank 24), 2022-05, Epistemic institutions/forecasting. Source: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support Manifold Markets in building a charity prediction market, as an experiment for enabling effective forecasters to direct altruistic donations."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).

Other notes: The Future Fund makes another grant to Manifold Markets of $1,000,000 for a related purpose at about the same time; this other grant is a regrant.
Apollo Academic Surveys: $250,000.00 (amount rank 46), 2022-05, Epistemic institutions. Source: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support Apollo’s work aggregating the views of academic experts in many different fields and making them freely available online."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
High Impact Professionals: $320,000.00 (amount rank 39), 2022-05, Talent pipeline. Source: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Organizational general support

Intended use of funds: Grant to "support HIP’s work recruiting EA working professionals to use more of their resources, including their careers, to focus on the world’s most pressing problems."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
University of California, Berkeley (Earmark: Anca Dragan): $800,000.00 (amount rank 18), 2022-05, AI safety. Source: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support a project to develop interactive AI algorithms for alignment that can uncover the causal features in human reward systems, and thereby help AI systems learn underlying human values that generalize to new situations."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Association for Long Term Existence and Resilience: $320,000.00 (amount rank 39), 2022-05, AI safety|Biosecurity and pandemic preparedness. Source: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support ALTER, an academic research and advocacy organization, which hopes to investigate, demonstrate, and foster useful ways to improve the future in the short term, and to safeguard and improve the long-term trajectory of humanity. The organization's initial focus is building bridges to academia via conferences and grants to find researchers who can focus on AI safety, and on policy for reducing biorisk."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
University of Cambridge (Earmark: Gabriel Recchia): $380,000.00 (amount rank 35), 2022-05, AI safety. Source: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support research on how to fine-tune GPT-3 models to identify flaws in other fine-tuned language models' arguments for the correctness of their outputs, and to test whether these help nonexpert humans successfully judge such arguments."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
AI Safety Support: $200,000.00 (amount rank 52), 2022-05, AI safety/talent pipeline. Source: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant "for general funding for community building and managing the talent pipeline for AI alignment researchers. AI Safety Support’s work includes one-on-one coaching, events, and research training programs."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
University of Utah (Earmark: Daniel Brown): $280,000.00 (amount rank 45), 2022-05, AI safety. Source: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support research on value alignment in AI systems, practical algorithms for efficient value alignment verification, and user studies and experiments to test these algorithms."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Brian Christian: $300,000.00 (amount rank 41), 2022-05, AI safety. Source: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support the completion of a book which explores the nature of human values and the implications for aligning AI with human preferences."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
James Lin: $190,000.00 (amount rank 56), 2022-05, AI safety|Biosecurity and pandemic preparedness. Source: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21. The grant recipient had written a blog post https://forum.effectivealtruism.org/posts/qoB8MHe94kCEZyswd/i-want-future-perfect-but-for-science-publications (GW, IR) on 2022-03-08 (during the grant application period) describing the idea.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "allow a reputable technology publication to engage 2-5 undergraduate student interns to write about topics including AI safety, alternative proteins, and biosecurity." See https://forum.effectivealtruism.org/posts/qoB8MHe94kCEZyswd/i-want-future-perfect-but-for-science-publications (GW, IR) for the grantee's original vision.

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Trojan Detection Challenge at NeurIPS 2022: $50,000.00 (amount rank 71), 2022-05, AI safety. Source: https://ftxfuturefund.org/our-regrants/

Donation process: The grant is made as part of the Future Fund's regranting program. See https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Regranting_program_in_more_detail (GW, IR) for more detail on the regranting program.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support prizes for a trojan detection competition at NeurIPS, which involves identifying whether a deep neural network will suddenly change behavior if certain unknown conditions are met."

Other notes: Intended funding timeframe in months: 1.
Ought: $5,000,000.00 (amount rank 4), 2022-05, AI safety. Source: https://ftxfuturefund.org/our-regrants/

Donation process: The grant is made as part of the Future Fund's regranting program. See https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Regranting_program_in_more_detail (GW, IR) for more detail on the regranting program.

Intended use of funds (category): Organizational general support

Intended use of funds: Grant to "support Ought’s work building Elicit, a language-model based research assistant."

Donor reason for selecting the donee: The grant description says: "This work contributes to research on reducing alignment risk through scaling human supervision via process-based systems."
Prometheus Science Bowl: $100,000.00 (amount rank 65), 2022-05, AI safety/talent pipeline. Source: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support a competition for work on Eliciting Latent Knowledge, an open problem in AI alignment, for talented high school and college students who are participating in Prometheus Science Bowl."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Virginia Tech (Earmark: Guoliang Liu): $500,000.00 (amount rank 24), 2022-05, Biosecurity and pandemic preparedness/personal protective equipment. Source: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support a project to develop a new material -- an ultra-thin polymer-based thin film -- for use in next-generation Personal Protective Equipment which is both more effective and more comfortable."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Berkeley Existential Risk Initiative: $155,000.00 (amount rank 59), 2022-05, AI safety. Source: https://ftxfuturefund.org/our-regrants/

Donation process: The grant is made as part of the Future Fund's regranting program. See https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Regranting_program_in_more_detail (GW, IR) for more detail on the regranting program.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support a NeurIPS competition applying human feedback in a non-language-model setting, specifically pretrained models in Minecraft."
Federation for American Scientists: $1,000,000.00 (amount rank 11), 2022-05, AI safety|Migration policy/high-skilled migration. Source: https://ftxfuturefund.org/our-regrants/

Donation process: The grant is made as part of the Future Fund's regranting program. See https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Regranting_program_in_more_detail (GW, IR) for more detail on the regranting program.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support a researcher and research assistant to work on high-skill immigration and AI policy at FAS for three years."

Other notes: Intended funding timeframe in months: 36.
Apart Research: $95,000.00 (amount rank 69), 2022-05, AI safety/talent pipeline. Source: https://ftxfuturefund.org/our-regrants/

Donation process: The grant is made as part of the Future Fund's regranting program. See https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Regranting_program_in_more_detail (GW, IR) for more detail on the regranting program.

Intended use of funds (category): Organizational general support

Intended use of funds: Grant to "support the creation of an AI Safety organization which will create a platform to share AI safety research ideas and educational materials, connect people working on AI safety, and bring new people into the field."
Rajalakshmi Children Foundation: $200,000.00 (amount rank 52), 2022-05, Education|Talent pipeline. Source: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Organizational general support

Intended use of funds: Grant to "support the identification of children in India from under-resourced areas who excel in math, science, and technology, and enable them to obtain high quality online education by digitally connecting them with mentors and teachers."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).

Other notes: Affected countries: India.
Donee: Institute for Progress
Amount: $480,000.00
Amount rank: 29
Donation date: 2022-05
Cause area: Biosecurity and pandemic preparedness
URL: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support the Institute’s research and policy engagement work on high skilled immigration, biosecurity, and pandemic prevention."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Donee: Institute for Progress (Earmark: Nikki Teran)
Amount: $135,000.00
Amount rank: 63
Donation date: 2022-05
Cause area: Biosecurity and pandemic preparedness
URL: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support the creation of biosecurity policy priorities via conversations with experts in security, technology, policy, and advocacy. It will develop position papers, research papers, and agendas for the biosecurity community."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Donee: Cecil Abungu
Amount: $160,000.00
Amount rank: 58
Donation date: 2022-05
Cause area: --
URL: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "Cecil Abungu, Visiting Researcher at the Centre for the Study of Existential Risk and Research Affiliate at the Legal Priorities Project, to support the writing and publication of a book on longtermist currents in historical African thought."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Donee: MITRE (Earmark: Michael Jacob)
Amount: $485,000.00
Amount rank: 28
Donation date: 2022-05
Cause area: Biosecurity and pandemic preparedness
URL: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support research that we hope will be used to help strengthen the bioweapons convention and guide proactive actions to better secure those facilities or stop the dangerous work being done there."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Donee: Charity Entrepreneurship
Amount: $470,000.00
Amount rank: 31
Donation date: 2022-05
Cause area: Biosecurity and pandemic preparedness
URL: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support the incubation of new charities that will work on health security."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Donee: Cornell University (Earmark: Lionel Levine)
Amount: $1,500,000.00
Amount rank: 9
Donation date: 2022-04
Cause area: AI safety
URL: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support Prof. Levine, as well as students and collaborators, to work on alignment theory research at the Cornell math department."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).

Other notes: The June 2022 update https://ftxfuturefund.org/future-fund-june-2022-update/ by the FTX Future Fund highlights the grant as one of its example grants.
Donee: Legal Priorities Project
Amount: $480,000.00
Amount rank: 29
Donation date: 2022-04
Cause area: --
URL: https://ftxfuturefund.org/our-grants/?_funding_stream=ad-hoc

Donation process: This grant is part of staff-led grantmaking by FTX Future Fund. https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Staff_led_grantmaking_in_more_detail (GW, IR) says: "Unlike the open call and regranting, these grants and investments are not a test of a particular potentially highly scalable funding model. These are projects we funded because we became aware of them and thought they were good ideas."

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support the Legal Priorities Project’s ongoing research and outreach activities. This will allow LPP to pay two new hires and to put on a summer institute for non-US law students in Oxford."

Donor retrospective of the donation: FTX Future Fund would make a further grant of $700,000 in June 2022, about two months later, indicating continued satisfaction with the grantee.
Donee: Stimson South Asia Program
Amount: $250,000.00
Amount rank: 46
Donation date: 2022-04
Cause area: International relations
URL: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support the identification and implementation of promising confidence-building measures to reduce conflict between India and Pakistan."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).

Other notes: Affected countries: India|Pakistan.
Donee: Simon Institute for Longterm Governance
Amount: $820,000.00
Amount rank: 17
Donation date: 2022-04
Cause area: Global catastrophic risks|International relations
URL: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support SI’s policy work with the United Nations system on the prevention of existential risks to humanity."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Donee: Ohio State University (Earmark: Bear F. Braumoeller)
Amount: $388,080.00
Amount rank: 34
Donation date: 2022-04
Cause area: International relations
URL: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support a postdoc and two research assistants for Professor Braumoeller’s MESO Lab for two years to carry out research on international orders and how they affect the probability of war."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).

Other notes: Intended funding timeframe in months: 24.
Donee: Nonlinear
Amount: $250,000.00
Amount rank: 46
Donation date: 2022-04
Cause area: Effective altruism/information corpus
URL: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support the maintenance of a library of high-quality audio content on the world’s most pressing problems, and a fund to provide productivity-enhancing equipment and support staff for people working on important social issues."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made close to the application window for the open call (2022-02-28 to 2022-03-21).
Donee: High Impact Athletes
Amount: $350,000.00
Amount rank: 36
Donation date: 2022-04
Cause area: Effective altruism/effective giving
URL: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Organizational general support

Intended use of funds: Grant to "support HIA’s work encouraging professional athletes to donate more of their earnings to high impact charities and causes, and to promote a culture of giving among their fans."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made close to the application window for the open call (2022-02-28 to 2022-03-21).
Donee: EffiSciences
Amount: $135,000.00
Amount rank: 63
Donation date: 2022-04
Cause area: Effective altruism|AI safety|Biosecurity and pandemic preparedness|Climate change
URL: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support EffiSciences’s work promoting high impact research on global priorities (e.g. AI safety, biosecurity, and climate change) among French students and academics, and building up a community of people willing to work on important topics."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made close to the application window for the open call (2022-02-28 to 2022-03-21).

Other notes: Affected countries: France.
Donee: AI Risk Public Materials Competition
Amount: $40,000.00
Amount rank: 73
Donation date: 2022-04
Cause area: AI safety
URL: https://ftxfuturefund.org/our-regrants/

Donation process: The grant is made as part of the Future Fund's regranting program. See https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Regranting_program_in_more_detail (GW, IR) for more detail on the regranting program.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support two competitions to produce better public materials on the existential risk from AI."
Donee: ML Safety Scholars Program
Amount: $490,000.00
Amount rank: 27
Donation date: 2022-04
Cause area: AI safety
URL: https://ftxfuturefund.org/our-regrants/

Donation process: The grant is made as part of the Future Fund's regranting program. See https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Regranting_program_in_more_detail (GW, IR) for more detail on the regranting program.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "fund a summer program for up to 100 students to spend 9 weeks studying machine learning, deep learning, and technical topics in safety."

Other notes: Intended funding timeframe in months: 2.
Donee: Columbia University (Earmark: Claudia Shi)
Amount: $100,000.00
Amount rank: 65
Donation date: 2022-04
Cause area: AI safety
URL: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support the work of a PhD student [Claudia Shi] working on AI safety at Columbia University."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).

Other notes: Intended funding timeframe in months: 36.
Donee: Manifold Markets
Amount: $1,000,000.00
Amount rank: 11
Donation date: 2022-03
Cause area: Epistemic institutions/forecasting
URL: https://ftxfuturefund.org/our-regrants/

Donation process: The grant is made as part of the Future Fund's regranting program. See https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Regranting_program_in_more_detail (GW, IR) for more detail on the regranting program.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support Manifold Markets in building a play-money prediction market platform. The platform is also experimenting with impact certificates and charity prediction markets."

Other notes: The Future Fund makes another grant to Manifold Markets of $500,000 for a related purpose at about the same time; this other grant is a result of the open call.
Donee: Centre for Effective Altruism
Amount: $13,940,000.00
Amount rank: 2
Donation date: 2022-03
Cause area: Effective altruism
URL: https://ftxfuturefund.org/our-grants/?_funding_stream=ad-hoc

Donation process: This grant is part of staff-led grantmaking by FTX Future Fund. https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Staff_led_grantmaking_in_more_detail (GW, IR) says: "Unlike the open call and regranting, these grants and investments are not a test of a particular potentially highly scalable funding model. These are projects we funded because we became aware of them and thought they were good ideas."

Intended use of funds (category): Organizational general support

Intended use of funds: Grant for "general support for their activities, including running conferences, supporting student groups, and maintaining online resources."
Donee: Rethink Priorities
Amount: $700,000.00
Amount rank: 19
Donation date: 2022-03
Cause area: --
URL: https://ftxfuturefund.org/our-grants/?_funding_stream=ad-hoc

Donation process: This grant is part of staff-led grantmaking by FTX Future Fund. https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Staff_led_grantmaking_in_more_detail (GW, IR) says: "Unlike the open call and regranting, these grants and investments are not a test of a particular potentially highly scalable funding model. These are projects we funded because we became aware of them and thought they were good ideas."

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support Rethink’s research and projects aimed at improving humanity’s long-term prospects." Rethink Priorities also does non-human-centric work (such as research into animal welfare) and more neartermist work, and the grant seems limited to the longtermist work.
Donee: Berkeley Existential Risk Initiative
Amount: $100,000.00
Amount rank: 65
Donation date: 2022-03
Cause area: --
URL: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support BERI in hiring a second core operations employee to contribute to BERI’s work supporting university research groups."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Donee: Siddharth Hiregowdara
Amount: $100,000.00
Amount rank: 65
Donation date: 2022-03
Cause area: AI safety
URL: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support the production of high quality materials for learning about AI safety work."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made close to the application window for the open call (2022-02-28 to 2022-03-21).
Donee: The Center for Election Science
Amount: $300,000.00
Amount rank: 41
Donation date: 2022-03
Cause area: Epistemic institutions|Politics
URL: https://ftxfuturefund.org/our-grants/?_funding_stream=ad-hoc

Donation process: This grant is part of staff-led grantmaking by FTX Future Fund. https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Staff_led_grantmaking_in_more_detail (GW, IR) says: "Unlike the open call and regranting, these grants and investments are not a test of a particular potentially highly scalable funding model. These are projects we funded because we became aware of them and thought they were good ideas."

Intended use of funds (category): Direct project expenses

Intended use of funds: The grant description says: "We recommended a grant to support the development of statewide ballot initiatives to institute approval voting. Approval voting is a simple voting method reform that lets voters select all the candidates they wish."
Donee: University of Ottawa (Earmark: Emilio I. Alarcón)
Amount: $250,000.00
Amount rank: 46
Donation date: 2022-03
Cause area: Biosecurity and pandemic preparedness/disinfection
URL: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support a project to develop new plastic surfaces incorporating molecules that can be activated with low-energy visible light to eradicate bacteria and kill viruses continuously."

Donor reason for selecting the donee: The grant description says: "If successful, this project will change how plastic surfaces are currently decontaminated."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made close to the application window for the open call (2022-02-28 to 2022-03-21).
Donee: SecureBio (Earmark: Kevin Esvelt)
Amount: $1,200,000.00
Amount rank: 10
Donation date: 2022-03
Cause area: Biosecurity and pandemic preparedness
URL: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call

Donation process: According to https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Open_call (GW, IR) this grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: The grant description states: "We recommended a grant to support the hiring of several key staff for Dr. Kevin Esvelt’s pandemic prevention work. SecureBio is working to implement universal DNA synthesis screening, build a reliable early warning system, and coordinate the development of improved personal protective equipment and its delivery to essential workers when needed."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made close to the application window for the open call (2022-02-28 to 2022-03-21).

Other notes: The June 2022 update https://ftxfuturefund.org/future-fund-june-2022-update/ by the FTX Future Fund highlights the grant as one of its example grants.
Donee: Longview Philanthropy
Amount: $15,000,000.00
Amount rank: 1
Donation date: 2022-02
Cause area: --
URL: https://ftxfuturefund.org/our-grants/?_funding_stream=ad-hoc

Donation process: This grant is part of staff-led grantmaking by FTX Future Fund. https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Staff_led_grantmaking_in_more_detail (GW, IR) says: "Unlike the open call and regranting, these grants and investments are not a test of a particular potentially highly scalable funding model. These are projects we funded because we became aware of them and thought they were good ideas."

Intended use of funds (category): Regranting

Intended use of funds: Grant to "support Longview’s independent grantmaking on global priorities research, nuclear weapons policy, and other longtermist issues."
Donee: Sherlock Biosciences
Amount: $2,000,000.00
Amount rank: 7
Donation date: 2022-02
Cause area: Biosecurity and pandemic preparedness
URL: https://ftxfuturefund.org/our-grants/?_funding_stream=ad-hoc

Donation process: This investment is part of staff-led grantmaking by FTX Future Fund. https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Staff_led_grantmaking_in_more_detail (GW, IR) says: "Unlike the open call and regranting, these grants and investments are not a test of a particular potentially highly scalable funding model. These are projects we funded because we became aware of them and thought they were good ideas."

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support the development of universal CRISPR-based diagnostics, including paper-based diagnostics that can be used in developing-country settings without electricity."
Donee: Lightcone Infrastructure
Amount: $2,000,000.00
Amount rank: 7
Donation date: 2022-02
Cause area: Effective altruism|AI safety
URL: https://ftxfuturefund.org/our-grants/?_funding_stream=ad-hoc

Donation process: This grant is part of staff-led grantmaking by FTX Future Fund. https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Staff_led_grantmaking_in_more_detail (GW, IR) says: "Unlike the open call and regranting, these grants and investments are not a test of a particular potentially highly scalable funding model. These are projects we funded because we became aware of them and thought they were good ideas."

Intended use of funds (category): Organizational general support

Intended use of funds: Grant to "support Lightcone’s ongoing projects including running the LessWrong forum, hosting conferences and events, and maintaining an office space for Effective Altruist organizations."
Donee: Pathos Labs
Amount: $50,000.00
Amount rank: 71
Donation date: 2022
Cause area: Values and reflective processes
URL: https://ftxfuturefund.org/our-grants/?_funding_stream=ad-hoc

Donation process: This grant is part of staff-led grantmaking by FTX Future Fund. https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Staff_led_grantmaking_in_more_detail (GW, IR) says: "Unlike the open call and regranting, these grants and investments are not a test of a particular potentially highly scalable funding model. These are projects we funded because we became aware of them and thought they were good ideas."

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support Pathos Labs to produce a PopShift convening connecting experts on the future of technology and existential risks with television writers to inspire new ideas for their shows."
Donee: The Atlas Fellowship
Amount: $5,000,000.00
Amount rank: 4
Donation date: 2022-01
Cause area: Education|Talent pipeline
URL: https://ftxfuturefund.org/our-grants/?_funding_stream=ad-hoc

Donation process: This grant is part of staff-led grantmaking by FTX Future Fund. https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Staff_led_grantmaking_in_more_detail (GW, IR) says: "Unlike the open call and regranting, these grants and investments are not a test of a particular potentially highly scalable funding model. These are projects we funded because we became aware of them and thought they were good ideas."

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support scholarships for talented and promising high school students to use towards educational opportunities and enrolling in a summer program."
Donee: Brown University (Earmark: Oded Galor)
Amount: $500,000.00
Amount rank: 24
Donation date: 2022-01
Cause area: Economic growth
URL: https://ftxfuturefund.org/our-grants/?_funding_stream=ad-hoc

Donation process: This grant is part of staff-led grantmaking by FTX Future Fund. https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Staff_led_grantmaking_in_more_detail (GW, IR) says: "Unlike the open call and regranting, these grants and investments are not a test of a particular potentially highly scalable funding model. These are projects we funded because we became aware of them and thought they were good ideas."

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support two years of academic research on long-term economic growth."

Other notes: Intended funding timeframe in months: 24.
Donee: Effective Ideas Blog Prize
Amount: $900,000.00
Amount rank: 16
Donation date: 2022-01
Cause area: Effective altruism
URL: https://ftxfuturefund.org/our-grants/?_funding_stream=ad-hoc
Influencer: Longview Philanthropy

Donation process: This grant is part of staff-led grantmaking by FTX Future Fund. https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Staff_led_grantmaking_in_more_detail (GW, IR) says: "Unlike the open call and regranting, these grants and investments are not a test of a particular potentially highly scalable funding model. These are projects we funded because we became aware of them and thought they were good ideas." This particular grant is co-recommended by Longview Philanthropy and the funding is to be administered by Longview Media.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support prizes for outstanding writing which encourages a broader public conversation around effective altruism and longtermism."
Donee: HelixNano
Amount: $10,000,000.00
Amount rank: 3
Donation date: 2022-01
Cause area: Biosecurity and pandemic preparedness/COVID-19/COVID-19 vaccine
URL: https://ftxfuturefund.org/our-grants/?_funding_stream=ad-hoc

Donation process: This investment is part of staff-led grantmaking by FTX Future Fund. https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Staff_led_grantmaking_in_more_detail (GW, IR) says: "Unlike the open call and regranting, these grants and investments are not a test of a particular potentially highly scalable funding model. These are projects we funded because we became aware of them and thought they were good ideas."

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support Helix Nano running preclinical and Phase 1 trials of a pan-variant Covid-19 vaccine."

Similarity to other donors

Sorry, we couldn't find any similar donors.