FTX Future Fund donations made

This is an online portal with information on donations that were announced publicly (or that have been shared with permission) and that were of interest to Vipul Naik. The git repository with the code for this portal, as well as all the underlying data, is available on GitHub. All payment amounts are in current United States dollars (USD). The repository of donations is being seeded with an initial collation by Issa Rice, along with continued contributions from him (see his commits and the contract work page listing all financially compensated contributions to the site), but all responsibility for errors and inaccuracies belongs to Vipul Naik. Current data is preliminary and has not been completely vetted and normalized; if sharing a link to this site or any page on this site, please include the caveat that the data is preliminary (if you want to share without including caveats, please check with Vipul Naik). We expect to have completed the first round of development by the end of March 2023. See the about page for more details. Also of interest: pageview data on analytics.vipulnaik.com, the tutorial in the README, and the request for feedback on the EA Forum.

Basic donor information

We do not have any donor information for the donor FTX Future Fund in our system.

Donor donation statistics

Cause area | Count | Median | Mean | Minimum | 10th percentile | 20th percentile | 30th percentile | 40th percentile | 50th percentile | 60th percentile | 70th percentile | 80th percentile | 90th percentile | Maximum
Overall | 43 | 350,000 | 1,116,814 | 100,000 | 135,000 | 200,000 | 250,000 | 300,000 | 350,000 | 470,000 | 500,000 | 700,000 | 1,200,000 | 15,000,000
AI safety | 11 | 290,000 | 422,727 | 100,000 | 100,000 | 100,000 | 200,000 | 280,000 | 290,000 | 300,000 | 380,000 | 600,000 | 800,000 | 1,500,000
Biosecurity and pandemic preparedness | 10 | 400,000 | 441,000 | 135,000 | 135,000 | 140,000 | 250,000 | 350,000 | 400,000 | 470,000 | 480,000 | 485,000 | 500,000 | 1,200,000
Effective altruism|AI safety|Biosecurity and pandemic preparedness|Climate change | 1 | 135,000 | 135,000 | 135,000 | 135,000 | 135,000 | 135,000 | 135,000 | 135,000 | 135,000 | 135,000 | 135,000 | 135,000 | 135,000
Epistemic institutions | 7 | 300,000 | 356,857 | 182,000 | 182,000 | 230,000 | 250,000 | 250,000 | 300,000 | 336,000 | 336,000 | 500,000 | 700,000 | 700,000
AI safety|Biosecurity and pandemic preparedness | 2 | 190,000 | 255,000 | 190,000 | 190,000 | 190,000 | 190,000 | 190,000 | 190,000 | 320,000 | 320,000 | 320,000 | 320,000 | 320,000
Effective altruism | 6 | 400,000 | 2,773,333 | 250,000 | 250,000 | 350,000 | 350,000 | 400,000 | 400,000 | 700,000 | 1,000,000 | 1,000,000 | 13,940,000 | 13,940,000
Epistemic institutions|Politics | 1 | 300,000 | 300,000 | 300,000 | 300,000 | 300,000 | 300,000 | 300,000 | 300,000 | 300,000 | 300,000 | 300,000 | 300,000 | 300,000
-- | 4 | 700,000 | 4,220,000 | 480,000 | 480,000 | 480,000 | 700,000 | 700,000 | 700,000 | 700,000 | 700,000 | 15,000,000 | 15,000,000 | 15,000,000
Effective altruism|AI safety | 1 | 2,000,000 | 2,000,000 | 2,000,000 | 2,000,000 | 2,000,000 | 2,000,000 | 2,000,000 | 2,000,000 | 2,000,000 | 2,000,000 | 2,000,000 | 2,000,000 | 2,000,000
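
The summary statistics above are consistent with an arithmetic mean and a nearest-rank (ceiling) percentile computed over each cause area's donation amounts. The Python sketch below is illustrative only: it is not the portal's actual code, the function names are invented for this example, and the percentile convention is an inference from the table; it reproduces the "AI safety" row as a check.

```python
import math

def nearest_rank_percentile(sorted_amounts, p):
    # Smallest value such that at least p% of the amounts are <= it
    # (nearest-rank / ceiling convention; an assumption, not a documented method).
    k = max(1, math.ceil(p / 100 * len(sorted_amounts)))
    return sorted_amounts[k - 1]

def summarize(amounts):
    # Count, median, mean, minimum, maximum, and deciles for one cause area.
    s = sorted(amounts)
    stats = {
        "count": len(s),
        "median": nearest_rank_percentile(s, 50),
        "mean": round(sum(s) / len(s)),
        "minimum": s[0],
        "maximum": s[-1],
    }
    for p in range(10, 100, 10):
        stats[f"{p}th percentile"] = nearest_rank_percentile(s, p)
    return stats

# The 11 donations classified under AI safety (amounts in current USD):
ai_safety = [100_000, 100_000, 100_000, 200_000, 280_000, 290_000,
             300_000, 380_000, 600_000, 800_000, 1_500_000]
print(summarize(ai_safety))
# -> count 11, median 290,000, mean 422,727, minimum 100,000, maximum 1,500,000,
#    matching the "AI safety" row in the table above.
```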

Donation amounts by cause area and year

If you hover over a cell for a given cause area and year, you will get a tooltip with the number of donees and the number of donations.

Note: The cause area classification used here may not match the one used by the donor in all cases.

Cause area Number of donations Number of donees Total 2022
-- (filter this donor) 4 3 16,880,000.00 16,880,000.00
Effective altruism (filter this donor) 6 6 16,640,000.00 16,640,000.00
AI safety (filter this donor) 11 10 4,650,000.00 4,650,000.00
Biosecurity and pandemic preparedness (filter this donor) 10 9 4,410,000.00 4,410,000.00
Epistemic institutions (filter this donor) 7 7 2,498,000.00 2,498,000.00
Effective altruism|AI safety (filter this donor) 1 1 2,000,000.00 2,000,000.00
AI safety|Biosecurity and pandemic preparedness (filter this donor) 2 2 510,000.00 510,000.00
Epistemic institutions|Politics (filter this donor) 1 1 300,000.00 300,000.00
Effective altruism|AI safety|Biosecurity and pandemic preparedness|Climate change (filter this donor) 1 1 135,000.00 135,000.00
Total 43 40 48,023,000.00 48,023,000.00

Skipping spending graph as there is at most one year’s worth of donations.

Donation amounts by subcause area and year

If you hover over a cell for a given subcause area and year, you will get a tooltip with the number of donees and the number of donations.

For the meaning of “classified” and “unclassified”, see the page clarifying this.

Subcause area Number of donations Number of donees Total 2022
Effective altruism 2 2 14,940,000.00 14,940,000.00
AI safety 9 8 4,350,000.00 4,350,000.00
Biosecurity and pandemic preparedness 5 4 2,770,000.00 2,770,000.00
Epistemic institutions/forecasting 6 6 2,248,000.00 2,248,000.00
Effective altruism|AI safety 1 1 2,000,000.00 2,000,000.00
Effective altruism/effective giving 2 2 1,050,000.00 1,050,000.00
Biosecurity and pandemic preparedness/coordination efforts 2 2 750,000.00 750,000.00
Effective altruism/information corpus 2 2 650,000.00 650,000.00
AI safety|Biosecurity and pandemic preparedness 2 2 510,000.00 510,000.00
Biosecurity and pandemic preparedness/personal protective equipment 1 1 500,000.00 500,000.00
Biosecurity and pandemic preparedness/disinfection 2 2 390,000.00 390,000.00
AI safety/talent pipeline 2 2 300,000.00 300,000.00
Epistemic institutions|Politics 1 1 300,000.00 300,000.00
Epistemic institutions 1 1 250,000.00 250,000.00
Effective altruism|AI safety|Biosecurity and pandemic preparedness|Climate change 1 1 135,000.00 135,000.00
Classified total 39 37 31,143,000.00 31,143,000.00
Unclassified total 4 3 16,880,000.00 16,880,000.00
Total 43 40 48,023,000.00 48,023,000.00

Skipping spending graph as there is at most one year’s worth of donations.

Donation amounts by donee and year

Donee Cause area Metadata Total 2022
Longview Philanthropy (filter this donor) 15,000,000.00 15,000,000.00
Centre for Effective Altruism (filter this donor) Effective altruism/movement growth FB Site 13,940,000.00 13,940,000.00
Lightcone Infrastructure (filter this donor) Epistemic institutions FB WP Site 2,000,000.00 2,000,000.00
Cornell University (filter this donor) FB Tw WP Site 1,500,000.00 1,500,000.00
University of California, Berkeley (filter this donor) FB Tw WP Site 1,400,000.00 1,400,000.00
SecureBio (filter this donor) 1,200,000.00 1,200,000.00
Legal Priorities Project (filter this donor) 1,180,000.00 1,180,000.00
Non-trivial Pursuits (filter this donor) 1,000,000.00 1,000,000.00
Sage (filter this donor) 700,000.00 700,000.00
Rethink Priorities (filter this donor) Cause prioritization Site 700,000.00 700,000.00
Giving What We Can (filter this donor) FB Tw WP Site 700,000.00 700,000.00
Institute for Progress (filter this donor) 615,000.00 615,000.00
Manifold Markets (filter this donor) 500,000.00 500,000.00
Virginia Tech (filter this donor) 500,000.00 500,000.00
MITRE (filter this donor) 485,000.00 485,000.00
Charity Entrepreneurship (filter this donor) 470,000.00 470,000.00
Council on Strategic Risks (filter this donor) 400,000.00 400,000.00
Rational Animations (filter this donor) 400,000.00 400,000.00
University of Cambridge (filter this donor) FB Tw WP Site 380,000.00 380,000.00
1Day Sooner (filter this donor) 350,000.00 350,000.00
High Impact Athletes (filter this donor) 350,000.00 350,000.00
Global Guessing (filter this donor) 336,000.00 336,000.00
Association for Long Term Existence and Resilience (filter this donor) 320,000.00 320,000.00
The Center for Election Science (filter this donor) 300,000.00 300,000.00
Brian Christian (filter this donor) 300,000.00 300,000.00
Good Judgment Project (filter this donor) 300,000.00 300,000.00
AI Safety Camp (filter this donor) 290,000.00 290,000.00
University of Utah (filter this donor) FB Tw WP Site 280,000.00 280,000.00
Nonlinear (filter this donor) 250,000.00 250,000.00
Apollo Academic Surveys (filter this donor) 250,000.00 250,000.00
University of Ottawa (filter this donor) WP Site 250,000.00 250,000.00
Peter Hrosso (filter this donor) 230,000.00 230,000.00
AI Safety Support (filter this donor) 200,000.00 200,000.00
James Lin (filter this donor) 190,000.00 190,000.00
Nathan Young (filter this donor) 182,000.00 182,000.00
Justin Mares (filter this donor) 140,000.00 140,000.00
EffiSciences (filter this donor) 135,000.00 135,000.00
Prometheus Science Bowl (filter this donor) 100,000.00 100,000.00
Columbia University (filter this donor) FB Tw WP Site 100,000.00 100,000.00
Siddharth Hiregowdara (filter this donor) 100,000.00 100,000.00
Total -- -- 48,023,000.00 48,023,000.00

Skipping spending graph as there is at most one year’s worth of donations.

Donation amounts by influencer and year

Sorry, we couldn't find any influencer information.

Donation amounts by disclosures and year

Sorry, we couldn't find any disclosures information.

Donation amounts by country and year

If you hover over a cell for a given country and year, you will get a tooltip with the number of donees and the number of donations.

For the meaning of “classified” and “unclassified”, see the page clarifying this.

Country Number of donations Number of donees Total 2022
France 1 1 135,000.00 135,000.00
Classified total 1 1 135,000.00 135,000.00
Unclassified total 42 39 47,888,000.00 47,888,000.00
Total 43 40 48,023,000.00 48,023,000.00

Skipping spending graph as there is at most one year’s worth of donations.

Full list of documents in reverse chronological order (7 documents)

Title (URL linked) | Publication date | Author | Publisher | Affected donors | Affected donees | Affected influencers | Document scope | Cause area | Notes
Future Fund June 2022 Update | 2022-06-30 | Nick Beckstead Leopold Aschenbrenner Avital Balwit William MacAskill Ketan Ramakrishnan | FTX Future Fund | FTX Future Fund | Manifold Markets ML Safety Scholars Program Andi Peng Braden Leach Thomas Kwa SecureBio Ray Amjad Apollo Academic Surveys Justin Mares Longview Philanthropy Atlas Fellowship Effective Ideas Blog Prize Ought Swift Centre for Applied Forecasting Federation for American Scientists Public Editor Project Quantified Uncertainty Research Institute Moncef Slaoui AI Impacts EA Critiques and Red Teaming Prize | -- | Broad donor strategy | Longtermism|AI safety|Biosecurity and pandemic preparedness|Effective altruism | This lengthy blog post, cross-posted at https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update (GW, IR) to the Effective Altruism Forum, goes into detail regarding the grantmaking of the FTX Future Fund so far, and learnings from this grantmaking. The post reports having made 262 grants and investments, with $132 million in total spend. Three funding models are in use: regranting ($31 million so far), open call ($26 million so far), and staff-led grantmaking ($73 million so far).
Some clarifications on the Future Fund's approach to grantmaking (GW, IR) | 2022-05-09 | Nick Beckstead | Effective Altruism Forum | FTX Future Fund | -- | -- | Broad donor strategy | Longtermism | This blog post is written partly in response to the concerns that https://forum.effectivealtruism.org/posts/HWaH8tNdsgEwNZu8B/free-spending-ea-might-be-a-big-problem-for-optics-and (GW, IR) raises. The post clarifies the processes and safeguards the FTX Future Fund has in place for reviewing grants, and explains how FTX is able to manage a large grant volume with a small team: mainly by relying on regranting. The post also clarifies that FTX has not granted much for community-building, so some of the concerns specifically related to community-building grants don't matter quite yet.
FTX Future Fund and Longtermism | 2022-03-17 | Rhys Lindmark | -- | Open Philanthropy FTX Future Fund | -- | -- | Miscellaneous commentary | Longtermism|Global health and development | This blog post, cross-posted at https://forum.effectivealtruism.org/posts/fDLmDe8HQq2ueCxk6/ftx-future-fund-and-longtermism (GW, IR) to the EA Forum, is written in a fun format, including charts and memes. The post talks about the change to the EA funding landscape with the arrival of FTX Future Fund, including both an increase in the amount of funding and the shift toward longtermism.
Some thoughts on recent Effective Altruism funding announcements. It's been a big week in Effective Altruism | 2022-03-03 | James Ozden | -- | Open Philanthropy FTX Future Fund FTX Community Fund FTX Climate Fund | Mercy For Animals Charity Entrepreneurship | -- | Miscellaneous commentary | Longtermism|Animal welfare|Global health and development|AI safety|Climate change | In this blog post, cross-posted at https://forum.effectivealtruism.org/posts/Wpr5ssnNW5JPDDPvd/some-thoughts-on-recent-effective-altruism-funding (GW, IR) to the EA Forum, James Ozden discusses recent increases in funding by donors aligned with effective altruism (EA) and makes forecasts for the amount of annual money moved by 2025. Highlights of the post: 1. The entry of the FTX Future Fund is expected to increase the proportion of funds allocated to longtermist causes, bringing it more in line with what EA leaders think it should be (based on the data that https://80000hours.org/2021/08/effective-altruism-allocation-resources-cause-areas/ compiles). 2. Grantmaking capacity needs to be scaled up to match the increase in available funds. 3. The EA movement may need to shift from marginal thinking to coordination dynamics, as its funding amounts are no longer as marginal. 4. Entrepreneurs, founders, and incubators are needed. 6. We need to be more ambitious.
Announcing the Future Fund | 2022-02-28 | Nick Beckstead Leopold Aschenbrenner Avital Balwit William MacAskill Ketan Ramakrishnan | FTX Future Fund | FTX Future Fund | -- | -- | Request for proposals | Longtermism | In this blog post, cross-posted at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) to the EA Forum, the FTX Future Fund announces an open call for applications. It links to https://ftxfuturefund.org/projects/ for the list of project ideas that they are interested in funding, and links to https://ftxfuturefund.org/our-2022-plans/ for more information on the 2022 plans. The version https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) on the EA Forum includes the content of both posts from the FTX website.
Announcing Our Regranting Program | 2022-02-28 | Nick Beckstead Leopold Aschenbrenner | FTX Future Fund | FTX Future Fund | -- | -- | Request for proposals | Longtermism | The FTX Future Fund announces its regranting program that "will offer discretionary budgets to independent part-time grantmakers, to be spent in the next ~6 months. Budgets will typically be in the $250k-few million range. We've already invited a first cohort of 21 regrantors to test the program." The post also invites people to apply to be regrantors or recommend others as regrantors.
2021 AI Alignment Literature Review and Charity Comparison (GW, IR) | 2021-12-23 | Ben Hoskin | Effective Altruism Forum | Ben Hoskin Effective Altruism Funds: Long-Term Future Fund Survival and Flourishing Fund FTX Future Fund | Future of Humanity Institute Future of Humanity Institute Centre for the Governance of AI Center for Human-Compatible AI Machine Intelligence Research Institute Global Catastrophic Risk Institute Centre for the Study of Existential Risk OpenAI Google Deepmind Anthropic Alignment Research Center Redwood Research Ought AI Impacts Global Priorities Institute Center on Long-Term Risk Centre for Long-Term Resilience Rethink Priorities Convergence Analysis Stanford Existential Risk Initiative Effective Altruism Funds: Long-Term Future Fund Berkeley Existential Risk Initiative 80,000 Hours Survival and Flourishing Fund | -- | Review of current state of cause area | AI safety | Cross-posted to LessWrong at https://www.lesswrong.com/posts/C4tR3BEpuWviT7Sje/2021-ai-alignment-literature-review-and-charity-comparison (GW, IR). This is the sixth post in a tradition of annual blog posts on the state of AI safety and the work of various organizations in the space over the course of the year; the post is structured similarly to the previous year's post https://forum.effectivealtruism.org/posts/K7Z87me338BQT3Mcv/2020-ai-alignment-literature-review-and-charity-comparison (GW, IR) but has a few new features. The author mentions that he has several conflicts of interest that he cannot individually disclose. He also starts collecting "second preferences" data this year for all the organizations he talks to, which is where the organization would like to see funds go, other than itself. The Long-Term Future Fund is the clear winner here. He also announces that he's looking for a research assistant to help with next year's post given the increasing time demands and his reduced time availability. His final rot13'ed donation decision is to donate to the Long-Term Future Fund so that sufficiently skilled AI safety researchers can make a career with LTFF funding; his second preference for donations is BERI. Many other organizations that he considers to be likely to be doing excellent work are either already well-funded or do not provide sufficient disclosure.

Full list of donations in reverse chronological order (43 donations)

Graph of top 10 donees by amount, showing the timeframe of donations

Graph of donations and their timeframes
Donee | Amount (current USD) | Amount rank (out of 43) | Donation date | Cause area | URL | Influencer | Notes
AI Safety Camp | 290,000.00 | 29 | 2022-06 | AI safety | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "partially support the salaries for AI Safety Camp’s two directors and to support logistical expenses at its physical camp."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
University of California, Berkeley (Earmark: Sergey Levine) | 600,000.00 | 12 | 2022-06 | AI safety | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support a project to study how large language models integrated with offline reinforcement learning pose a risk of machine deception and persuasion."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
1Day Sooner | 350,000.00 | 22 | 2022-06 | Biosecurity and pandemic preparedness/coordination efforts | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support 1DS’ work on pandemic preparedness, including advocacy for advance market purchase commitments, collaboration with the UK Pandemic Ethics Accelerator on challenge studies, and advocacy with 1Day Africa and the West African Health Organization for a global pandemic insurance fund."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made close to the application window for the open call (2022-02-28 to 2022-03-21).
Nathan Young | 182,000.00 | 37 | 2022-06 | Epistemic institutions/forecasting | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support the creation of a website for collaboratively creating public forecasting questions for a range of prediction aggregators and markets."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Legal Priorities Project | 700,000.00 | 8 | 2022-06 | -- | https://ftxfuturefund.org/our-grants/?_funding_stream=ad-hoc | -- | Donation process: This grant is part of staff-led grantmaking by FTX Future Fund. https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Staff_led_grantmaking_in_more_detail (GW, IR) says: "Unlike the open call and regranting, these grants and investments are not a test of a particular potentially highly scalable funding model. These are projects we funded because we became aware of them and thought they were good ideas."

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support one year of operating expenses and salaries at the Legal Priorities Project, a longtermist legal research and field-building organization."

Other notes: This is the second grant from FTX Future Fund to Legal Priorities Project; the preceding grant of $480,000 was in April 2022. Intended funding timeframe in months: 12.
Prometheus Science Bowl | 100,000.00 | 41 | 2022-05 | AI safety/talent pipeline | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support a competition for work on Eliciting Latent Knowledge, an open problem in AI alignment, for talented high school and college students who are participating in Prometheus Science Bowl."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
University of California, Berkeley (Earmark: Anca Dragan) | 800,000.00 | 7 | 2022-05 | AI safety | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support a project to develop interactive AI algorithms for alignment that can uncover the causal features in human reward systems, and thereby help AI systems learn underlying human values that generalize to new situations."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Association for Long Term Existence and Resilience | 320,000.00 | 25 | 2022-05 | AI safety|Biosecurity and pandemic preparedness | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support ALTER, an academic research and advocacy organization, which hopes to investigate, demonstrate, and foster useful ways to improve the future in the short term, and to safeguard and improve the long-term trajectory of humanity. The organization's initial focus is building bridges to academia via conferences and grants to find researchers who can focus on AI safety, and on policy for reducing biorisk."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
University of Cambridge (Earmark: Gabriel Recchia) | 380,000.00 | 21 | 2022-05 | AI safety | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support research on how to fine-tune GPT-3 models to identify flaws in other fine-tuned language models' arguments for the correctness of their outputs, and to test whether these help nonexpert humans successfully judge such arguments."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
AI Safety Support | 200,000.00 | 35 | 2022-05 | AI safety/talent pipeline | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant "for general funding for community building and managing the talent pipeline for AI alignment researchers. AI Safety Support’s work includes one-on-one coaching, events, and research training programs."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
University of Utah (Earmark: Daniel Brown) | 280,000.00 | 30 | 2022-05 | AI safety | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support research on value alignment in AI systems, practical algorithms for efficient value alignment verification, and user studies and experiments to test these algorithms."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Brian Christian | 300,000.00 | 26 | 2022-05 | AI safety | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support the completion of a book which explores the nature of human values and the implications for aligning AI with human preferences."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
James Lin | 190,000.00 | 36 | 2022-05 | AI safety|Biosecurity and pandemic preparedness | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21. The grant recipient had written a blog post https://forum.effectivealtruism.org/posts/qoB8MHe94kCEZyswd/i-want-future-perfect-but-for-science-publications (GW, IR) on 2022-03-08 (during the grant application period) describing the idea.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "allow a reputable technology publication to engage 2-5 undergraduate student interns to write about topics including AI safety, alternative proteins, and biosecurity." See https://forum.effectivealtruism.org/posts/qoB8MHe94kCEZyswd/i-want-future-perfect-but-for-science-publications (GW, IR) for the grantee's original vision.

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Institute for Progress | 480,000.00 | 16 | 2022-05 | Biosecurity and pandemic preparedness | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support the Institute’s research and policy engagement work on high skilled immigration, biosecurity, and pandemic prevention."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Institute for Progress (Earmark: Nikki Teran) | 135,000.00 | 39 | 2022-05 | Biosecurity and pandemic preparedness | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support the creation of biosecurity policy priorities via conversations with experts in security, technology, policy, and advocacy. It will develop position papers, research papers, and agendas for the biosecurity community."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
MITRE (Earmark: Michael Jacob) | 485,000.00 | 15 | 2022-05 | Biosecurity and pandemic preparedness | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support research that we hope will be used to help strengthen the bioweapons convention and guide proactive actions to better secure those facilities or stop the dangerous work being done there."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Charity Entrepreneurship | 470,000.00 | 18 | 2022-05 | Biosecurity and pandemic preparedness | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support the incubation of new charities that will work on health security."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Virginia Tech (Earmark: Guoliang Liu) | 500,000.00 | 13 | 2022-05 | Biosecurity and pandemic preparedness/personal protective equipment | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support a project to develop a new material -- an ultra-thin polymer-based thin film -- for use in next-generation Personal Protective Equipment which is both more effective and more comfortable."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Justin Mares | 140,000.00 | 38 | 2022-05 | Biosecurity and pandemic preparedness/disinfection | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support research on the feasibility of inactivating viruses via electromagnetic radiation."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Council on Strategic Risks | 400,000.00 | 19 | 2022-05 | Biosecurity and pandemic preparedness/coordination efforts | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: The grant description says: "We recommended a grant to support a project which will develop and advance ideas for strengthening regional and multilateral cooperation for addressing biological risks and filling gaps in current international institutions. These efforts include promoting the creation of a center with the capacity to rapidly respond to emerging infectious disease threats to prioritize blunting the impact of such events as well as quickly saving lives, and cooperative mechanisms to enhance biosafety and biosecurity while reducing the potential risks of spaces such as high-containment laboratories."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made close to the application window for the open call (2022-02-28 to 2022-03-21).
Rational Animations | 400,000.00 | 19 | 2022-05 | Effective altruism/information corpus | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Organizational general support

Intended use of funds: Grant to "support the creation of animated videos on topics related to rationality and effective altruism to explain these topics for a broader audience."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made close to the application window for the open call (2022-02-28 to 2022-03-21).
Giving What We Can | 700,000.00 | 8 | 2022-05 | Effective altruism/effective giving | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Organizational general support

Intended use of funds: Grant to "support Giving What We Can’s mission to create a world in which giving effectively and significantly is a cultural norm."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made close to the application window for the open call (2022-02-28 to 2022-03-21).
Non-trivial Pursuits | 1,000,000.00 | 6 | 2022-05 | Effective altruism | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Organizational general support

Intended use of funds: Grant to "support outreach to help students to learn about career options, develop their skills, and plan their careers to work on the world’s most pressing problems."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made close to the application window for the open call (2022-02-28 to 2022-03-21).
Good Judgment Project | 300,000.00 | 26 | 2022-05 | Epistemic institutions/forecasting | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support a Good Judgment initiative to produce forecasts on 10 Our World in Data data sets/charts."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Peter Hrosso | 230,000.00 | 34 | 2022-05 | Epistemic institutions/forecasting | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support a project aimed at training large language models to represent the probability distribution over question answers in a prediction market."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Manifold Markets | 500,000.00 | 13 | 2022-05 | Epistemic institutions/forecasting | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support Manifold Markets in building a charity prediction market, as an experiment for enabling effective forecasters to direct altruistic donations."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Apollo Academic Surveys | 250,000.00 | 31 | 2022-05 | Epistemic institutions | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support Apollo’s work aggregating the views of academic experts in many different fields and making them freely available online."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Global Guessing | 336,000.00 | 24 | 2022-05 | Epistemic institutions/forecasting | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support Global Guessing’s forecasting coverage on the Russian invasion of Ukraine, which they will also use to build tools and infrastructure to support future forecasting work."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Sage | 700,000.00 | 8 | 2022-05 | Epistemic institutions/forecasting | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support the creation of a pilot version of a forecasting platform, and a paid forecasting team, to make predictions about questions relevant to high-impact research."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Cornell University (Earmark: Lionel Levine) | 1,500,000.00 | 4 | 2022-04 | AI safety | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support Prof. Levine, as well as students and collaborators, to work on alignment theory research at the Cornell math department."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).

Other notes: The June 2022 update https://ftxfuturefund.org/future-fund-june-2022-update/ by the FTX Future Fund highlights the grant as one of its example grants.
Columbia University (Earmark: Claudia Shi) | 100,000.00 | 41 | 2022-04 | AI safety | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support the work of a PhD student [Claudia Shi] working on AI safety at Columbia University."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Intended funding timeframe in months: 36
EffiSciences | 135,000.00 | 39 | 2022-04 | Effective altruism|AI safety|Biosecurity and pandemic preparedness|Climate change | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support EffiSciences’s work promoting high impact research on global priorities (e.g. AI safety, biosecurity, and climate change) among French students and academics, and building up a community of people willing to work on important topics."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made close to the application window for the open call (2022-02-28 to 2022-03-21).

Other notes: Affected countries: France.
High Impact Athletes | 350,000.00 | 22 | 2022-04 | Effective altruism/effective giving | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Organizational general support

Intended use of funds: Grant to "support HIA’s work encouraging professional athletes to donate more of their earnings to high impact charities and causes, and to promote a culture of giving among their fans."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made close to the application window for the open call (2022-02-28 to 2022-03-21).
Nonlinear | 250,000.00 | 31 | 2022-04 | Effective altruism/information corpus | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support the maintenance of a library of high-quality audio content on the world’s most pressing problems, and a fund to provide productivity-enhancing equipment and support staff for people working on important social issues."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made close to the application window for the open call (2022-02-28 to 2022-03-21).
Legal Priorities Project | 480,000.00 | 16 | 2022-04 | -- | https://ftxfuturefund.org/our-grants/?_funding_stream=ad-hoc | -- | Donation process: This grant is part of staff-led grantmaking by FTX Future Fund. https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Staff_led_grantmaking_in_more_detail (GW, IR) says: "Unlike the open call and regranting, these grants and investments are not a test of a particular potentially highly scalable funding model. These are projects we funded because we became aware of them and thought they were good ideas."

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support the Legal Priorities Project’s ongoing research and outreach activities. This will allow LPP to pay two new hires and to put on a summer institute for non-US law students in Oxford."

Donor retrospective of the donation: FTX Future Fund would make a further grant of $700,000 in June 2022, about two months later, indicating continued satisfaction with the grantee.
Siddharth Hiregowdara | 100,000.00 | 41 | 2022-03 | AI safety | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support the production of high quality materials for learning about AI safety work."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made close to the application window for the open call (2022-02-28 to 2022-03-21).
SecureBio (Earmark: Kevin Esvelt) | 1,200,000.00 | 5 | 2022-03 | Biosecurity and pandemic preparedness | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: According to https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Open_call (GW, IR) this grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: The grant description states: "We recommended a grant to support the hiring of several key staff for Dr. Kevin Esvelt’s pandemic prevention work. SecureBio is working to implement universal DNA synthesis screening, build a reliable early warning system, and coordinate the development of improved personal protective equipment and its delivery to essential workers when needed."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made close to the application window for the open call (2022-02-28 to 2022-03-21).

Other notes: The June 2022 update https://ftxfuturefund.org/future-fund-june-2022-update/ by the FTX Future Fund highlights the grant as one of its example grants.
University of Ottawa (Earmark: Emilio I. Alarcón) | 250,000.00 | 31 | 2022-03 | Biosecurity and pandemic preparedness/disinfection | https://ftxfuturefund.org/our-grants/?_funding_stream=open-call | -- | Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support a project to develop new plastic surfaces incorporating molecules that can be activated with low-energy visible light to eradicate bacteria and kill viruses continuously."

Donor reason for selecting the donee: The grant description says: "If successful, this project will change how plastic surfaces are currently decontaminated."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made close to the application window for the open call (2022-02-28 to 2022-03-21).
The Center for Election Science | 300,000.00 | 26 | 2022-03 | Epistemic institutions|Politics | https://ftxfuturefund.org/our-grants/?_funding_stream=ad-hoc | -- | Donation process: This grant is part of staff-led grantmaking by FTX Future Fund. https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Staff_led_grantmaking_in_more_detail (GW, IR) says: "Unlike the open call and regranting, these grants and investments are not a test of a particular potentially highly scalable funding model. These are projects we funded because we became aware of them and thought they were good ideas."

Intended use of funds (category): Direct project expenses

Intended use of funds: The grant description says: "We recommended a grant to support the development of statewide ballot initiatives to institute approval voting. Approval voting is a simple voting method reform that lets voters select all the candidates they wish."
Rethink Priorities | 700,000.00 | 8 | 2022-03 | -- | https://ftxfuturefund.org/our-grants/?_funding_stream=ad-hoc | -- | Donation process: This grant is part of staff-led grantmaking by FTX Future Fund. https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Staff_led_grantmaking_in_more_detail (GW, IR) says: "Unlike the open call and regranting, these grants and investments are not a test of a particular potentially highly scalable funding model. These are projects we funded because we became aware of them and thought they were good ideas."

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support Rethink’s research and projects aimed at improving humanity’s long-term prospects." Rethink Priorities also does non-human-centric work (such as research into animal welfare) and more neartermist work, and the grant seems limited to the longtermist work.
Centre for Effective Altruism | 13,940,000.00 | 2 | 2022-03 | Effective altruism | https://ftxfuturefund.org/our-grants/?_funding_stream=ad-hoc | -- | Donation process: This grant is part of staff-led grantmaking by FTX Future Fund. https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Staff_led_grantmaking_in_more_detail (GW, IR) says: "Unlike the open call and regranting, these grants and investments are not a test of a particular potentially highly scalable funding model. These are projects we funded because we became aware of them and thought they were good ideas."

Intended use of funds (category): Organizational general support

Intended use of funds: Grant for "general support for their activities, including running conferences, supporting student groups, and maintaining online resources."
Lightcone Infrastructure | 2,000,000.00 | 3 | 2022-02 | Effective altruism|AI safety | https://ftxfuturefund.org/our-grants/?_funding_stream=ad-hoc | -- | Donation process: This grant is part of staff-led grantmaking by FTX Future Fund. https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Staff_led_grantmaking_in_more_detail (GW, IR) says: "Unlike the open call and regranting, these grants and investments are not a test of a particular potentially highly scalable funding model. These are projects we funded because we became aware of them and thought they were good ideas."

Intended use of funds (category): Organizational general support

Intended use of funds: Grant to "support Lightcone’s ongoing projects including running the LessWrong forum, hosting conferences and events, and maintaining an office space for Effective Altruist organizations."
Longview Philanthropy | 15,000,000.00 | 1 | 2022-02 | -- | https://ftxfuturefund.org/our-grants/?_funding_stream=ad-hoc | -- | Donation process: This grant is part of staff-led grantmaking by FTX Future Fund. https://forum.effectivealtruism.org/posts/paMYXYFYbbjpdjgbt/future-fund-june-2022-update#Staff_led_grantmaking_in_more_detail (GW, IR) says: "Unlike the open call and regranting, these grants and investments are not a test of a particular potentially highly scalable funding model. These are projects we funded because we became aware of them and thought they were good ideas."

Intended use of funds (category): Regranting

Intended use of funds: Grant to "support Longview’s independent grantmaking on global priorities research, nuclear weapons policy, and other longtermist issues."

Similarity to other donors

Sorry, we couldn't find any similar donors.