Luke Ding|Alex Foster|Denise Melchin|Matt Wage|Tara MacAulay money moved

This is an online portal with information on donations that were announced publicly (or shared with permission) and that are of interest to Vipul Naik. The git repository with the code for this portal, as well as all the underlying data, is available on GitHub. All payment amounts are in current United States dollars (USD). The repository of donations has been seeded with an initial collation by Issa Rice, who continues to contribute (see his commits and the contract work page listing all financially compensated contributions to the site), but all responsibility for errors and inaccuracies belongs to Vipul Naik. Current data is preliminary and has not been completely vetted and normalized; if sharing a link to this site or any page on this site, please include the caveat that the data is preliminary (if you want to share without including caveats, please check with Vipul Naik). We expect to have completed the first round of development by the end of March 2022. See the about page for more details. Also of interest: pageview data on analytics.vipulnaik.com, the tutorial in the README, and the request for feedback on the EA Forum.


Full list of documents in reverse chronological order (8 documents)

Title (URL linked) | Publication date | Author | Publisher | Affected donors | Affected donees | Document scope | Notes
EA Meta Fund and Long-Term Future Fund are looking for applications again until October 11th | 2019-09-13 | Denise Melchin | Effective Altruism Forum | Effective Altruism Funds: Meta Fund; Effective Altruism Funds: Long-Term Future Fund | | Request for proposals | The blog post announces that two of the funds under Effective Altruism Funds, namely the Long-Term Future Fund and the EA Meta Fund, are open for rolling applications. The application window for the current round ends on October 11. This is a follow-up to a similar post https://forum.effectivealtruism.org/posts/wuKRAX9uD9akupTJ3/long-term-future-fund-and-ea-meta-fund-applications-open (GW, IR) for the previous grant round
EA Meta Fund: we are open to applications | 2019-01-05 | Denise Melchin | Effective Altruism Forum | Effective Altruism Funds: Meta Fund | | Request for proposals | The post announces the existence of a form at https://docs.google.com/forms/d/e/1FAIpQLSeID5kjD9zsvlwgqB3hlX54EINg6_pY6sYl4hm7s-bYuDGiwA/viewform through which one can apply for consideration for receiving a grant from the EA Meta Fund. Submissions made by midnight GMT on January 20 will be considered for the grant distribution to be announced in mid-February, but applications made after this date will be considered for future rounds
EA Meta Fund AMA: 20th Dec 2018 | 2018-12-19 | Alex Foster; Denise Melchin; Matt Wage | Effective Altruism Forum | Effective Altruism Funds: Meta Fund | Effective Altruism Funds: Meta Fund | Donee AMA | The post is an Ask Me Anything (AMA) for the Effective Altruism Meta Fund. The questions and answers are in the post comments. Questions are asked by a number of people including alexherwix, Luke Muehlhauser, Tee Barnett, and Peter Hurford. Answers are provided by Denise Melchin, Matt Wage, and Alex Foster, three of the five people managing the fund. The other two, Luke Ding and Tara MacAulay, do not post any comment replies, but are referenced in some of the replies. The questions include how the Meta Fund sees its role, how much time the managers expect to spend allocating grants, what criteria they use for evaluating opportunities, and what data inform their decisions
Scaling OFTW: Our First Hire And Funding From The Open Philanthropy Project | 2018-08-01 | Rossa O'Keeffe-O'Donovan | One for the World | Open Philanthropy Project; Luke Ding | One for the World | Donee periodic update | One for the World announces grants to it, recommended by GiveWell: $153,750 from the Open Philanthropy Project and $51,250 from Luke Ding. The funding is to cover two years of expenses, including hiring a COO for the first year and a CEO in the second year. The post also announces the hiring of Evan McVail as COO, fulfilling part of the plan for the grant
EA Funds Beta Launch | 2017-02-28 | Tara MacAulay | Centre for Effective Altruism | Effective Altruism Funds: Meta Fund; Effective Altruism Funds: Long-Term Future Fund; Effective Altruism Funds: Animal Welfare Fund; Effective Altruism Funds: Global Health and Development Fund | Effective Altruism Funds: Meta Fund; Effective Altruism Funds: Long-Term Future Fund; Effective Altruism Funds: Animal Welfare Fund; Effective Altruism Funds: Global Health and Development Fund | Launch | Tara MacAulay of the Centre for Effective Altruism (CEA), the parent of Effective Altruism Funds, describes the beta launch of the project. CEA will revisit within three months to decide whether to make the EA Funds permanent
EAs write about where they give | 2016-12-09 | Julia Wise | Effective Altruism Forum | Blake Borgeson; Eva Vivalt; Ben Kuhn; Alexander Gordon-Brown and Denise Melchin; Elizabeth Van Nostrand | Machine Intelligence Research Institute; Center for Applied Rationality; AidGrade; Charity Science: Health; 80,000 Hours; Centre for Effective Altruism; Tostan | Periodic donation list documentation | Julia Wise got submissions from multiple donors about their donation plans and put them together in a single post. The goal was to cover people outside of organizations that publish such posts for their employees
CEA Staff Donation Decisions 2016 | 2016-12-06 | Sam Deere | Centre for Effective Altruism | William MacAskill; Michelle Hutchinson; Tara MacAulay; Alison Woodman; Seb Farquhar; Hauke Hillebrandt; Marinella Capriati; Sam Deere; Max Dalton; Larissa Hesketh-Rowe; Michael Page; Stefan Schubert; Pablo Stafforini; Amy Labenz | Centre for Effective Altruism; 80,000 Hours; Against Malaria Foundation; Schistosomiasis Control Initiative; Animal Charity Evaluators; Charity Science Health; New Incentives; Project Healthy Children; Deworm the World Initiative; Machine Intelligence Research Institute; StrongMinds; Future of Humanity Institute; Future of Life Institute; Centre for the Study of Existential Risk; Effective Altruism Foundation; Sci-Hub; Vote.org; The Humane League; Foundational Research Institute | Periodic donation list documentation | Centre for Effective Altruism (CEA) staff describe their donation plans. The donation amounts are not disclosed.
Join Wall Street. Save the world. | 2013-03-31 | Dylan Matthews | Washington Post | Jason Trigg; Matt Wage; Peter Singer; Jeff Kaufman and Julia Wise | GiveWell top charities; Against Malaria Foundation; GiveDirectly | Miscellaneous commentary | The Washington Post article introduces the concept of "earning to give" to the general public, taking the examples of Jason Trigg and Matt Wage, who deliberately chose careers in finance in order to earn more and give more. It discusses the philosophy of Peter Singer, which has been inspirational to these individuals. The Boston-based couple Jeff Kaufman and Julia Wise, who donate a large amount of money every year to charity, are also discussed.

Full list of donations in reverse chronological order (62 donations)

Donor | Donee | Amount (current USD) | Donation date | Cause area | URL | Notes
Effective Altruism Funds: Long-Term Future Fund | High Impact Policy Engine | 60,000.00 | 2019-08-30 | Effective altruism/government policy | https://app.effectivealtruism.org/funds/far-future/payouts/4UBI3Q0TBGbWcIZWCh4EQV | Donation process: Grantee applied through the online application process, and was selected based on review by the fund managers. Helen Toner was the fund manager most excited about the grant, and responsible for the public write-up

Intended use of funds (category): Direct project expenses

Intended use of funds: According to the grant write-up: "This grant funds part of the cost of a full-time staff member for two years, plus some office and travel costs." Also: "HIPE’s primary activities are researching how to have a positive impact in the UK government; disseminating their findings via workshops, blog posts, etc.; and providing one-on-one support to interested individuals."

Donor reason for selecting the donee: The grant write-up says: "Our reasoning for making this grant is based on our impression that HIPE has already been able to gain some traction as a volunteer organization, and on the fact that they now have the opportunity to place a full-time staff member within the Cabinet Office. [...] The fact that the Cabinet Office is willing to provide desk space and cover part of the overhead cost for the staff member suggests that HIPE is engaging successfully with its core audiences."

Donor reason for donating that amount (rather than a bigger or smaller amount): Explicit calculations for the amount are not included, but the grant write-up says that it funds "part of the cost of a full-time staff member for two years, plus some office and travel costs." At around the same time, the Meta Fund grants $40,000 to HIPE, also to cover these costs. It is likely that the combined $100,000 covers part or all of the cost.
Percentage of total donor spend in the corresponding batch of donations: 13.67%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round, as well as by the opportunity that has been opened by the potential for a two-year job in the UK civil service if HIPE secures funding
Intended funding timeframe in months: 24

Donor thoughts on making further donations to the donee: The write-up says: "HIPE does not yet have robust ways of tracking its impact, but they expressed strong interest in improving their impact tracking over time. We would hope to see a more fleshed-out impact evaluation if we were asked to renew this grant in the future."

Other notes: Helen Toner, the fund manager most excited about the grant and the author of the grant write-up, writes: "I’ll add that I personally see promise in the idea of services that offer career discussion, coaching, and mentoring in more specialized settings. (Other fund members may agree with this, but it was not part of our discussion when deciding whether to make this grant, so I’m not sure.)". Affected countries: United Kingdom.
Effective Altruism Funds: Long-Term Future Fund | Stag Lynn | 23,000.00 | 2019-08-30 | AI safety/upskilling | https://app.effectivealtruism.org/funds/far-future/payouts/4UBI3Q0TBGbWcIZWCh4EQV | Donation process: Grantee applied through the online application process, and was selected based on review by the fund managers. Alex Zhu was the fund manager most excited about the grant, and responsible for the public write-up. Alex Zhu's write-up disclosed a potential conflict of interest because Stag was living with him and helping him with odd jobs. So, comments from Oliver Habryka, another fund manager, are also included

Intended use of funds (category): Living expenses during research project

Intended use of funds: Grantee's "current intention is to spend the next year improving his skills in a variety of areas (e.g. programming, theoretical neuroscience, and game theory) with the goal of contributing to AI safety research, meeting relevant people in the x-risk community, and helping out in EA/rationality related contexts wherever he can (eg, at rationality summer camps like SPARC and ESPR)." Two projects he may pursue include (1) working to implement certificates of impact in the EA/X-risk community, (2) working as an unpaid personal assistant to someone in EA who is sufficiently busy for this form of assistance to be useful, and sufficiently productive for the assistance to be valuable

Donor reason for selecting the donee: Alex Zhu, the fund manager most excited about the grant, writes: "I recommended funding Stag because I think he is smart, productive, and altruistic, has a track record of doing useful work, and will contribute more usefully to reducing existential risk by directly developing his capabilities and embedding himself in the EA community than he would by finishing his undergraduate degree or working a full-time job." Oliver Habryka, another fund manager, writes: "I’ve interacted with Stag in the past and have broadly positive impressions of him, in particular his capacity for independent strategic thinking." He cites Stag's success in Latvian and Galois Mathematics Olympiads, and Stag's contributions to improving ESPR and SPARC, as well as Stag's decision to contribute to those projects, taking this as "another signal of Stag’s talent at selecting and/or improving projects."

Donor reason for donating that amount (rather than a bigger or smaller amount): No amount-specific reason given, but the amount is likely selected to cover a reasonable fraction of living costs for a year
Percentage of total donor spend in the corresponding batch of donations: 5.24%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round
Intended funding timeframe in months: 12
Effective Altruism Funds: Long-Term Future Fund | Roam Research | 10,000.00 | 2019-08-30 | Rationality improvement | https://app.effectivealtruism.org/funds/far-future/payouts/4UBI3Q0TBGbWcIZWCh4EQV | Donation process: Grantee applied through the online application process, and was selected based on review by the fund managers. Alex Zhu was the fund manager most excited about the grant, and responsible for the public write-up

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to support the continued development of Roam, a web application from Conor White-Sullivan filling a similar niche as Workflowy. Roam automates the Zettelkasten method, "a note-taking / document-drafting process based on physical index cards." The grant write-up says: "This funding will support Roam’s general operating costs, including expenses for Conor, one employee, and several contractors."

Donor reason for selecting the donee: Fund manager Alex Zhu writes: "On my inside view, if Roam succeeds, an experienced user of the note-taking app Workflowy will get at least as much value switching to Roam as they got from using Workflowy in the first place. (Many EAs, myself included, see Workflowy as an integral part of our intellectual process, and I think Roam might become even more integral than Workflowy.)" He links to Sarah Constantin's posts on Roam: https://www.facebook.com/sarah.constantin.543/posts/242611079943317 and https://srconstantin.posthaven.com/how-to-make-a-memex
Percentage of total donor spend in the corresponding batch of donations: 100.00%.
Effective Altruism Funds: Long-Term Future Fund | Alexander Gietelink Oldenziel | 30,000.00 | 2019-08-30 | AI safety | https://app.effectivealtruism.org/funds/far-future/payouts/4UBI3Q0TBGbWcIZWCh4EQV | Donation process: Grantee applied through the online application process, and was selected based on review by the fund managers. Alex Zhu was the fund manager most excited about the grant, and responsible for the public write-up

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to support the work of Alexander Gietelink Oldenziel who is interning at the Machine Intelligence Research Institute (MIRI) at the time of the grant. The grant money provides additional resources for the grantee to continue digging deeper into the topics after his internship at MIRI ends (while staying in regular contact with MIRI researchers); the write-up estimates that it will last him 1.5 years.

Donor reason for selecting the donee: The reasons are roughly similar to the Long-Term Future Fund's past reasons for supporting MIRI and its research agenda, as outlined in the April 2019 report https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl. Also, Alex Zhu says in the grant write-up: "I have also spoken to him in some depth, and was impressed both by his research taste and clarity of thought."

Donor reason for donating that amount (rather than a bigger or smaller amount): Amount chosen to be sufficient to allow the grantee to continue digging into AI safety for 1.5 years after his internship with MIRI ends
Percentage of total donor spend in the corresponding batch of donations: 6.83%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round, and also by the grantee's internship with MIRI coming to an end
Intended funding timeframe in months: 18
Effective Altruism Funds: Long-Term Future Fund | Alexander Siegenfeld | 20,000.00 | 2019-08-30 | AI safety/deconfusion research | https://app.effectivealtruism.org/funds/far-future/payouts/4UBI3Q0TBGbWcIZWCh4EQV | Donation process: Grantee applied through the online application process, and was selected based on review by the fund managers. Alex Zhu was the fund manager most excited about the grant, and responsible for the public write-up

Intended use of funds (category): Living expenses during research project

Intended use of funds: Grant for "Characterizing the properties and constraints of complex systems and their external interactions." Specifically, per the write-up: "His goal is to get a better conceptual understanding of multi-level world models by coming up with better formalisms for analyzing complex systems at differing levels of scale, building off of the work of Yaneer Bar-Yam." Also: "Alexander plans to publish a paper on his research; it will be evaluated by researchers at MIRI, helping him decide how best to pursue further work in this area."

Donor reason for selecting the donee: Alex Zhu says in the grant write-up: "I decided to recommend funding to Alexander because I think his research directions are promising, and because I was personally impressed by his technical abilities and his clarity of thought. Tsvi Benson-Tilsen, a MIRI researcher, was also impressed enough by Alexander to recommend that the Fund support him." A conflict of interest is also declared: "Alexander and I have been friends since our undergraduate years at MIT."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round
Percentage of total donor spend in the corresponding batch of donations: 100.00%.
Effective Altruism Funds: Meta Fund | 80,000 Hours | 200,000.00 | 2019-08-23 | Effective altruism/movement growth/career counseling | https://app.effectivealtruism.org/funds/ea-community/payouts/3pxoLG7aRWtETC1lECC6LK | Donation process: Part of the July 2019 EA Meta Fund grants round. The grant page says: "7 of the 9 grantees in this round applied through this process." However, it is not clear if 80,000 Hours submitted an application; it may well be one of the two grantees that did not. It also seems that 80,000 Hours is considered as a grantee in each grant round; it received grants in all three previous grant rounds

Intended use of funds (category): Organizational general support

Intended use of funds: 80,000 Hours plans to hire 5 full-time equivalents (FTE) over 2019 and another 5 over 2020. The additional funding gives them the budget needed to pay for these hires

Donor reason for selecting the donee: Donor believes 80,000 Hours is one of the highest impact-per-dollar meta organizations, for reasons similar to those explained in previous grant rounds https://app.effectivealtruism.org/funds/ea-community/payouts/2dyBJqJBSIq6sAGU6gMYQW (November 2018) and https://app.effectivealtruism.org/funds/ea-community/payouts/1hVfcvrzRbpXUWYht4bu3b (March 2019)

Donor reason for donating that amount (rather than a bigger or smaller amount): The amount is similar to amounts granted in previous grant rounds; it is probably at the larger end of the amount the Meta Fund feels comfortable granting to a single organization in one grant round (the total amount granted in the grant round is $466,000, so this amount is around 43% of that total; see the arithmetic sketch after this entry). Also, the amount ($200,000) fills half the funding shortfall of 80,000 Hours from the 2018 fundraiser (of $400,000)
Percentage of total donor spend in the corresponding batch of donations: 42.90%

Donor reason for donating at this time (rather than earlier or later): The EA Meta Fund team is excited to see 80,000 Hours expand headcount and operations. Currently, the expansion is bottlenecked on money: "Right now, they cannot fully commit to hiring in 2020 as their expansion budget has not been filled. Ideally, they would already be searching for those hires, so they are being somewhat slowed down by their lack of funding."
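The "percentage of total donor spend" figures for this round can be recomputed from the grant amounts listed on this page. Below is a minimal sketch (Python written for this write-up, not code from the portal's repository) that derives each grantee's share of the July 2019 EA Meta Fund round; the $200,000 grant to 80,000 Hours comes out to roughly 42.9% of the $466,000 round total, consistent with the "around 43%" noted above, with any small differences from the stated percentages presumably due to rounding in the underlying data.

```python
# Illustrative recomputation of "percentage of total donor spend in the
# corresponding batch of donations" for the July 2019 EA Meta Fund round.
# Amounts are the grants listed on this page (USD).
grants = {
    "80,000 Hours": 200_000,
    "Effective Altruism Community Building Grants": 120_000,
    "High Impact Policy Engine": 40_000,
    "Generation Pledge": 30_000,
    "EA Coaching": 23_000,
    "Rethink Charity": 15_000,
    "Rethink Priorities": 12_000,
    "RC Forward": 11_000,
    "Effective Thesis": 10_000,
    "Centre for Effective Altruism": 5_000,
}

round_total = sum(grants.values())  # 466,000
for donee, amount in grants.items():
    share = 100 * amount / round_total
    print(f"{donee}: {share:.2f}%")  # e.g. 80,000 Hours -> 42.92%
```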
Effective Altruism Funds: Meta Fund | Rethink Priorities | 12,000.00 | 2019-08-23 | Cause prioritization | https://app.effectivealtruism.org/funds/ea-community/payouts/3pxoLG7aRWtETC1lECC6LK | Donation process: Part of the July 2019 EA Meta Fund grants round. The grant page says: "7 of the 9 grantees in this round applied through this process." From the description, it appears that Rethink Priorities is one of the 7 grantees who applied

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant covers two projects: (1) the geopolitical implications of climate change on mass migration, and (2) the cost-effectiveness of the Treaty on the Prohibition of Nuclear Weapons

Donor reason for selecting the donee: The grant page references the March 2019 grant https://app.effectivealtruism.org/funds/ea-community/payouts/1hVfcvrzRbpXUWYht4bu3b and says: "we believe the reasons behind this grant still stand." Further: "We noted in the previous round that we expect there is particular value in Rethink Priorities undertaking commissioned research into areas that are neglected by other researchers. In this round, Rethink Priorities applied for funding to support two specific research projects, both of which were suggested by researchers at other high-impact organizations, and require collaboration with those researchers. We view this as an indication that these projects will provide value."

Donor reason for donating that amount (rather than a bigger or smaller amount): Likely based on amount requested by donee and needed for the two projects
Percentage of total donor spend in the corresponding batch of donations: 2.70%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round, also by the readiness of these projects to be funded
Effective Altruism Funds: Meta Fund | Effective Altruism Community Building Grants | 120,000.00 | 2019-08-23 | Effective altruism/movement growth | https://app.effectivealtruism.org/funds/ea-community/payouts/3pxoLG7aRWtETC1lECC6LK | Donation process: Part of the July 2019 EA Meta Fund grants round. The grant page says: "7 of the 9 grantees in this round applied through this process." However, it is not clear if Effective Altruism Community Building Grants submitted an application; it may well be one of the two grantees that did not.

Intended use of funds (category): Direct project expenses

Intended use of funds: The money will add to the pot of money available to Effective Altruism Community Building Grants to disburse

Donor reason for selecting the donee: The grant page says: "This is an early-stage grant; we expect the experimental value to be greater than the direct impact of the grant." Specifically, community building grants seem impotant, and the EA Meta Fund committee believes that decisions on how to allocate community building grants are best handled by the Effective Altruism Community Building Grants. Also: "While the program is relatively new, having been launched 18 months ago, the team has carried out some initial impact assessment; so far, the results seem to be positive."

Donor reason for donating that amount (rather than a bigger or smaller amount): The amount is likely chosen with the grant's "early-stage" nature in mind
Percentage of total donor spend in the corresponding batch of donations: 26.90%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round, as well as relative maturity of EA CBG and completion of its initial impact assessment

Donor thoughts on making further donations to the donee: The grant page says: "In the future, if we continue to support EA CBG, we plan to dig deeper into their individual grant outcomes and to further discuss the evaluation process and evidence of impact with the CBG team."

Other notes: The grant page says: "As part of our decision to write this grant, we have referred all our highest-potential community building grants to EA CBG.".
Effective Altruism Funds: Meta Fund | High Impact Policy Engine | 40,000.00 | 2019-08-23 | Effective altruism/government policy | https://app.effectivealtruism.org/funds/ea-community/payouts/3pxoLG7aRWtETC1lECC6LK | Donation process: Part of the July 2019 EA Meta Fund grants round. The grant page says: "7 of the 9 grantees in this round applied through this process." It is not clear whether HIPE is one of the 7 that applied

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to support HIPE in its fundraising efforts to cover salary and travel costs for one full-time employee for 2 years in the UK government.

Donor reason for selecting the donee: The grant page says: "If HIPE can demonstrate value to the UK government department (e.g., through improving policy-making, staff wellbeing, or staff retention), HIPE believes they would have a reasonably strong chance of being made a permanent project fully funded by the government." Further: "This grant is made experimentally, largely on the promise of the idea and the reputation of the two founding volunteers"

Donor reason for donating that amount (rather than a bigger or smaller amount): The amount is less than, but covers a nontrivial fraction of, the two-year salary and travel costs for the one full-time employee the grant is intended to support
Percentage of total donor spend in the corresponding batch of donations: 8.58%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round, as well as by the opportunity that has been opened by the potential for a two-year job in the UK civil service if HIPE secures funding
Intended funding timeframe in months: 24

Donor thoughts on making further donations to the donee: The grant page says: "Any further funding will be sensitive to the strength of their team in the future." Affected countries: United Kingdom.
Effective Altruism Funds: Meta Fund | Generation Pledge | 30,000.00 | 2019-08-23 | Effective altruism/movement growth | https://app.effectivealtruism.org/funds/ea-community/payouts/3pxoLG7aRWtETC1lECC6LK | Donation process: Part of the July 2019 EA Meta Fund grants round. The grant page says: "7 of the 9 grantees in this round applied through this process." It is not clear whether Generation Pledge is one of the 7 that applied

Intended use of funds (category): Organizational general support

Intended use of funds: The grant page says: "Generation Pledge mirrors the Founders Pledge model, but working with next gens rather than founders. The world's ultra high net worth families collectively own $31.5 trillion. Several thousand next gens will inherit this wealth. Generation Pledge aims to support those next gens to maximise their social impact."

Donor reason for selecting the donee: The grant page says: "[I]n the past few months they have grown their pledger community significantly. They have an expected pledge value of over $300 million, with a number of sensible discount factors applied. We think that these positive updates are enough to justify Generation Pledge being funded through to a later stage, where they will have the opportunity to prove they can turn these pledges into donations to effective charities."

Donor reason for donating that amount (rather than a bigger or smaller amount): The grant page says: "This grant will contribute towards Generation Pledge's immediate funding gap for 2019, giving them more runway to fundraise and further grow their pledger base."
Percentage of total donor spend in the corresponding batch of donations: 6.44%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round, as well as by the progress made by Generation Pledge in getting $300 million pledged

Donor thoughts on making further donations to the donee: The grant page talks of a "later stage, where they will have the opportunity to prove they can turn these pledges into donations to effective charities." It is likely that Generation Pledge will be considered for further grants if it is able to turn pledges into donations
Effective Altruism Funds: Meta Fund | EA Coaching | 23,000.00 | 2019-08-23 | Effective altruism/personal coaching | https://app.effectivealtruism.org/funds/ea-community/payouts/3pxoLG7aRWtETC1lECC6LK | Donation process: Part of the July 2019 EA Meta Fund grants round. The grant page says: "7 of the 9 grantees in this round applied through this process." It is not clear whether EA Coaching is one of the 7 that applied

Intended use of funds (category): Organizational general support

Intended use of funds: Grantee is a productivity coaching organization, currently with just one person, Lynette Bye. The grant page says: "This grant will allow Lynette to offer coaching calls to people working at high-impact organizations at a highly subsidized rate, offer free coaching for select referrals from 80,000 Hours, and hire contractors to help create materials to scale her coaching."

Donor reason for selecting the donee: Grant based on Lynette's impact evaluation at https://effectivealtruismcoaching.com/results. The grant page says: "Given the early-stage nature of her project, we found the results fairly compelling. A number of her clients working at high-impact organizations have reported significant increases in their hours of productive work." Also: "Lynette focuses on clients working in AI alignment and at high-impact "meta" organizations. She has previously worked with employees at FHI, the Open Philanthropy Project, CEA, CHAI, MIRI, DeepMind, and the Forethought Foundation, and she expects to continue to do so. Given that these organizations focus on particularly high-impact areas, we expect that increasing their productivity should be very valuable."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round
Percentage of total donor spend in the corresponding batch of donations: 100.00%.
Effective Altruism Funds: Meta Fund | Rethink Charity | 15,000.00 | 2019-08-23 | Effective altruism/movement growth | https://app.effectivealtruism.org/funds/ea-community/payouts/3pxoLG7aRWtETC1lECC6LK | Donation process: Part of the July 2019 EA Meta Fund grants round. The grant page says: "7 of the 9 grantees in this round applied through this process." It is not clear whether Rethink Charity is one of the 7 that applied

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to design and run an updated version of the EA Survey in 2019. The grant page says: "Rethink has been collaborating with CEA and other meta organizations to gather input on the 2019 survey. Like last year, Rethink will release anonymized data and summary survey output." There is an accompanying $5,000 grant to CEA, which will use the funding to pay Rethink for "bespoke analysis of the results."

Donor reason for selecting the donee: The grant page says: "[I]it is valuable to generate more empirical data to inform movement-building strategy. In particular, we are aware that some high-impact organizations consistently collaborate with the researchers who analyze the survey, in order to inform their organizational and community-building strategies. [...] Maintaining a sufficiently large survey and in-depth analysis seems like one of the better methods to generate data and insights [about the broad effective altruism movement]."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round. Also, the grant page says: "We think part of the benefit of the survey is ensuring data that is collected on a regular basis to allow for consistent year-on-year comparisons. Funding at this stage will allow the survey and analysis to be carried out this year in line with the annual publishing cycle."
Intended funding timeframe in months: 12
Percentage of total donor spend in the corresponding batch of donations: 100.00%.
Effective Altruism Funds: Meta Fund | Centre for Effective Altruism | 5,000.00 | 2019-08-23 | Effective altruism/movement growth | https://app.effectivealtruism.org/funds/ea-community/payouts/3pxoLG7aRWtETC1lECC6LK | Donation process: Part of the July 2019 EA Meta Fund grants round. The grant page says: "7 of the 9 grantees in this round applied through this process." CEA is not a direct grantee, but an intermediary for part of a grant; it did not apply in this application process

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant for CEA to use to pay Rethink Charity for "bespoke analysis of the results" of the 2019 EA Survey. This accompanies a separate $15,000 grant to Rethink Charity for the 2019 EA Survey.

Donor reason for selecting the donee: The grant page says: "This financial structure is meant to ensure that a significant fraction of Rethink's analysis will be directly relevant to what CEA is looking to learn from the survey."

Donor reason for donating that amount (rather than a bigger or smaller amount): The amount of $5,000, compared to $15,000 directly granted to Rethink Charity to conduct the survey, probably reflects the balance of influence that the Meta Fund managers want to strike between CEA and Rethink Charity over the priorities of the EA Survey
Percentage of total donor spend in the corresponding batch of donations: 1.07%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round. Also, the grant page says: "We think part of the benefit of the survey is ensuring data that is collected on a regular basis to allow for consistent year-on-year comparisons. Funding at this stage will allow the survey and analysis to be carried out this year in line with the annual publishing cycle."
Intended funding timeframe in months: 12
Effective Altruism Funds: Meta Fund | RC Forward | 11,000.00 | 2019-08-23 | Effective altruism/fundraising | https://app.effectivealtruism.org/funds/ea-community/payouts/3pxoLG7aRWtETC1lECC6LK | Donation process: Part of the July 2019 EA Meta Fund grants round. The grant page says: "7 of the 9 grantees in this round applied through this process." It is not clear whether RC Forward is one of the 7 that applied

Intended use of funds (category): Organizational general support

Intended use of funds: Grantee, RC Forward, is a subsidiary of Rethink Charity that specializes in helping donors in Canada make tax-deductible donations. In 2018, it moved $3.3 million to 25 effective charities with just $50,000 in operational expenses

Donor reason for selecting the donee: The grant page says: "RC Forward still looks to be a relatively strong donation opportunity even when applying the most pessimistic estimates. In 2018, RC Forward is expected to have moved at least $3 for every $1 spent." Also: "Even if only 2% of the total money moved through the platform last year was directly caused by RC Forward, they would beat the break-even point." Further: "Although any potential scale-up is limited to the Canadian market, the platform could process many more donations within that market without a proportional rise in costs." It also links to RC Forward's own analysis at https://forum.effectivealtruism.org/posts/n2Y7z5wvjTuNn5TYg/cost-effectiveness-of-rc-forward (GW, IR)

Donor reason for donating that amount (rather than a bigger or smaller amount): The grant page says: "This grant will fill RC Forward's immediate remaining funding gap for 2019."
Percentage of total donor spend in the corresponding batch of donations: 2.36%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round, and by RC Forward getting close to Giving Season 2019
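As a quick check of the 2% break-even claim quoted in the RC Forward entry above, here is a back-of-the-envelope sketch using only the figures given on this page ($3.3 million moved in 2018, $50,000 in operational expenses, and a pessimistic 2% counterfactual attribution); it is an illustration, not an analysis from the grant page or the portal's codebase.

```python
# Back-of-the-envelope check of the quoted claim that even if only 2% of the
# money moved through RC Forward in 2018 was directly caused by RC Forward,
# it would beat the break-even point. Figures are taken from this page.
money_moved = 3_300_000      # total donations moved in 2018 (USD)
operating_costs = 50_000     # 2018 operational expenses (USD)
pessimistic_share = 0.02     # assume only 2% of money moved is counterfactually caused

counterfactual_money_moved = pessimistic_share * money_moved  # 66,000
print(counterfactual_money_moved > operating_costs)  # True: clears the $50,000 break-even point
```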
Effective Altruism Funds: Meta Fund | Effective Thesis | 10,000.00 | 2019-08-23 | Effective altruism/research guidance | https://app.effectivealtruism.org/funds/ea-community/payouts/3pxoLG7aRWtETC1lECC6LK | Donation process: Part of the July 2019 EA Meta Fund grants round. The grant page says: "7 of the 9 grantees in this round applied through this process." It is not clear whether Effective Thesis is one of the 7 that applied

Intended use of funds (category): Organizational general support

Intended use of funds: The grant page says: "Effective Thesis advises students on choosing an impactful research topic for their thesis by connecting them with experienced researchers." Further: "Providing students with thesis topic advice requires a small level of input from an experienced researcher, usually 1-3 hours per student." And: "Effective Thesis has a network of 44 coaches with the capacity to take on at least 2x as many students as they help now."

Donor reason for selecting the donee: The grant page says: "Since September 2018, Effective Thesis has received over 200 applications from students who want help with figuring out their thesis topic and provided 60 students with coaching. We share their expectation that a small number of cases likely account for the majority of the project's impact. The best cases so far appear to have been cases where students - who already planned to become researchers in the long-term - substantially changed their focus to higher-impact topics primarily because of their work with Effective Thesis."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round
Percentage of total donor spend in the corresponding batch of donations: 100.00%.
Effective Altruism Funds: Long-Term Future Fund | Machine Intelligence Research Institute | 50,000.00 | 2019-04-07 | AI safety | https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl | Donation process: Donee submitted grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted)

Intended use of funds (category): Organizational general support

Donor reason for selecting the donee: Grant investigator and influencer Oliver Habryka believes that MIRI is making real progress in its approach of "creating a fundamental piece of theory that helps humanity to understand a wide range of powerful phenomena." He notes that MIRI started work on the alignment problem long before it became cool, which gives him more confidence that they will do the right thing and that even their seemingly weird actions may be justified in ways that are not yet obvious. He also thinks that both the research team and the ops staff are quite competent

Donor reason for donating that amount (rather than a bigger or smaller amount): Habryka offers the following reasons for giving a grant of just $50,000, which is small relative to the grantee's budget: (1) MIRI is in a solid position funding-wise, and marginal use of money may be lower-impact. (2) There is a case for investing in helping grow a larger and more diverse set of organizations, as opposed to putting money in a few stable and well-funded organizations.
Percentage of total donor spend in the corresponding batch of donations: 5.42%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round

Donor thoughts on making further donations to the donee: Oliver Habryka writes: "I can see arguments that we should expect additional funding for the best teams to be spent well, even accounting for diminishing margins, but on the other hand I can see many meta-level concerns that weigh against extra funding in such cases. Overall, I find myself confused about the marginal value of giving MIRI more money, and will think more about that between now and the next grant round."

Other notes: The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR). Despite his positive assessment, Habryka recommends a relatively small grant to MIRI, because MIRI is already relatively well-funded and not heavily bottlenecked on funding; he says he will think more about the marginal value of additional funding before the next grant round.
Effective Altruism Funds: Long-Term Future Fund | Center for Applied Rationality | 150,000.00 | 2019-04-07 | Rationality improvement | https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl | Donation process: Donee submitted grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted)

Intended use of funds (category): Organizational general support

Intended use of funds: The grant is to help the Center for Applied Rationality (CFAR) survive as an organization for the next few months (i.e., till the next grant round, which is 3 months later) without having to scale down operations. CFAR is low on finances because they did not run a 2018 fundraiser; they felt that running a fundraiser would be in bad taste after what they considered a mishandling on their part of the Brent Dill situation

Donor reason for selecting the donee: Grant investigator and main influencer Oliver Habryka thinks CFAR intro workshops have had positive impact in 3 ways: (1) establishing epistemic norms, (2) training, and (3) recruitment into the X-risk network (especially AI safety). He also thinks CFAR faces many challenges, including the departure of many key employees, the difficulty of attracting top talent, and a dilution of its truth-seeking focus. However, he is enthusiastic about joint CFAR/MIRI workshops for programmers, where CFAR provides instructors. His final reason for donating is to avoid CFAR having to scale down due to its funding shortfall because it didn't run the 2018 fundraiser

Donor reason for donating that amount (rather than a bigger or smaller amount): The grant amount, which is the largest in this grant round from the EA Long-Term Future Fund, is chosen to be sufficient for CFAR to continue operating as usual till the next grant round from the EA Long-Term Future Fund (in about 3 months). Habryka further elaborates in https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-recommendations#uhH4ioNbdaFrwGt4e (GW, IR) in reply to Milan Griffes, explaining why the grant is large and unrestricted
Percentage of total donor spend in the corresponding batch of donations: 16.25%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round, as well as by CFAR's time-sensitive financial situation; the grant round is a few months after the end of 2018, so the shortfall in funds raised because of not conducting the 2018 fundraiser is starting to hit CFAR's finances
Intended funding timeframe in months: 3

Donor thoughts on making further donations to the donee: Grant investigator and main influencer Oliver Habryka writes: "I didn’t have enough time this grant round to understand how the future of CFAR will play out; the current grant amount seems sufficient to ensure that CFAR does not have to take any drastic action until our next grant round. By the next grant round, I plan to have spent more time learning and thinking about CFAR’s trajectory and future, and to have a more confident opinion about what the correct funding level for CFAR is."

Other notes: The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR) In the comments, Milan Griffes asks why such a large, unrestricted grant is being made to CFAR despite these concerns, and also what Habryka hopes to learn about CFAR before the next grant round. There are replies from Peter McCluskey and Habryka, with some further comment back-and-forth.
Effective Altruism Funds: Long-Term Future Fund | Ought | 50,000.00 | 2019-04-07 | AI safety | https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl | Donation process: Donee submitted grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted)

Intended use of funds (category): Organization financial buffer

Intended use of funds: No specific information is shared on how the funds will be used at the margin, but the general description gives an idea: "Ought is a nonprofit aiming to implement AI alignment concepts in real-world applications"

Donor reason for selecting the donee: Donor is explicitly interested in diversifying the funder base for the donee, which currently receives almost all its funding from only two sources and is trying to change that. Otherwise, the reasons are the same as in the previous grant round https://app.effectivealtruism.org/funds/far-future/payouts/3JnNTzhJQsu4yQAYcKceSi namely: "We believe that Ought’s approach is interesting and worth trying, and that they have a strong team. [...] Part of the aim of the grant is to show Ought as an example of the type of organization we are likely to fund in the future."

Donor reason for donating that amount (rather than a bigger or smaller amount): In write-up for previous grant at https://app.effectivealtruism.org/funds/far-future/payouts/3JnNTzhJQsu4yQAYcKceSi of $10,000, donor says: "Our understanding is that hiring is currently more of a bottleneck for them than funding, so we are only making a small grant." The amount this time is bigger ($50,000) but the general principle likely continues to apply
Percentage of total donor spend in the corresponding batch of donations: 5.42%

Donor reason for donating at this time (rather than earlier or later): In the previous grant round, donor had said "Part of the aim of the grant is to show Ought as an example of the type of organization we are likely to fund in the future." Thus, it makes sense to donate again in this round

Other notes: The grant reasoning is written up by Matt Wage and is also included in the cross-post of the grant decision to the Effective Altruism Forum at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR) but the comments on the post do not discuss this specific grant.
Effective Altruism Funds: Long-Term Future Fund | Foretold | 70,000.00 | 2019-04-07 | Forecasting | https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl | Donation process: Donee submitted grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted)

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant will be mainly used by Ozzie Gooen to pay programmers to work on Foretold at http://www.foretold.io/, a forecasting application that handles full probability distributions. This includes work on Ken.js, a private version of Wikidata that Gooen has started integrating with Foretold

Donor reason for selecting the donee: Grant investigator and main influencer Oliver Habryka gives these reasons for the grant, as well as other forecasting-related grants made to Anthony Aguirre (Metaculus) and Jacob Lagerros: (1) confusion about what is progress and what problems need solving, (2) need for many people to collaborate and document, (3) low-hanging fruit in designing better online platforms for making intellectual progress -- Habryka works on LessWrong 2.0 for that reason, and Gooen has past experience in the space with his building of Guesstimate, (4) promise and tractability for forecasting platforms in particular (for instance, work by Philip Tetlock and work by Robin Hanson), (5) Even though some platforms, such as Predictionbook and Guesstimate, did not get the traction they expected, others like the Good Judgment Project have been successful, so one should not overgeneralize from a few failures. In addition, Habryka has a positive impression of Gooen in both in-person interaction and online writing

Donor reason for donating that amount (rather than a bigger or smaller amount): Likely to be the amount requested by the donee in the application (this is not stated explicitly by either the donor or the donee)
Percentage of total donor spend in the corresponding batch of donations: 7.58%

Donor reason for donating at this time (rather than earlier or later): Timing determined partly by timing of grant round. Gooen was a recipient of a previous $20,000 grant from the same fund (the EA Long-Term Future Fund) and found the money very helpful. He applied for more money in this round to scale the project up further

Other notes: The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR) The comments discuss this and the other forecasting grants, and include the question "why are you acting as grant-givers here rather than as special interest investors?" It is also included in a list of potentially concerning grants in a portfolio evaluation comment https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions#d4YHzSJnNWmyxf6HM (GW, IR) by Evan Gaensbauer.
Effective Altruism Funds: Long-Term Future Fund | Effective Altruism Zürich | 17,900.00 | 2019-04-07 | AI safety | https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl | Donation process: Donee submitted grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted)

Intended use of funds (category): Direct project expenses

Intended use of funds: A two-day workshop by Alex Lintz and collaborators from EA Zürich for effective altruists interested in AI governance careers, with the goals of giving participants background on the space, offering career advice, and building community.

Donor reason for selecting the donee: Donor writes: "We agree with their assessment that this space is immature and hard to enter, and believe their suggested plan for the workshop looks like a promising way to help participants orient to careers in AI governance."

Donor reason for donating that amount (rather than a bigger or smaller amount): Likely to be the amount requested by the donee in the application (this is not stated explicitly by either the donor or the donee)
Percentage of total donor spend in the corresponding batch of donations: 1.93%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round. No specific timing-related considerations are discussed
Intended funding timeframe in months: 1

Other notes: The grant reasoning is written up by Helen Toner and is also included in the cross-post of the grant decision to the Effective Altruism Forum at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR) but the comments on the post do not discuss this specific grant.
Effective Altruism Funds: Long-Term Future Fund | Tessa Alexanian | 26,250.00 | 2019-04-07 | Biosecurity and pandemic preparedness | https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl | Donation process: Donee submitted grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted)

Intended use of funds (category): Direct project expenses

Intended use of funds: A one day biosecurity summit, immediately following the SynBioBeta industry conference.

Donor reason for donating that amount (rather than a bigger or smaller amount): Likely to be the amount requested by the donee in the application (this is not stated explicitly by either the donor or the donee)
Percentage of total donor spend in the corresponding batch of donations: 2.84%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round. No specific timing-related considerations are discussed
Intended funding timeframe in months: 1

Other notes: The grant reasoning is written up by Matt Wage and is also included in the cross-post of the grant decision to the Effective Altruism Forum at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR) but the comments on the post do not discuss this specific grant.
Effective Altruism Funds: Long-Term Future Fund | Shahar Avin | 40,000.00 | 2019-04-07 | AI safety | https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl | Donation process: Donee submitted grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted)

Intended use of funds (category): Direct project expenses

Intended use of funds: Hiring an academic research assistant and other miscellaneous research expenses, for scaling up scenario role-play for AI strategy research and training.

Donor reason for selecting the donee: Donor writes: "We think positively of Shahar’s past work (for example this report), and multiple people we trust recommended that we fund him." The linked report is https://maliciousaireport.com/

Donor reason for donating that amount (rather than a bigger or smaller amount): Likely to be the amount requested by the donee in the application (this is not stated explicitly by either the donor or the donee)
Percentage of total donor spend in the corresponding batch of donations: 4.33%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round. No specific timing-related considerations are discussed

Other notes: The grant reasoning is written up by Matt Wage and is also included in the cross-post of the grant decision to the Effective Altruism Forum at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR) but the comments on the post do not discuss this specific grant.
Effective Altruism Funds: Long-Term Future Fund | Lucius Caviola | 50,000.00 | 2019-04-07 | Effective altruism/long-termism | https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl | Donation process: Donee submitted grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted). Donee also applied to the EA Meta Fund (another of the Effective Altruism Funds) and the total funding for the donee was split between the funds

Intended use of funds (category): Living expenses during research project

Intended use of funds: Part of the costs for a 2-year postdoc at Harvard working with Professor Joshua Greene. Grantee plans to study the psychology of effective altruism and long-termism. The funding from the Long-Term Future Fund is roughly intended to cover the part of the costs that corresponds to the work on long-termism

Donor reason for donating that amount (rather than a bigger or smaller amount): Total funding requested by the donee appears to be $130,000. Of this, $80,000 is provided by the EA Meta Fund in their March 2019 grant round https://app.effectivealtruism.org/funds/ea-community/payouts/1hVfcvrzRbpXUWYht4bu3b to cover the donee's work on effective altruism, while the remaining $50,000 is provided through this grant by the Long-Term Future Fund, and covers the work on long-termism. The reason for splitting funding in this way is not articulated
Percentage of total donor spend in the corresponding batch of donations: 5.42%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round. No specific timing-related considerations are discussed. However, the write-up for the $80,000 grant provided by the EA Meta Fund https://app.effectivealtruism.org/funds/ea-community/payouts/1hVfcvrzRbpXUWYht4bu3b calls the grant a "time-bounded, specific opportunity that requires funding to initiate and explore" and similar reasoning may also apply to the $50,000 Long-Term Future Fund grant
Intended funding timeframe in months: 24

Other notes: The grant reasoning is written up by Matt Wage and is also included in the cross-post of the grant decision to the Effective Altruism Forum at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR) but the comments on the post do not discuss this specific grant.
Donor: Effective Altruism Funds: Long-Term Future Fund | Donee: Nikhil Kunapuli | Amount: $30,000.00 | Date: 2019-04-07 | Cause area: AI safety/deconfusion research | URL: https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl

Donation process: Donee submitted grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted)

Intended use of funds (category): Living expenses during research project

Intended use of funds: Grantee is doing independent deconfusion research for AI safety. His approach is to develop better foundational understandings of various concepts in AI safety, like safe exploration and robustness to distributional shift, by exploring these concepts in complex systems science and theoretical biology, domains outside of machine learning to which these concepts also apply.

Donor reason for selecting the donee: Fund manager Alex Zhu says: "I recommended that we fund Nikhil because I think Nikhil’s research directions are promising, and because I personally learn a lot about AI safety every time I talk with him."

Donor reason for donating that amount (rather than a bigger or smaller amount): Likely to be the amount requested by the donee in the application (this is not stated explicitly by either the donor or the donee)
Percentage of total donor spend in the corresponding batch of donations: 3.25%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round. No specific timing-related considerations are discussed

Donor thoughts on making further donations to the donee: Alex Zhu, in his grant write-up, says that the quality of the work will be assessed by researchers at MIRI. Although it is not explicitly stated, it is likely that this evaluation will influence the decision of whether to make further grants

Other notes: The grant reasoning is written up by Alex Zhu and is also included in the cross-post of the grant decision to the Effective Altruism Forum at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR) but the comments on the post do not discuss this specific grant.
Donor: Effective Altruism Funds: Long-Term Future Fund | Donee: Anand Srinivasan | Amount: $30,000.00 | Date: 2019-04-07 | Cause area: AI safety/deconfusion research | URL: https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl

Donation process: Donee submitted grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted)

Intended use of funds (category): Living expenses during research project

Intended use of funds: Grantee is doing independent deconfusion research for AI safety. His angle of attack is to develop a framework that will allow researchers to make provable claims about what specific AI systems can and cannot do, based on factors like their architectures and their training processes.

Donor reason for selecting the donee: Grantee worked with main grant influencer Alex Zhu at an enterprise software company that they cofounded. Alex Zhu says in his grant write-up: "I recommended that we fund Anand because I think Anand’s research directions are promising, and I personally learn a lot about AI safety every time I talk with him."

Donor reason for donating that amount (rather than a bigger or smaller amount): Likely to be the amount requested by the donee in the application (this is not stated explicitly by either the donor or the donee)
Percentage of total donor spend in the corresponding batch of donations: 3.25%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round. No specific timing-related considerations are discussed

Donor thoughts on making further donations to the donee: Alex Zhu, in his grant write-up, says that the quality of the work will be assessed by researchers at MIRI. Although it is not explicitly stated, it is likely that this evaluation will influence the decision of whether to make further grants

Other notes: The quality of grantee's work will be judged by researchers at the Machine Intelligence Research Institute. The grant reasoning is written up by Alex Zhu and is also included in the cross-post of the grant decision to the Effective Altruism Forum at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR) but the comments on the post do not discuss this specific grant.
Donor: Effective Altruism Funds: Long-Term Future Fund | Donee: Effective Altruism Russia | Amount: $28,000.00 | Date: 2019-04-07 | Cause area: Rationality improvement | URL: https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl

Donation process: Donee submitted grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted)

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to Mikhail Yugadin for Effective Altruism Russia to give copies of Harry Potter and the Methods of Rationality to the winners of EGMO 2019 and IMO 2020.

Donor reason for selecting the donee: In the grant write-up, Oliver Habryka explains his evaluation of the grant as based on three questions: (1) What effects does reading HPMOR have on people? (2) How good of a target group are Math Olympiad winners for these effects? (3) Is the team competent enough to execute on their plan?

Donor reason for donating that amount (rather than a bigger or smaller amount): Likely to be the amount requested by the donee in the application (this is not stated explicitly by either the donor or the donee). The comments include more discussion of the unit economics of the grant, and whether the effective cost of $43/copy is reasonable
Percentage of total donor spend in the corresponding batch of donations: 3.03%
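
For the $43/copy figure debated in the comments, a quick back-of-the-envelope check (a sketch based only on the figures mentioned in this entry, not on the grant write-up itself):

```python
# Back-of-the-envelope check on the per-copy cost discussed in the comments.
grant_amount_usd = 28_000   # amount of this grant, as listed above
cost_per_copy_usd = 43      # effective cost per copy debated in the comments
implied_copies = grant_amount_usd / cost_per_copy_usd
print(f"Implied number of copies: ~{implied_copies:.0f}")  # about 651 copies
```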

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round. No specific timing-related considerations are discussed. The need to secure money in advance of the events for which the money will be used likely affected the timing of the application

Other notes: The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR). There is a lot of criticism and discussion of the grant in the comments.
Donor: Effective Altruism Funds: Long-Term Future Fund | Donee: Alex Turner | Amount: $30,000.00 | Date: 2019-04-07 | Cause area: AI safety/agent foundations | URL: https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl

Donation process: Donee submitted grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted)

Intended use of funds (category): Living expenses during research project

Intended use of funds: Grant for building towards a “Limited Agent Foundations” thesis on mild optimization and corrigibility. Grantee is a third-year computer science PhD student funded by a graduate teaching assistantship; to dedicate more attention to alignment research, he is applying for one or more trimesters of funding (spring term starts April 1).

Donor reason for selecting the donee: In the grant write-up, Oliver Habryka explains that he is excited by (a) Turner's posts to LessWrong reviewing many math textbooks useful for thinking about the alignment problem, (b) Turner not being intimidated by the complexity of the problem, and (c) Turner writing up his thoughts and hypotheses in a clear way, seeking feedback on them early, and making a set of novel contributions to an interesting sub-field of AI Alignment quite quickly (in the form of his work on impact measures, on which he recently collaborated with the DeepMind AI Safety team).

Donor reason for donating that amount (rather than a bigger or smaller amount): Likely to be the amount requested by the donee in the application (this is not stated explicitly by either the donor or the donee)
Percentage of total donor spend in the corresponding batch of donations: 3.25%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round. No specific timing-related considerations are discussed
Intended funding timeframe in months: 4

Other notes: The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR). The comments on the post do not discuss this specific grant.
Donor: Effective Altruism Funds: Long-Term Future Fund | Donee: David Girardo | Amount: $30,000.00 | Date: 2019-04-07 | Cause area: AI safety/deconfusion research | URL: https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl

Donation process: Donee submitted grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted)

Intended use of funds (category): Living expenses during research project

Intended use of funds: Grantee is doing independent deconfusion research for AI safety. His angle of attack is to elucidate the ontological primitives for representing hierarchical abstractions, drawing from his experience with type theory, category theory, differential geometry, and theoretical neuroscience.

Donor reason for selecting the donee: The main investigator and influencer for the grant, Alex Zhu, finds the research directions promising. Tsvi Benson-Tilsen, a MIRI researcher, has also recommended that grantee get funding.

Donor reason for donating that amount (rather than a bigger or smaller amount): Likely to be the amount requested by the donee in the application (this is not stated explicitly by either the donor or the donee)
Percentage of total donor spend in the corresponding batch of donations: 3.25%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round. No specific timing-related considerations are discussed

Donor thoughts on making further donations to the donee: The quality of the grantee's work will be assessed by researchers at MIRI

Other notes: The grant reasoning is written up by Alex Zhu and is also included in the cross-post of the grant decision to the Effective Altruism Forum at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR) but the comments on the post do not discuss this specific grant.
Donor: Effective Altruism Funds: Long-Term Future Fund | Donee: Tegan McCaslin | Amount: $30,000.00 | Date: 2019-04-07 | Cause area: AI safety/forecasting | URL: https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl

Donation process: Donee submitted grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted)

Intended use of funds (category): Living expenses during research project

Intended use of funds: Grant for independent research projects relevant to AI forecasting and strategy, including (but not necessarily limited to) some of the following: (1) Does the trajectory of AI capability development match that of biological evolution? (2) How tractable is long-term forecasting? (3) How much compute did evolution use to produce intelligence? (4) Benchmarking AI capabilities against insects. Short doc on (1) and (2) at https://docs.google.com/document/d/1hTLrLXewF-_iJiefyZPF6L677bLrUTo2ziy6BQbxqjs/edit

Donor reason for selecting the donee: Reasons for the grant from Oliver Habryka, the main influencer, include: (1) It's easier to relocate someone who has already demonstrated trust and skills than to find someone completely new, (2) It's important to give good researchers runway while they find the right place. Habryka notes: "my brief assessment of Tegan’s work was not the reason why I recommended this grant, and if Tegan asks for a new grant in 6 months to focus on solo research, I will want to spend significantly more time reading her output and talking with her, to understand how these questions were chosen and what precise relation they have to forecasting technological progress in AI."

Donor reason for donating that amount (rather than a bigger or smaller amount): Likely to be the amount requested by the donee in the application (this is not stated explicitly by either the donor or the donee). Habryka also mentions that he is interested only in providing limited runway, and would need to assess much more carefully for a more long-term grant
Percentage of total donor spend in the corresponding batch of donations: 3.25%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round. However, it is also related to the grantee's situation (she has just quit her job at AI Impacts, and needs financial runway to continue pursuing promising research projects)
Intended funding timeframe in months: 6

Donor thoughts on making further donations to the donee: The grant investigator Oliver Habryka notes: "if Tegan asks for a new grant in 6 months to focus on solo research, I will want to spend significantly more time reading her output and talking with her, to understand how these questions were chosen and what precise relation they have to forecasting technological progress in AI."

Other notes: The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR). The comments on the post do not discuss this specific grant, but a grant to Lauren Lee that includes somewhat similar reasoning (providing people runway after they leave their jobs, so they can explore better) attracts some criticism.
Donor: Effective Altruism Funds: Long-Term Future Fund | Donee: Metaculus | Amount: $70,000.00 | Date: 2019-04-07 | Cause area: Forecasting | URL: https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl

Donation process: Donee submitted grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted)

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to Anthony Aguirre to expand the Metaculus prediction platform along with its community. Metaculus.com is a fully-functional prediction platform with ~10,000 registered users and >120,000 predictions made to date on more than 1,000 questions. The two major high-priority expansions are: (1) An integrated set of extensions to improve user interaction and information-sharing. This would include private messaging and notifications, private groups, a prediction “following” system to create micro-teams within individual questions, and various incentives and systems for information-sharing. (2) Link questions into a network. Users would express links between questions, from very simple (“notify me regarding question Y when P(X) changes substantially”) to more complex (“Y happens only if X happens, but not conversely”, etc.). Information can also be gleaned from what users actually do.

Donor reason for selecting the donee: The grant investigator and main influencer, Oliver Habryka, refers to reasoning included in the grant to Ozzie Gooen for Foretold, which is made in the same batch of grants and described at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR). He also lists these reasons for liking Metaculus: (1) Valuable service in the past few years, (2) Cooperation with the X-risk space to get answers to important questions

Donor reason for donating that amount (rather than a bigger or smaller amount): The grantee requested $150,000, but Oliver Habryka, the grant investigator, was not confident enough in the grant to recommend the full amount. Some concerns mentioned: (1) Lack of a dedicated full-time resource, (2) Overlap with the Good Judgment Project, which reduces its access to resources and people
Percentage of total donor spend in the corresponding batch of donations: 7.58%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round

Other notes: The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR). The comments discuss this and the other forecasting grants, and include the question "why are you acting as grant-givers here rather than as special interest investors?" It is also included in a list of potentially concerning grants in a portfolio evaluation comment https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions#d4YHzSJnNWmyxf6HM (GW, IR) by Evan Gaensbauer.
Donor: Effective Altruism Funds: Long-Term Future Fund | Donee: Robert Miles | Amount: $39,000.00 | Date: 2019-04-07 | Cause area: AI safety/content creation/video | URL: https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl

Donation process: Donee submitted grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted)

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to create video content on AI alignment. Grantee has a YouTube channel at https://www.youtube.com/channel/UCLB7AzTwc6VFZrBsO2ucBMg (average 20,000 views per video) and also creates videos for the Computerphile channel https://www.youtube.com/watch?v=3TYT1QfdfsM&t=2s (often more than 100,000 views per video)

Donor reason for selecting the donee: Grant investigator and main influencer Oliver Habryka favors the grant for these reasons: (1) Grantee explains AI alignment as primarily a technical problem, not a moral or political problem, (2) Grantee does not politicize AI safety, (3) Grantee's goal is to create interest in these problems from future researchers, and not to simply get as large of an audience as possible. Habryka notes that the grantee is the first skilled person in the X-risk community working full-time on producing video content. "Being the very best we have in this skill area, he is able to help the community in a number of novel ways (for example, he’s already helping existing organizations produce videos about their ideas)." In the previous grant round, the grantee had requested funding for a collaboration with RAISE to produce videos for them, but Habryka felt it was better to fund the grantee directly and allow him to decide which organizations he wanted to help with his videos

Donor reason for donating that amount (rather than a bigger or smaller amount): Likely to be the amount requested by the donee in the application (this is not stated explicitly by either the donor or the donee)
Percentage of total donor spend in the corresponding batch of donations: 4.22%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round

Other notes: The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR).
Donor: Effective Altruism Funds: Long-Term Future Fund | Donee: Jacob Lagerros | Amount: $27,000.00 | Date: 2019-04-07 | Cause area: AI safety/forecasting | URL: https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl

Donation process: Donee submitted grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted)

Intended use of funds (category): Living expenses during research project|Direct project expenses

Intended use of funds: Grant to build a private platform where AI safety and policy researchers have direct access to a base of superforecaster-equivalents. Lagerros previously received two grants to work on the project: a half-time salary from Effective Altruism Grants, and a grant for direct project expenses from Berkeley Existential Risk Initiative.

Donor reason for selecting the donee: Grant investigator and main influencer Oliver Habryka notes the same high-level reasons for the grant as for similar grants to Anthony Aguirre (Metaculus) and Ozzie Gooen (Foretold); the general reasons are explained in the grant write-up for Gooen. Habryka also mentions Lagerros having been around the community for 3 years, and having done useful work and received other funding. Habryka mentions he did not assess the grant in detail; the main reason for making the grant from the Long-Term Future Fund was logistical complications with other grantmakers (FHI and BERI), who had already vouched for the value of the project

Donor reason for donating that amount (rather than a bigger or smaller amount): Likely to be the amount requested by the donee in the application (this is not stated explicitly by either the donor or the donee)
Percentage of total donor spend in the corresponding batch of donations: 2.92%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round

Other notes: The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR). The comments discuss this and the other forecasting grants, and include the question "why are you acting as grant-givers here rather than as special interest investors?" It is also included in a list of potentially concerning grants in a portfolio evaluation comment https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions#d4YHzSJnNWmyxf6HM (GW, IR) by Evan Gaensbauer.
Donor: Effective Altruism Funds: Long-Term Future Fund | Donee: Orpheus Lummis | Amount: $10,000.00 | Date: 2019-04-07 | Cause area: AI safety/upskilling | URL: https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl

Donation process: Donee submitted grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted)

Intended use of funds (category): Living expenses during research project

Intended use of funds: Grant for upskilling in contemporary AI techniques, deep RL and AI safety, before pursuing an ML PhD. Notable planned subprojects: (1) Engaging with David Krueger’s AI safety reading group at Montreal Institute for Learning Algorithms, (2) Starting & maintaining a public index of AI safety papers, to help future literature reviews and to complement https://vkrakovna.wordpress.com/ai-safety-resources/ as a standalone wiki-page (e.g., at http://aisafetyindex.net), (3) From-scratch implementation of seminal deep RL algorithms, (4) Going through textbooks: Goodfellow Bengio Courville 2016, Sutton Barto 2018, (5) Possibly doing the next AI Safety camp, (6) Building a prioritization tool for English Wikipedia using NLP, building on the literature of quality assessment (https://paperpile.com/shared/BZ2jzQ), (7) Studying the AI Alignment literature

Donor reason for selecting the donee: Grant investigator and main influencer Oliver Habryka is impressed with the results of the AI Safety Unconference organized by Lummis after NeurIPS with Long-Term Future Fund money. However, he is not confident in the grant, writing: "I don’t know Orpheus very well, and while I have received generally positive reviews of their work, I haven’t yet had the time to look into any of those reviews in detail, and haven’t seen clear evidence about the quality of their judgment." Habryka also favors more time for self-study and reflection, and is excited about growing the Montreal AI alignment community. Finally, Habryka thinks the grant amount is small and is unlikely to have negative consequences

Donor reason for donating that amount (rather than a bigger or smaller amount): Likely to be the amount requested by the donee in the application (this is not stated explicitly by either the donor or the donee). The small amount is also one reason grant investigator Oliver Habryka is comfortable making the grant despite not investigating thoroughly
Percentage of total donor spend in the corresponding batch of donations: 1.08%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round

Other notes: The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR). The comments on the post do not discuss this specific grant.
Donor: Effective Altruism Funds: Long-Term Future Fund | Donee: Lauren Lee | Amount: $20,000.00 | Date: 2019-04-07 | Cause area: Rationality community | URL: https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl

Donation process: Donee submitted grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted)

Intended use of funds (category): Living expenses during research project

Intended use of funds: Grant for working to prevent burnout and boost productivity within the EA and X-risk communities. From the grant application: (1) Grant requested to spend the coming year thinking about rationality and testing new projects. (2) The goal is to help individuals and orgs in the x-risk community orient towards and achieve their goals. (A) Training the skill of dependability. (B) Thinking clearly about AI risk. (C) Reducing burnout. (3) Measurable outputs include programs with 1-on-1 sessions with individuals or orgs, X-risk orgs spending time/money on services, writings or talks, workshops with feedback forms, and improved personal effectiveness

Donor reason for selecting the donee: Grant investigator and main influencer Habryka describes his grant reasoning as follows: "In sum, this grant hopefully helps Lauren to recover from burning out, get the new rationality projects she is working on off the ground, potentially identify a good new niche for her to work in (alone or at an existing organization), and write up her ideas for the community."

Donor reason for donating that amount (rather than a bigger or smaller amount): Likely to be the amount requested by the donee in the application (this is not stated explicitly by either the donor or the donee)
Percentage of total donor spend in the corresponding batch of donations: 2.17%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round
Intended funding timeframe in months: 6

Donor thoughts on making further donations to the donee: Grant investigator and main influencer Oliver Habryka qualifies the likelihood of giving another grant as follows: "I think that she should probably aim to make whatever she does valuable enough that individuals and organizations in the community wish to pay her directly for her work. It’s unlikely that I would recommend renewing this grant for another 6 month period in the absence of a relatively exciting new research project/direction, and if Lauren were to reapply, I would want to have a much stronger sense that the projects she was working on were producing lots of value before I decided to recommend funding her again."

Other notes: The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR). The grant receives criticism in the comments, including 'This is ridiculous, I'm sure she's a great person but please don't use the gift you received to provide sinecures to people "in the community"'.
Donor: Effective Altruism Funds: Long-Term Future Fund | Donee: Kocherga | Amount: $50,000.00 | Date: 2019-04-07 | Cause area: Rationality community | URL: https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl

Donation process: Donee submitted grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted)

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to Vyacheslav Matyuhin for Kocherga, an offline community hub for rationalists and EAs in Moscow. Kocherga's concrete plans with the grant include: (1) Add 2 more people to the team. (2) Implement a new community-building strategy. (3) Improve the rationality workshops.

Donor reason for selecting the donee: Grant investigator and main influencer Oliver Habryka notes that the Russian rationality community has been successful, with projects such as https://lesswrong.ru (Russian translation of LessWrong sequences), kickstarter to distribute copies of HPMOR, and Kocherga, a financially self-sustaining anti-cafe in Moscow that hosts a variety of events for roughly 100 attendees per week. The grant reasoning references the LessWrong post https://www.lesswrong.com/posts/WmfapdnpFfHWzkdXY/rationalist-community-hub-in-moscow-3-years-retrospective (GW, IR) by Kocherga. The grant is being made by the Long-Term Future Fund because the EA Meta Fund decided not to make it

Donor reason for donating that amount (rather than a bigger or smaller amount): Likely to be the amount requested by the donee in the application (this is not stated explicitly by either the donor or the donee)
Percentage of total donor spend in the corresponding batch of donations: 5.42%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round

Other notes: The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR). Affected countries: Russia.
Donor: Effective Altruism Funds: Long-Term Future Fund | Donee: Connor Flexman | Amount: $20,000.00 | Date: 2019-04-07 | Cause area: AI safety/forecasting | URL: https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl

Donation process: Donee submitted grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted)

Intended use of funds (category): Living expenses during research project

Intended use of funds: Grant to perform independent research in collaboration with John Salvatier

Donor reason for selecting the donee: The grant was originally requested by John Salvatier (who is already funded by an EA Grant), as a grant to Salvatier to hire Flexman to help him. But Oliver Habryka (the primary person on whose recommendation the grant was made) ultimately decided to give the money to Flexman to give him more flexibility to switch if the work with Salvatier does not go well. Despite the reservations, Habryka considers significant negative consequences unlikely. Habryka also says: "I assign some significant probability that this grant can help Connor develop into an excellent generalist researcher of a type that I feel like EA is currently quite bottlenecked on." Habryka has two other reservations: potential conflict of interest because he lives in the same house as the recipient, and lack of concrete, externally verifiable evidence of competence

Donor reason for donating that amount (rather than a bigger or smaller amount): Likely to be the amount requested by the donee in the application (this is not stated explicitly by either the donor or the donee)
Percentage of total donor spend in the corresponding batch of donations: 2.17%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round

Other notes: The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR). Habryka was the primary person on whose recommendation the grant was made. Habryka replies to a comment giving ideas on what independent research Flexman might produce if he stops working with Salvatier.
Donor: Effective Altruism Funds: Long-Term Future Fund | Donee: Eli Tyre | Amount: $30,000.00 | Date: 2019-04-07 | Cause area: Rationality improvement | URL: https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl

Donation process: Donee submitted grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted)

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to support projects for rationality and community building interventions. Example projects: facilitating conversations between top people in AI alignment, organizing advanced workshops on double crux, doing independent research projects such as https://www.lesswrong.com/posts/tj8QP2EFdP8p54z6i/historical-mathematicians-exhibit-a-birth-order-effect-too (GW, IR) (evaluating birth order effects in mathematicians), providing new EAs and rationalists with advice and guidance on how to get traction on working on important problems, and helping John Salvatier develop techniques around skill transfer. Grant investigator and main influencer Oliver Habryka writes: "the goal of this grant is to allow [Eli Tyre] to take actions with greater leverage by hiring contractors, paying other community members for services, and paying for other varied expenses associated with his projects."

Donor reason for selecting the donee: Grant investigator and main influencer Oliver Habryka is excited about the projects Tyre is interested in working on, and writes: "Eli has worked on a large variety of interesting and valuable projects over the last few years, many of them too small to have much payment infrastructure, resulting in him doing a lot of work without appropriate compensation. I think his work has been a prime example of picking low-hanging fruit by using local information and solving problems that aren’t worth solving at scale, and I want him to have resources to continue working in this space."

Donor reason for donating that amount (rather than a bigger or smaller amount): Likely to be the amount requested by the donee in the application (this is not stated explicitly by either the donor or the donee)
Percentage of total donor spend in the corresponding batch of donations: 3.25%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round

Other notes: The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decision (GW, IR).
Donor: Effective Altruism Funds: Long-Term Future Fund | Donee: AI Safety Camp | Amount: $25,000.00 | Date: 2019-04-07 | Cause area: AI safety | URL: https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl

Donation process: Donee submitted grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted)

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to fund an upcoming camp in Madrid being organized by AI Safety Camp in April 2019. The camp consists of several weeks of online collaboration on concrete research questions, culminating in a 9-day intensive in-person research camp. The goal is to support aspiring researchers of AI alignment to boost themselves into productivity.

Donor reason for selecting the donee: The grant investigator and main influencer Oliver Habryka mentions that: (1) He has a positive impression of the organizers and has received positive feedback from participants in the first two AI Safety Camps. (2) There is a greater need to improve access to opportunities in AI alignment for people in Europe. Habryka also mentions an associated greater risk of making the AI Safety Camp the focal point of the AI safety community in Europe, which could cause problems if the quality of the people involved isn't high. He mentions two more specific concerns: (a) Organizing long in-person events is hard, and can lead to conflict, as the last two camps did. (b) People who don't get along with the organizers may find themselves shut out of the AI safety network

Donor reason for donating that amount (rather than a bigger or smaller amount): Likely to be the amount requested by the donee in the application (this is not stated explicitly by either the donor or the donee)
Percentage of total donor spend in the corresponding batch of donations: 2.71%

Donor reason for donating at this time (rather than earlier or later): Timing determined by the timing of the camp (which is scheduled for April 2019; the grant is being made around the same time) as well as the timing of the grant round
Intended funding timeframe in months: 1

Donor thoughts on making further donations to the donee: Grant investigator and main influencer Habryka writes: "I would want to engage with the organizers a fair bit more before recommending a renewal of this grant"

Donor retrospective of the donation: The August 2019 grant round would include a $41,000 grant to AI Safety Camp for the next camp, with some format changes. However, in the write-up for that grant round, Habryka says: "In April I said I wanted to talk with the organizers before renewing this grant, and I expected to have at least six months between applications from them, but we received another application this round and I ended up not having time for that conversation." Also: "I will not fund another one without spending significantly more time investigating the program."

Other notes: Grantee in the grant document is listed as Johannes Heidecke, but the grant is for the AI Safety Camp. The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR). The grant decision was coordinated with Effective Altruism Grants (specifically, Nicole Ross of CEA), who had considered also making a grant to the camp. Effective Altruism Grants ultimately decided against making the grant, and the Long-Term Future Fund made it instead. Nicole Ross, in the evaluation by EA Grants, mentions the same concerns that Habryka does: interpersonal conflict and people being shut out of the AI safety community if they don't get along with the camp organizers.
Donor: Effective Altruism Funds: Long-Term Future Fund | Donee: AI Safety Camp | Amount: $41,000.00 | Date: 2019-04-07 | Cause area: AI safety | URL: https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl

Donation process: Grantee applied through the online application process, and was selected based on review by the fund managers. Oliver Habryka was the fund manager most excited about the grant, and responsible for the public write-up

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to fund the 4th AI Safety Camp (AISC) - a research retreat and program for prospective AI safety researchers. From the grant application: "Compared to past iterations, we plan to change the format to include a 3 to 4-day project generation period and team formation workshop, followed by a several-week period of online team collaboration on concrete research questions, a 6 to 7-day intensive research retreat, and ongoing mentoring after the camp. The target capacity is 25 - 30 participants, with projects that range from technical AI safety (majority) to policy and strategy research."

Donor reason for selecting the donee: Habryka, in his grant write-up, says: "I generally think that hackathons and retreats for researchers can be very valuable, allowing for focused thinking in a new environment. I think the AI Safety Camp is held at a relatively low cost, in a part of the world (Europe) where there exist few other opportunities for potential new researchers to spend time thinking about these topics, and some promising people have attended." He also notes two positive things: (1) The attendees of the second camp all produced an artifact of their research (e.g. an academic writeup or code repository). (2) Changes to the upcoming camp address some concerns raised in feedback on previous camps.

Donor reason for donating that amount (rather than a bigger or smaller amount): No explicit reasons for amount given, but the amount is likely determined by the budget requested by the grantee. For comparison, the amount granted for the previous AI safety camp was $25,000, i.e., a smaller amount. The increased grant size is likely due to the new format of the camp making it longer
Percentage of total donor spend in the corresponding batch of donations: 9.34%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round as well as intended timing of the 4th AI Safety Camp the grant is for
Intended funding timeframe in months: 1

Donor thoughts on making further donations to the donee: Habryka writes: "I will not fund another one without spending significantly more time investigating the program."

Other notes: Habryka notes: "After signing off on this grant, I found out that, due to overlap between the organizers of the events, some feedback I got about this camp was actually feedback about the Human Aligned AI Summer School, which means that I had even less information than I thought. In April I said I wanted to talk with the organizers before renewing this grant, and I expected to have at least six months between applications from them, but we received another application this round and I ended up not having time for that conversation."
Donor: Effective Altruism Funds: Meta Fund | Donee: Lucius Caviola | Amount: $80,000.00 | Date: 2019-03-07 | Cause area: Effective altruism/talent pipeline | URL: https://app.effectivealtruism.org/funds/ea-community/payouts/1hVfcvrzRbpXUWYht4bu3b

Reasons for grant: (1) Lucius Caviola is a well-respected PhD-level academic, with a focus on effective altruism and long-termism, (2) Lucius has been accepted to work for two years as a postdoc researcher in psychology with the highly renowned professor Joshua Greene at Harvard University, on the condition that he can bring his own research funding, (3) Donor believes that very high-quality academic research is a highly impactful activity in expectation, (4) Psychology at the higher levels has been very influential, (5) Donor believes that where foundational researchers focusing on effective altruism have displayed excellence, they should not be bottlenecked on funding considerations wherever possible, (6) This is a time-bounded, specific opportunity that requires funding to initiate and explore, and donor believes both that the value of information from this speculative grant is high, and that the project could have a large potential upside through increasing the quality and quantity of information available to address the world’s biggest problems.

Percentage of total donor spend in the corresponding batch of donations: 100.00%.
Donor: Effective Altruism Funds: Meta Fund | Donee: Forethought Foundation for Global Priorities Research | Amount: $64,000.00 | Date: 2019-03-07 | Cause area: Cause prioritization | URL: https://app.effectivealtruism.org/funds/ea-community/payouts/1hVfcvrzRbpXUWYht4bu3b

Donor believes that academic research is a high-impact but high-investment way to expand the frontiers of discussion and explore complex, nuanced topics.

Percentage of total donor spend in the corresponding batch of donations: 100.00%.
Donor: Effective Altruism Funds: Meta Fund | Donee: Effective Altruism Norway | Amount: $18,000.00 | Date: 2019-03-07 | Cause area: Effective altruism/talent pipeline | URL: https://app.effectivealtruism.org/funds/ea-community/payouts/1hVfcvrzRbpXUWYht4bu3b

Grant to run an Operations Camp in Oslo, with the aim of upskilling promising talent and providing opportunities to test fit in key operations roles.

Percentage of total donor spend in the corresponding batch of donations: 100.00%.
Donor: Effective Altruism Funds: Meta Fund | Donee: Effective Altruism Geneva | Amount: $18,000.00 | Date: 2019-03-07 | Cause area: Cause prioritization | URL: https://app.effectivealtruism.org/funds/ea-community/payouts/1hVfcvrzRbpXUWYht4bu3b

Grant to run a policy research project to determine what prioritization models are appropriate to run under what circumstances.

Percentage of total donor spend in the corresponding batch of donations: 100.00%.
Donor: Effective Altruism Funds: Meta Fund | Donee: Effective Altruism Netherlands | Amount: $17,000.00 | Date: 2019-03-07 | Cause area: Effective altruism/fundraising | URL: https://app.effectivealtruism.org/funds/ea-community/payouts/1hVfcvrzRbpXUWYht4bu3b

Grant to help 35 charities that are either GiveWell-recommended or have received grants from the Open Philanthropy Project to achieve tax-deductible status in the Netherlands. Affected countries: Netherlands.

Percentage of total donor spend in the corresponding batch of donations: 100.00%.
Donor: Effective Altruism Funds: Meta Fund | Donee: One for the World | Amount: $15,000.00 | Date: 2019-03-07 | Cause area: Effective altruism/fundraising | URL: https://app.effectivealtruism.org/funds/ea-community/payouts/1hVfcvrzRbpXUWYht4bu3b

Grant to support the creation and growth of chapters at undergraduate, MBA and law schools. Chapter leaders and student ambassadors encourage their classmates to commit to donating a percentage of their income upon graduation.

Percentage of total donor spend in the corresponding batch of donations: 100.00%.
Donor: Effective Altruism Funds: Meta Fund | Donee: The Life You Can Save | Amount: $10,000.00 | Date: 2019-03-07 | Cause area: Effective altruism/movement growth | URL: https://app.effectivealtruism.org/funds/ea-community/payouts/1hVfcvrzRbpXUWYht4bu3b

Grant to support the tenth anniversary relaunch of Peter Singer's book The Life You Can Save; grantee is the eponymous organization. Project has an immediate, one-off funding requirement and donor believes it has high upside, judging from the impact that the original book had.

Percentage of total donor spend in the corresponding batch of donations: 100.00%.
Donor: Effective Altruism Funds: Meta Fund | Donee: 80,000 Hours | Amount: $170,000.00 | Date: 2019-03-07 | Cause area: Effective altruism/movement growth/career counseling | URL: https://app.effectivealtruism.org/funds/ea-community/payouts/1hVfcvrzRbpXUWYht4bu3b

Donation process: This is part of the March 2019 grant round from the EA Meta Fund, comprising "a mixture of larger grants to more established meta groups and smaller grants to fund both younger organizations and specific projects." The grant to 80,000 Hours is among the "larger grants to more established meta groups"

Intended use of funds (category): Organizational general support

Donor reason for selecting the donee: Reasons for grant: (1) High impact per dollar, (2) Highly impactful and cost-effective in the past, for same reasons as discussed in https://app.effectivealtruism.org/funds/ea-community/payouts/2dyBJqJBSIq6sAGU6gMYQW (previous grant), (3) Undergoing significant growth, (4) Not yet filled funding gap, (5) Big potential upside is reducing talent bottlenecks in cause areas that are crucial and highly technically challenging. Other than (5), all the other reasons are shared as stated with another simultaneous grant made to Founders Pledge

Donor reason for donating at this time (rather than earlier or later): Timing largely determined by timing of the grant round. Also, it is relevant that 80,000 Hours has not yet filled its funding gap

Donor retrospective of the donation: The following $200,000 grant in the July 2019 grant round https://app.effectivealtruism.org/funds/ea-community/payouts/3pxoLG7aRWtETC1lECC6LK suggests that this grant would be considered a success.

Percentage of total donor spend in the corresponding batch of donations: 100.00%.
Donor: Effective Altruism Funds: Meta Fund | Donee: Founders Pledge | Amount: $110,000.00 | Date: 2019-03-07 | Cause area: Effective altruism/movement growth | URL: https://app.effectivealtruism.org/funds/ea-community/payouts/1hVfcvrzRbpXUWYht4bu3b

Donation process: This is part of the March 2019 grant round from the EA Meta Fund, comprising "a mixture of larger grants to more established meta groups and smaller grants to fund both younger organizations and specific projects." The grant to Founders Pledge is among the "larger grants to more established meta groups"

Intended use of funds (category): Organizational general support

Donor reason for selecting the donee: Reasons for grant: (1) High impact per dollar, (2) Highly impactful and cost-effective in the past, for same reasons as discussed in https://app.effectivealtruism.org/funds/ea-community/payouts/2dyBJqJBSIq6sAGU6gMYQW (previous grant), (3) Undergoing significant growth, (4) Not yet filled funding gap, (5) Big potential upside is long-lasting positive effect on the culture of smart major philanthropy. Other than (5), all the other reasons are shared as stated with another simultaneous grant made to 80,000 Hours

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round. Also relevant is the organization's high room for more funding.

Percentage of total donor spend in the corresponding batch of donations: 100.00%.
Donor: Effective Altruism Funds: Meta Fund | Donee: Rethink Priorities | Amount: $10,000.00 | Date: 2019-03-07 | Cause area: Cause prioritization/Animal welfare | URL: https://app.effectivealtruism.org/funds/ea-community/payouts/1hVfcvrzRbpXUWYht4bu3b

Donation process: This is part of the March 2019 grant round from the EA Meta Fund, comprising "a mixture of larger grants to more established meta groups and smaller grants to fund both younger organizations and specific projects." The grant to Rethink Priorities falls under "younger organizations"

Intended use of funds (category): Organizational general support

Intended use of funds: The grant page says: "They [Rethink Priorities] intend to continue their focus on animal welfare research in 2019, and cover other areas such as improving mental health, strengthening the EA movement, reducing catastrophic risks, and improving the long-term future."

Donor reason for selecting the donee: Donor lists these reasons on the grant page: (1) Well-executed cause prioritisation research is very valuable. (2) Neutral-to-positive impression of published research by Rethink Priorities (RP). (3) Positive impression of adaptiveness and responsiveness to feedback of RP. (4) Exploratory value to encourage fund managers and others to investigate RP work. (5) Positive impression of RP's idea of "undertaking commissioned research into neglected areas, where there is a specific need for independent evaluation." (6) Potential for RP to "act as a talent pipeline and provide valuable training, especially as future capacity for EA philanthropic advisory." (7) Unfilled funding gap of $120k for 2019

Donor reason for donating that amount (rather than a bigger or smaller amount): The amount is much less than the funding gap of $120,000 for 2019. It is likely selected to be this small based on the experimental nature of the grant
Percentage of total donor spend in the corresponding batch of donations: 1.95%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round, as well as grantee's trajectory (it has only recently started publishing research)

Donor retrospective of the donation: The donor would make a followup grant of $12,000 in the July 2019 grant round, and continue to stand by the reasoning motivating the first grant
Donor: Effective Altruism Funds: Long-Term Future Fund | Donee: AI summer school | Amount: $21,000.00 | Date: 2018-11-29 | Cause area: AI safety | URL: https://app.effectivealtruism.org/funds/far-future/payouts/3JnNTzhJQsu4yQAYcKceSi

Grant to fund the second year of a summer school on AI safety, aiming to familiarize potential researchers with interesting technical problems in the field. Last year’s iteration of this event appears to have gone well, per https://www.lesswrong.com/posts/bXLi3n2jrfqRwoSTH/human-aligned-ai-summer-school-a-summary (GW, IR) and private information available to donor. Donor believes that well-run education efforts of this kind are valuable (where “well-run” refers to the quality of the intellectual content, the participants, and the logistics of the event), and feels confident enough that this particular effort will be well-run.

Percentage of total donor spend in the corresponding batch of donations: 100.00%.
Donor: Effective Altruism Funds: Long-Term Future Fund | Donee: AI Safety Unconference | Amount: $4,500.00 | Date: 2018-11-29 | Cause area: AI safety | URL: https://app.effectivealtruism.org/funds/far-future/payouts/3JnNTzhJQsu4yQAYcKceSi

Orpheus Lummis and Vaughn DiMarco are organizing an unconference on AI Alignment on the last day of the NeurIPS conference, with the goal of facilitating networking and research on AI Alignment among a diverse audience of AI researchers with and without safety backgrounds. Based on interaction with the organizers and some participants, the donor feels this project is worth funding. However, the donee is still not sure if the unconference will be held, so the grant is conditional on the donee deciding to proceed. The grant would fully fund the request.

Percentage of total donor spend in the corresponding batch of donations: 100.00%.
Donor: Effective Altruism Funds: Long-Term Future Fund | Donee: Machine Intelligence Research Institute | Amount: $40,000.00 | Date: 2018-11-29 | Cause area: AI safety | URL: https://app.effectivealtruism.org/funds/far-future/payouts/3JnNTzhJQsu4yQAYcKceSi

Donation process: Donee submitted grant application through the application form for the November 2018 round of grants from the Long-Term Future Fund, and was selected as a grant recipient

Intended use of funds (category): Organizational general support

Intended use of funds: The grant page links to MIRI's research directions post https://intelligence.org/2018/11/22/2018-update-our-new-research-directions/ and to MIRI's 2018 fundraiser post https://intelligence.org/2018/11/26/miris-2018-fundraiser/ saying "According to their fundraiser post, MIRI believes it will be able to find productive uses for additional funding, and gives examples of ways additional funding was used to support their work this year."

Donor reason for selecting the donee: The grant page links to MIRI's research directions post https://intelligence.org/2018/11/22/2018-update-our-new-research-directions/ and says "We believe that this research represents one promising approach to AI alignment research."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round

Donor retrospective of the donation: The Long-Term Future Fund would make a similarly sized grant ($50,000) in its next grant round in April 2019, suggesting that it was satisfied with the outcome of the grant.

Percentage of total donor spend in the corresponding batch of donations: 100.00%.
Donor: Effective Altruism Funds: Long-Term Future Fund | Donee: Ought | Amount: $10,000.00 | Date: 2018-11-29 | Cause area: AI safety | URL: https://app.effectivealtruism.org/funds/far-future/payouts/3JnNTzhJQsu4yQAYcKceSi

Donation process: Donee submitted grant application through the application form for the November 2018 round of grants from the Long-Term Future Fund, and was selected as a grant recipient

Intended use of funds (category): Organizational general support

Intended use of funds: Grantee is a nonprofit aiming to implement AI alignment concepts in real-world applications.

Donor reason for selecting the donee: The grant page says: "We believe that Ought's approach is interesting and worth trying, and that they have a strong team. [...] Part of the aim of the grant is to show Ought as an example of the type of organization we are likely to fund in the future."

Donor reason for donating that amount (rather than a bigger or smaller amount): The grant page says "Our understanding is that hiring is currently more of a bottleneck for them than funding, so we are only making a small grant."
Percentage of total donor spend in the corresponding batch of donations: 10.47%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round

Donor thoughts on making further donations to the donee: The grant page says "Part of the aim of the grant is to show Ought as an example of the type of organization we are likely to fund in the future." This suggests that Ought will be considered for future grant rounds

Donor retrospective of the donation: The Long-Term Future Fund would make a $50,000 grant to Ought in the April 2019 grant round, suggesting that this grant would be considered a success
Donor: Effective Altruism Funds: Long-Term Future Fund | Donee: Foretold | Amount: $20,000.00 | Date: 2018-11-29 | Cause area: Forecasting | URL: https://app.effectivealtruism.org/funds/far-future/payouts/3JnNTzhJQsu4yQAYcKceSi

Donation process: Donee submitted grant application through the application form for the November 2018 round of grants from the Long-Term Future Fund, and was selected as a grant recipient

Intended use of funds (category): Organizational general support

Intended use of funds: Ozzie Gooen plans to build an online community of EA forecasters, researchers, and data scientists to predict variables of interest to the EA community. Ozzie proposed using the platform to answer a range of questions, such as “How many Google searches will there be for reinforcement learning in 2020?” or “How many plan changes will 80,000 Hours cause in 2020?”, and using the results to help EA organizations and individuals prioritize. The grant funds the project's basic setup and initial testing. The community and tool were later launched under the name Foretold and are available at https://www.foretold.io/

Donor reason for selecting the donee: The grant decision was made based on Ozzie Gooen's past success with Guesstimate https://www.getguesstimate.com/ as well as belief in both the broad value of the project and the specifics of the project plan.

Donor reason for donating that amount (rather than a bigger or smaller amount): Amount likely determined by the specifics of the project plan and the scope of this round of funding, namely, the project's basic setup and initial testing.
Percentage of total donor spend in the corresponding batch of donations: 20.94%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round, and also by the donee's desire to start the project

Donor retrospective of the donation: The Long-Term Future Fund would make a followup grant of $70,000 to Foretold in the April 2019 grant round https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl; see also https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR) for more detail
Effective Altruism Funds: Meta Fund | Centre for Effective Altruism | 15,000.00 | 2018-11-29 | Effective altruism/movement growth | https://app.effectivealtruism.org/funds/ea-community/payouts/2dyBJqJBSIq6sAGU6gMYQW | Grant write-up talks of the strong track record of CEA and its success at introducing EA concepts to a large number of people. The write-up concludes: "CEA is aiming to raise $3 million for their 2019 budget, and currently has significant room for more funding due to expansion efforts. At the margin, this grant is expected to contribute towards CEA’s expansion of their events, technology and community health teams." Percentage of total donor spend in the corresponding batch of donations: 100.00%.
Effective Altruism Funds: Meta Fund | Global Priorities Institute | 14,000.00 | 2018-11-29 | Cause prioritization | https://app.effectivealtruism.org/funds/ea-community/payouts/2dyBJqJBSIq6sAGU6gMYQW | Grant write-up notes that the donee (Global Priorities Institute) is solving an important problem of spreading EA-style ideas in the academic and policy worlds, and has shown impressive progress in its first year. The write-up concludes: "This grant is expected to contribute to GPI’s plans to grow its research team, particularly in economics, in order to publish a strong initial body of papers that defines their research focus. GPI also plans to sponsor DPhil students engaging in global priorities research at the University of Oxford through scholarships and prizes, and to give close support to early career academics with its summer visiting program." Percentage of total donor spend in the corresponding batch of donations: 100.00%.
Effective Altruism Funds: Meta Fund | Charity Entrepreneurship | 14,000.00 | 2018-11-29 | Charity incubation | https://app.effectivealtruism.org/funds/ea-community/payouts/2dyBJqJBSIq6sAGU6gMYQW | Grant to support the incubation of new charities, which the donor believes to be highly valuable. The write-up concludes: "Marginal donations will allow Charity Entrepreneurship to expand the number of applicants they can accept to the program and give larger seed grants. The latter is likely to increase both the number of people applying and the level of stability at which the new charities start off, allowing them to focus more on long-term impact rather than short-term fundraising." Percentage of total donor spend in the corresponding batch of donations: 100.00%.
Effective Altruism Funds: Meta Fund | Raising for Effective Giving | 10,000.00 | 2018-11-29 | Effective altruism/meta/fundraising | https://app.effectivealtruism.org/funds/ea-community/payouts/2dyBJqJBSIq6sAGU6gMYQW | Grant write-up talks of the huge fundraising multiplier of the donee (which is a fundraising organization) as well as its significant growth in recent years. The donor was also impressed by the donee's KPI tracking and estimation, as well as its directing of significant funds to organizations focused on the long-term future. The write-up concludes by saying that the donee "will use this grant to expand their HNW advisory work in 2019. At the margin, this funding will likely enable them to build their research capacity and provide more effective advice to their HNW audience. This will allow them to better target their fundraising for effective organisations." Percentage of total donor spend in the corresponding batch of donations: 100.00%.
Effective Altruism Funds: Meta Fund | Let's Fund | 10,000.00 | 2018-11-29 | Effective altruism | https://app.effectivealtruism.org/funds/ea-community/payouts/2dyBJqJBSIq6sAGU6gMYQW | Grant write-up notes that both the idea and the team are potentially promising, and the quality of initial work is good. It concludes: "Let’s Fund will use additional capital to increase their fundraising ratio through careful outreach and marketing. In the future, they aim to undertake more and better research, especially into global catastrophic risk reduction, and have it evaluated and peer-reviewed by other researchers." Percentage of total donor spend in the corresponding batch of donations: 100.00%.
Effective Altruism Funds: Meta Fund | Berkeley REACH | 5,000.00 | 2018-11-29 | Rationality community | https://app.effectivealtruism.org/funds/ea-community/payouts/2dyBJqJBSIq6sAGU6gMYQW | Grant is to provide funds to support the panel for dispute resolution, as the donor "would like the panel to succeed in becoming a trusted resource for the local community, and want to ensure they have the capacity to engage in training or seek professional advice where this could be useful." The grant write-up concludes: "This grant will provide the panel with readily available funding in the event that they wish to seek legal counsel, the services of an independent investigator, or other professional support, particularly if they are faced with a time-sensitive matter." Percentage of total donor spend in the corresponding batch of donations: 100.00%.
Effective Altruism Funds: Meta Fund | 80,000 Hours | 45,000.00 | 2018-11-29 | Effective altruism/movement growth/career counseling | https://app.effectivealtruism.org/funds/ea-community/payouts/2dyBJqJBSIq6sAGU6gMYQW | Donation process: Part of the November 2018 grant round from the EA Meta Fund. This is among the "larger grants" of the "mixture of larger grants to more established meta groups and smaller grants to younger organizations" that comprise the grant round. The grant page says: "We see this as our pilot round and hope this set of grants will signal the type of organizations to which the EA Meta Fund plans to donate."

Intended use of funds (category): Organizational general support

Intended use of funds: Although the funding is unrestricted, the grant page says: "At the margin, this grant is expected to contribute towards the budget for their expanded team." It also says: "We think it’s likely that the Open Philanthropy Project will renew their funding of 80,000 Hours this year (although this isn’t confirmed), but their grant will be capped at 50-66% of the total amount raised. Because of this, there is a gap for other donors to fill, who will effectively receive 1:1 or 1:2 matching funding."
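The "1:1 or 1:2 matching" framing follows from the cap arithmetic quoted above: if the Open Philanthropy Project grant is capped at a fraction c of the total amount raised, each dollar from other donors can unlock at most c/(1 - c) dollars from Open Philanthropy, which is about 1 dollar at a 50% cap and about 2 dollars at a 66% cap. A minimal illustrative sketch in Python (the helper function is hypothetical, not something from the grant page):

def matching_ratio(cap_fraction):
    # If open_phil <= cap_fraction * (open_phil + others), then
    # open_phil <= others * cap_fraction / (1 - cap_fraction).
    return cap_fraction / (1 - cap_fraction)

print(matching_ratio(0.5))    # 1.0 -> roughly 1:1 matching at a 50% cap
print(matching_ratio(2 / 3))  # 2.0 -> roughly 1:2 matching at a ~66% cap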

Donor reason for selecting the donee: The grant page lists 10 reasons: (1) Grantee has been impressive over the years. (2) Rapid growth of grantee since 2014. (3) Much larger budget for 2019 to support team expansion in 2019 and 2020. (4) Important work (supporting skilled people to pursue high-impact careers). (5) Carefully measured KPI to track their impact: impact-adjusted significant plan changes (IASPCs). (6) New focus on the highest-priority career paths. (7) Excellent regular supporter updates (12 so far in 2018). (8) Great explanatory and advisory content that is helping people discover effective altruism. (9) Success at hiring and building a management team in 2018. (10) Marginal grants will go toward a new hire, which is particularly high-impact.

Donor reason for donating that amount (rather than a bigger or smaller amount): The total of $129,000 being distributed in this grant round from the EA Meta Fund is being distributed using a "mixture of larger grants to more established meta groups and smaller grants to younger organizations. For the more established groups, we believe there is strong evidence of their past impact and cost-effectiveness, room for more funding, and high probability that they will have an even greater positive impact in the future." 80,000 Hours receives the single largest grant, and a little over 1/3 of the total money
Percentage of total donor spend in the corresponding batch of donations: 34.88%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the grant round. The timing is also particularly good for 80,000 Hours because it is running a historically large fundraising round for 2019

Donor retrospective of the donation: Further grants in the next two grant rounds https://app.effectivealtruism.org/funds/ea-community/payouts/1hVfcvrzRbpXUWYht4bu3b and https://app.effectivealtruism.org/funds/ea-community/payouts/3pxoLG7aRWtETC1lECC6LK suggest that this grant would be considered a success
Effective Altruism Funds: Meta Fund | Founders Pledge | 16,000.00 | 2018-11-29 | Effective altruism/movement growth | https://app.effectivealtruism.org/funds/ea-community/payouts/2dyBJqJBSIq6sAGU6gMYQW | Donation process: Part of the November 2018 grant round from the EA Meta Fund. This is among the "larger grants" of the "mixture of larger grants to more established meta groups and smaller grants to younger organizations" that comprise the grant round. The grant page says: "We see this as our pilot round and hope this set of grants will signal the type of organizations to which the EA Meta Fund plans to donate."

Intended use of funds (category): Direct project expenses

Intended use of funds: The grant page says: "This grant will support their 2019 plans to expand their US operations and further develop their methodology to evaluate and support high-impact interventions."

Donor reason for selecting the donee: The grant page lists 8 reasons: (1) Success at raising pledges; it has raised $630 million in pledges since 2013. (2) Moved $1.7 million to EA priority cause areas in the past 3 years. (3) Multipliers remain highly attractive after adjustments. (4) Founders Pledge engages with pledgers on how to best do good, and organizes events. This community-building work has additional unmeasured value. (5) Founders Pledge has been growing quickly and consistently: the value of new pledges in 2017 was double the value of pledges in 2015. (7) Founders Pledge has significant room for more funding due to international expansion plans for 2019. Recent growth in the United States has been responsible for about 30% of their total pledge value to date. (8) "We believe there is additional value in Founders Pledge expediting their international expansion, given potential competitive pressures and network effects."

Donor reason for donating that amount (rather than a bigger or smaller amount): The total of $129,000 being distributed in this grant round from the EA Meta Fund is being distributed using a "mixture of larger grants to more established meta groups and smaller grants to younger organizations. For the more established groups, we believe there is strong evidence of their past impact and cost-effectiveness, room for more funding, and high probability that they will have an even greater positive impact in the future." Founders Pledge receives the second largest grant, and about 1/8 of the total grant money
Percentage of total donor spend in the corresponding batch of donations: 12.40%

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the grant round

Donor retrospective of the donation: The followup $110,000 grant https://app.effectivealtruism.org/funds/ea-community/payouts/1hVfcvrzRbpXUWYht4bu3b in the March 2019 grant round suggests that this grant would be considered a success
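As a quick sanity check on the "percentage of total donor spend in the corresponding batch" figures quoted in some of the entries above, the Python sketch below recomputes a few of them, assuming the batch is the full November 2018 payout of the respective fund: the $129,000 Meta Fund total is stated in the write-ups above, while the $95,500 Long-Term Future Fund total is inferred from the 2018 column of the summary tables below (an assumption, since that figure is not quoted directly in the entries). The variable names and structure are illustrative only.

# Recompute a few of the quoted batch percentages (sketch).
# Batch totals: $129,000 is stated in the Meta Fund write-ups above;
# $95,500 for the Long-Term Future Fund is inferred from the 2018 column
# of the summary tables below (assumption).
batches = {
    "EA Meta Fund, November 2018": 129_000,
    "Long-Term Future Fund, November 2018": 95_500,
}
grants = [
    ("80,000 Hours", "EA Meta Fund, November 2018", 45_000, "34.88%"),
    ("Founders Pledge", "EA Meta Fund, November 2018", 16_000, "12.40%"),
    ("Foretold", "Long-Term Future Fund, November 2018", 20_000, "20.94%"),
    ("Ought", "Long-Term Future Fund, November 2018", 10_000, "10.47%"),
]
for donee, batch, amount, quoted in grants:
    share = 100 * amount / batches[batch]
    print(f"{donee}: computed {share:.2f}%, quoted {quoted}")

Under this assumption, the computed values reproduce the four quoted figures to two decimal places.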

Donation amounts by donee and year

Donee | Donors influenced | Cause area | Metadata | Total | 2019 | 2018
80,000 Hours | Effective Altruism Funds: Meta Fund | Career coaching/life guidance | FB Tw WP Site | 415,000.00 | 370,000.00 | 45,000.00
Center for Applied Rationality | Effective Altruism Funds: Long-Term Future Fund | Rationality | FB Tw WP Site TW | 150,000.00 | 150,000.00 | 0.00
Lucius Caviola | Effective Altruism Funds: Long-Term Future Fund, Effective Altruism Funds: Meta Fund | | | 130,000.00 | 130,000.00 | 0.00
Founders Pledge | Effective Altruism Funds: Meta Fund | Effective altruism/donor pledges | FB Tw WP Site | 126,000.00 | 110,000.00 | 16,000.00
Effective Altruism Community Building Grants | Effective Altruism Funds: Meta Fund | | | 120,000.00 | 120,000.00 | 0.00
High Impact Policy Engine | Effective Altruism Funds: Long-Term Future Fund, Effective Altruism Funds: Meta Fund | | | 100,000.00 | 100,000.00 | 0.00
Machine Intelligence Research Institute | Effective Altruism Funds: Long-Term Future Fund | AI safety | FB Tw WP Site CN GS TW | 90,000.00 | 50,000.00 | 40,000.00
Foretold | Effective Altruism Funds: Long-Term Future Fund | | | 90,000.00 | 70,000.00 | 20,000.00
Metaculus | Effective Altruism Funds: Long-Term Future Fund | | | 70,000.00 | 70,000.00 | 0.00
AI Safety Camp | Effective Altruism Funds: Long-Term Future Fund | | | 66,000.00 | 66,000.00 | 0.00
Forethought Foundation for Global Priorities Research | Effective Altruism Funds: Meta Fund | Cause prioritization | Site | 64,000.00 | 64,000.00 | 0.00
Ought | Effective Altruism Funds: Long-Term Future Fund | AI safety | Site | 60,000.00 | 50,000.00 | 10,000.00
Kocherga | Effective Altruism Funds: Long-Term Future Fund | | | 50,000.00 | 50,000.00 | 0.00
Shahar Avin | Effective Altruism Funds: Long-Term Future Fund | | | 40,000.00 | 40,000.00 | 0.00
Robert Miles | Effective Altruism Funds: Long-Term Future Fund | | | 39,000.00 | 39,000.00 | 0.00
Nikhil Kunapuli | Effective Altruism Funds: Long-Term Future Fund | | | 30,000.00 | 30,000.00 | 0.00
Alexander Gietelink Oldenziel | Effective Altruism Funds: Long-Term Future Fund | | | 30,000.00 | 30,000.00 | 0.00
Generation Pledge | Effective Altruism Funds: Meta Fund | | | 30,000.00 | 30,000.00 | 0.00
Alex Turner | Effective Altruism Funds: Long-Term Future Fund | | | 30,000.00 | 30,000.00 | 0.00
Eli Tyre | Effective Altruism Funds: Long-Term Future Fund | | | 30,000.00 | 30,000.00 | 0.00
David Girardo | Effective Altruism Funds: Long-Term Future Fund | | | 30,000.00 | 30,000.00 | 0.00
Anand Srinivasan | Effective Altruism Funds: Long-Term Future Fund | | | 30,000.00 | 30,000.00 | 0.00
Tegan McCaslin | Effective Altruism Funds: Long-Term Future Fund | | | 30,000.00 | 30,000.00 | 0.00
Effective Altruism Russia | Effective Altruism Funds: Long-Term Future Fund | | | 28,000.00 | 28,000.00 | 0.00
Jacob Lagerros | Effective Altruism Funds: Long-Term Future Fund | | | 27,000.00 | 27,000.00 | 0.00
Tessa Alexanian | Effective Altruism Funds: Long-Term Future Fund | | | 26,250.00 | 26,250.00 | 0.00
EA Coaching | Effective Altruism Funds: Meta Fund | | | 23,000.00 | 23,000.00 | 0.00
Stag Lynn | Effective Altruism Funds: Long-Term Future Fund | | | 23,000.00 | 23,000.00 | 0.00
Rethink Priorities | Effective Altruism Funds: Meta Fund | Cause prioritization | Site | 22,000.00 | 22,000.00 | 0.00
AI summer school | Effective Altruism Funds: Long-Term Future Fund | | | 21,000.00 | 0.00 | 21,000.00
Connor Flexman | Effective Altruism Funds: Long-Term Future Fund | | | 20,000.00 | 20,000.00 | 0.00
Lauren Lee | Effective Altruism Funds: Long-Term Future Fund | | | 20,000.00 | 20,000.00 | 0.00
Centre for Effective Altruism | Effective Altruism Funds: Meta Fund | Effective altruism/movement growth | FB Site | 20,000.00 | 5,000.00 | 15,000.00
Alexander Siegenfeld | Effective Altruism Funds: Long-Term Future Fund | | | 20,000.00 | 20,000.00 | 0.00
Effective Altruism Geneva | Effective Altruism Funds: Meta Fund | | | 18,000.00 | 18,000.00 | 0.00
Effective Altruism Norway | Effective Altruism Funds: Meta Fund | | | 18,000.00 | 18,000.00 | 0.00
Effective Altruism Zürich | Effective Altruism Funds: Long-Term Future Fund | | | 17,900.00 | 17,900.00 | 0.00
Effective Altruism Netherlands | Effective Altruism Funds: Meta Fund | | | 17,000.00 | 17,000.00 | 0.00
Rethink Charity | Effective Altruism Funds: Meta Fund | Effective altruism/Community project coordination | FB Site | 15,000.00 | 15,000.00 | 0.00
One for the World | Effective Altruism Funds: Meta Fund | | | 15,000.00 | 15,000.00 | 0.00
Global Priorities Institute | Effective Altruism Funds: Meta Fund | Cause prioritization | Site | 14,000.00 | 0.00 | 14,000.00
Charity Entrepreneurship | Effective Altruism Funds: Meta Fund | | | 14,000.00 | 0.00 | 14,000.00
RC Forward | Effective Altruism Funds: Meta Fund | | | 11,000.00 | 11,000.00 | 0.00
Raising for Effective Giving | Effective Altruism Funds: Meta Fund | Effective altruism/Fundraising/Poker and sports | FB Tw WP Site | 10,000.00 | 0.00 | 10,000.00
Orpheus Lummis | Effective Altruism Funds: Long-Term Future Fund | | | 10,000.00 | 10,000.00 | 0.00
Roam Research | Effective Altruism Funds: Long-Term Future Fund | | | 10,000.00 | 10,000.00 | 0.00
Let's Fund | Effective Altruism Funds: Meta Fund | | | 10,000.00 | 0.00 | 10,000.00
Effective Thesis | Effective Altruism Funds: Meta Fund | | | 10,000.00 | 10,000.00 | 0.00
The Life You Can Save | Effective Altruism Funds: Meta Fund | Effective altruism/movement growth/fundraising | FB Tw Site | 10,000.00 | 10,000.00 | 0.00
Berkeley REACH | Effective Altruism Funds: Meta Fund | Rationality community | FB Site | 5,000.00 | 0.00 | 5,000.00
AI Safety Unconference | Effective Altruism Funds: Long-Term Future Fund | | | 4,500.00 | 0.00 | 4,500.00
Total | -- | -- | -- | 2,309,650.00 | 2,085,150.00 | 224,500.00

Graph of spending by donee and year (incremental, not cumulative)

Graph of spending by donee and year (cumulative)

Donation amounts by donor and year for influencer Luke Ding|Alex Foster|Denise Melchin|Matt Wage|Tara MacAulay

Donor | Donees | Total | 2019 | 2018
Effective Altruism Funds: Long-Term Future Fund | AI Safety Camp, AI Safety Unconference, AI summer school, Alex Turner, Alexander Gietelink Oldenziel, Alexander Siegenfeld, Anand Srinivasan, Center for Applied Rationality, Connor Flexman, David Girardo, Effective Altruism Russia, Effective Altruism Zürich, Eli Tyre, Foretold, High Impact Policy Engine, Jacob Lagerros, Kocherga, Lauren Lee, Lucius Caviola, Machine Intelligence Research Institute, Metaculus, Nikhil Kunapuli, Orpheus Lummis, Ought, Roam Research, Robert Miles, Shahar Avin, Stag Lynn, Tegan McCaslin, Tessa Alexanian | 1,202,650.00 | 1,107,150.00 | 95,500.00
Effective Altruism Funds: Meta Fund | 80,000 Hours, Berkeley REACH, Centre for Effective Altruism, Charity Entrepreneurship, EA Coaching, Effective Altruism Community Building Grants, Effective Altruism Geneva, Effective Altruism Netherlands, Effective Altruism Norway, Effective Thesis, Forethought Foundation for Global Priorities Research, Founders Pledge, Generation Pledge, Global Priorities Institute, High Impact Policy Engine, Let's Fund, Lucius Caviola, One for the World, Raising for Effective Giving, RC Forward, Rethink Charity, Rethink Priorities, The Life You Can Save | 1,107,000.00 | 978,000.00 | 129,000.00
Total | -- | 2,309,650.00 | 2,085,150.00 | 224,500.00
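The fund-level totals above can be cross-checked against each other and against the Total row of the donee-by-year table. A minimal consistency check in Python, with all figures copied from the tables on this page (a sketch, not part of the portal's tooling):

rows = {
    "Effective Altruism Funds: Long-Term Future Fund": {"total": 1_202_650, "2019": 1_107_150, "2018": 95_500},
    "Effective Altruism Funds: Meta Fund": {"total": 1_107_000, "2019": 978_000, "2018": 129_000},
}
grand = {"total": 2_309_650, "2019": 2_085_150, "2018": 224_500}

# Column sums of the two fund rows should equal the grand totals,
# and each fund's yearly amounts should add up to its own total.
for column in ("total", "2019", "2018"):
    assert sum(r[column] for r in rows.values()) == grand[column], column
for name, r in rows.items():
    assert r["2019"] + r["2018"] == r["total"], name
print("Summary table totals are internally consistent.")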

Graph of spending by donor and year (incremental, not cumulative)

Graph of spending by donor and year (cumulative)