Oliver Habryka|Alex Zhu|Matt Wage|Helen Toner|Matt Fallshaw money moved

This is an online portal with information on donations that were announced publicly (or have been shared with permission) that were of interest to Vipul Naik. The git repository with the code for this portal, as well as all the underlying data, is available on GitHub. All payment amounts are in current United States dollars (USD). The repository of donations is being seeded with an initial collation by Issa Rice as well as continued contributions from him (see his commits and the contract work page listing all financially compensated contributions to the site), but all responsibility for errors and inaccuracies belongs to Vipul Naik. Current data is preliminary and has not been completely vetted and normalized; if sharing a link to this site or any page on this site, please include the caveat that the data is preliminary (if you want to share without including caveats, please check with Vipul Naik). We expect to have completed the first round of development by the end of December 2019. See the about page for more details. Also of interest: pageview data on analytics.vipulnaik.com, the tutorial in the README, and the request for feedback on the EA Forum.

Full list of documents in reverse chronological order (12 documents)

Title (URL linked) | Publication date | Author | Publisher | Affected donors | Affected donees | Document scope | Notes
Thoughts on the EA Hotel2019-04-25Oliver Habryka Effective Altruism ForumEffective Altruism Funds EA Hotel Evaluator review of doneeWith permission from Greg Colbourn of the EA Hotel, Habryka publicly posts the feedback he sent to the EA Hotel, which had been rejected from the April 2019 funding round by the Long Term Future Fund. Habryka first lists three reasons he is excited about the Hotel: (a) Providing a safety net, (b) Acting on historical interest, (c) Building high-dedication cultures. He then articulates three concrete models of his concerns: (1) Initial overeagerness to publicize the EA Hotel (a point he now believes is mostly false, based on Greg Colbourn's response), (2) Significant chance of the EA Hotel culture becoming actively harmful for residents, (3) No good candidate to take charge of long-term logistics of running the hotel. Habryka concludes by saying he thinks all his concerns can be overcome. At the moment, he thinks the hotel should be funded for the next year, but is unsure whether they should be given money to buy the hotel next door. The comment replies include one by Greg Colbourn, giving his backstory on the media attention (re: (1)) and discussing the situation with (2) and (3). There are also other replies, including one from casebash, who stayed at the hotel for a significant time
Major Donation: Long Term Future Fund Application Extended 1 Week2019-02-16Oliver Habryka Effective Altruism ForumEffective Altruism Funds Effective Altruism Funds Request for proposalsThe blog post announces that the EA Long-Term Future Fund has received a large donation, which doubles the amount of money available for granting to ~$1.2 million. It extends the deadline for applications at https://docs.google.com/forms/d/e/1FAIpQLSeDTbCDbnIN11vcgHM3DKq6M0cZ3itAy5GIPK17uvTXcz8ZFA/viewform?usp=sf_link by 1 week, to 2019-02-24 midnight PST. The application form was previously announced at https://forum.effectivealtruism.org/posts/oFeGLaJ5bZBBRbjC9/ea-funds-long-term-future-fund-is-open-to-applications-until and was supposed to be open till 2019-02-07 for the February 2019 round of grants. Cross-posted to LessWrong at https://www.lesswrong.com/posts/ZKsSuxHWNGiXJBJ9Z/major-donation-long-term-future-fund-application-extended-1
EA Funds: Long-Term Future fund is open to applications until Feb. 7th2019-01-17Oliver Habryka Effective Altruism ForumEffective Altruism Funds Request for proposalsCross-posted to LessWrong at https://www.lesswrong.com/posts/dvGE8JSeFHtmHC6Gb/ea-funds-long-term-future-fund-is-open-to-applications-until The post seeks proposals for the Long-Term Future Fund. Proposals must be submitted by 2019-02-07 at https://docs.google.com/forms/d/e/1FAIpQLSeDTbCDbnIN11vcgHM3DKq6M0cZ3itAy5GIPK17uvTXcz8ZFA/viewform?usp=sf_link to be considered for the round of grants being announced mid-February. From the application, excerpted in the post: "We are particularly interested in small teams and individuals that are trying to get projects off the ground, or that need less money than existing grant-making institutions are likely to give out (i.e. less than ~$100k, but more than $10k). Here are a few examples of project types that we're open to funding an individual or group for (note that this list is not exhaustive)"
EA Meta Fund AMA: 20th Dec 20182018-12-19Alex Foster Denise Melchin Matt Wage Effective Altruism ForumEffective Altruism Funds Effective Altruism Funds Donee AMAThe post is an Ask Me Anything (AMA) for the Effective Altruism Meta Fund. The questions and answers are in the post comments. Questions are asked by a number of people including alexherwix, Luke Muehlhauser, Tee Barnett, and Peter Hurford. Answers are provided by Denise Melchin, Matt Wage, and Alex Foster, three of the five people managing the fund. The other two, Luke Ding and Tara MacAulay, do not post any comment replies, but are referenced in some of the replies. The questions include how the meta fund sees its role, how much time they expect to spend allocating grants, what sort of criteria they use for evaluating opportunities, and what data inform their decisions
Long-Term Future Fund AMA2018-12-18Helen Toner Oliver Habryka Alex Zhu Matt Fallshaw Effective Altruism ForumEffective Altruism Funds Effective Altruism Funds Donee AMAThe post is an Ask Me Anything (AMA) for the Long-Term Future Fund. The questions and answers are in the post comments. Questions are asked by a number of people including Luke Muehlhauser, Josh You, Peter Hurford, Alex Foster, and Robert Jones. Fund managers Oliver Habryka, Matt Fallshaw, Helen Toner, and Alex Zhu respond in the comments. Fund manager Matt Wage does not appear to have participated. Questions cover the amount of time spent evaluating grants, the evaluation criteria, the methods of soliciting grants, and research that would help the team
EA Funds: Long-Term Future fund is open to applications until November 24th (this Saturday)2018-11-20Oliver Habryka Effective Altruism ForumEffective Altruism Funds Request for proposalsThe post seeks proposals for the CEA Long-Term Future Fund. Proposals must be submitted by 2018-11-24 at https://docs.google.com/forms/d/e/1FAIpQLSf46ZTOIlv6puMxkEGm6G1FADe5w5fCO3ro-RK6xFJWt7SfaQ/viewform in order to be considered for the round of grants to be announced by the end of November 2018
Staff Members’ Personal Donations for Giving Season 20172017-12-18Holden Karnofsky Open Philanthropy ProjectHolden Karnofsky Alexander Berger Nick Beckstead Helen Toner Claire Zabel Lewis Bollard Ajeya Cotra Morgan Davis Michael Levine GiveWell top charities GiveWell GiveDirectly EA Giving Group Berkeley Existential Risk Initiative Effective Altruism Funds Sentience Institute Encompass The Humane League The Good Food Institute Mercy For Animals Compassion in World Farming USA Animal Equality Donor lottery Against Malaria Foundation GiveDirectly Periodic donation list documentationOpen Philanthropy Project staff members describe where they are donating this year, and the considerations that went into the donation decision. By policy, amounts are not disclosed. This is the first standalone blog post of this sort by the Open Philanthropy Project; in previous years, the corresponding donations were documented in the GiveWell staff members donation post
LW 2.0 Strategic Overview2017-09-14Oliver Habryka LessWrongCentre for Effective Altruism Effective Altruism Grants Eric Rogstad LessWrong 2.0 Miscellaneous commentaryHabryka describes his plans for LessWrong 2.0 as its primary developer.
Welcome to Lesswrong 2.02017-06-18Oliver Habryka LessWrong LessWrong 2.0 LaunchPost outlines the thinking behind LessWrong 2.0, covering changes to the codebase, moderation, and discourse norms.
Staff members’ personal donations for giving season 20162016-12-09Natalie Crispin GiveWellElie Hassenfeld Holden Karnofsky Natalie Crispin Alexander Berger Timothy Telleen-Lawton Josh Rosenberg Rebecca Raible Helen Toner Sophie Monahan Laura Muñoz Catherine Hollander Andrew Martin Lewis Bollard Chelsea Tabart Sarah Ward Chris Somerville Ajeya Cotra Chris Smith Isabel Arjmand A political campaign GiveWell top charities International Genetically Engineered Machine Foundation UPMC Center for Health Security Donor lottery EA Giving Group GiveDirectly Center for Applied Rationality Malaria Consortium Animal Charity Evaluators Northwest Health Law Advocates StrongMinds Against Malaria Foundation Schistosomiasis Control Initiative The Humane Society of the United States The Humane League Mercy For Animals Humane Society International Compassion in World Farming USA The Good Food Institute Citizens for Farm Animal Protection END Fund Causa Justa Planned Parenthood International Refugee Assistance Project Periodic donation list documentationGiveWell and Open Philanthropy Project staff describe their annual donation plans for 2016. Some of these are tentative and get superseded by further events. Also, not all employees are present in the document (participation is optional). Amounts donated are not included, per a decision by GiveWell
Donor lotteries: demonstration and FAQ2016-12-07Carl Shulman Effective Altruism ForumTimothy Telleen-Lawton Gregory Lewis Ajeya Cotra Rohin Shah Helen Toner Nicole Ross Howie Lempel Rebecca Raible Pablo Stafforini Aaron Gertler Brayden McLean Benjamin Hoffman Catherine Olsson Eric Herboso Ian David Moss Glenn Willen Jacob Steinhardt Brandon Reinhart Donor lottery Donee donation caseCarl Shulman announces a donor lottery coordinated/sponsored by Paul Christiano, and provides a FAQ discussing questions people might have for participating in the lottery
Staff members’ personal donations for giving season 20152015-12-09Elie Hassenfeld GiveWellElie Hassenfeld Holden Karnofsky Natalie Crispin Alexander Berger Timothy Telleen-Lawton Sean Conley Josh Rosenberg Jake Marcus Rebecca Raible Milan Griffes Helen Toner Sophie Monahan Laura Muñoz Catherine Hollander Andrew Martin Claire Zabel Nicole Ross Lewis Bollard GiveWell top charities Against Malaria Foundation GiveWell GiveDirectly Wikimedia Foundation Center for Global Development Martha’s Table Country Dance and Song Society Northwest Health Law Advocates Mercy For Animals The Humane League Animal Charity Evaluators Raising for Effective Giving Humane Society of the United States Periodic donation list documentationGiveWell and Open Philanthropy Project staff describe their annual donation plans for 2015. Some of these are tentative and get superseded by further events. Also, not all employees are present in the document (participation is optional). Amounts donated are not included, per a decision by GiveWell

Full list of donations in reverse chronological order (48 donations)

Donor | Donee | Amount (current USD) | Donation date | Cause area | URL | Notes
Effective Altruism FundsEffective Altruism Zürich17,900.002019-04-07AI safetyhttps://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl The grant reasoning is written up by Helen Toner and is also included in the cross-post of the grant decision to the Effective Altruism Forum at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions but the comments on the post do not discuss this specific grant. Earmark: Alex Lintz; Percentage of total donor spend in the corresponding batch of donations: 1.93%.
Effective Altruism FundsTessa Alexanian26,250.002019-04-07Biosecurity and pandemic preparednesshttps://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl The grant reasoning is written up by Matt Wage and is also included in the cross-post of the grant decision to the Effective Altruism Forum at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions but the comments on the post do not discuss this specific grant. Percentage of total donor spend in the corresponding batch of donations: 2.84%.
Effective Altruism FundsShahar Avin40,000.002019-04-07AI safetyhttps://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl The grant reasoning is written up by Matt Wage and is also included in the cross-post of the grant decision to the Effective Altruism Forum at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions but the comments on the post do not discuss this specific grant. Percentage of total donor spend in the corresponding batch of donations: 4.33%.
Effective Altruism FundsLucius Caviola50,000.002019-04-07Effective altruism/long-termismhttps://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl The grant reasoning is written up by Matt Wage and is also included in the cross-post of the grant decision to the Effective Altruism Forum at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions but the comments on the post do not discuss this specific grant. Percentage of total donor spend in the corresponding batch of donations: 5.42%.
Effective Altruism FundsOught50,000.002019-04-07AI safetyhttps://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl The grant reasoning is written up by Matt Wage and is also included in the cross-post of the grant decision to the Effective Altruism Forum at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions but the comments on the post do not discuss this specific grant. Percentage of total donor spend in the corresponding batch of donations: 5.42%.
Effective Altruism FundsNikhil Kunapuli30,000.002019-04-07AI safety/deconfusion researchhttps://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl The grant reasoning is written up by Alex Zhu and is also included in the cross-post of the grant decision to the Effective Altruism Forum at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions but the comments on the post do not discuss this specific grant. Percentage of total donor spend in the corresponding batch of donations: 3.25%.
Effective Altruism FundsAnand Srinivasan30,000.002019-04-07AI safety/deconfusion researchhttps://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl The quality of the grantee's work will be judged by researchers at the Machine Intelligence Research Institute. The grant reasoning is written up by Alex Zhu and is also included in the cross-post of the grant decision to the Effective Altruism Forum at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions but the comments on the post do not discuss this specific grant. Percentage of total donor spend in the corresponding batch of donations: 3.25%.
Effective Altruism FundsEffective Altruism Russia28,000.002019-04-07Rationality improvementhttps://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions There is a lot of criticism and discussion of the grant in the comments. Earmark: Mikhail Yugadin; Percentage of total donor spend in the corresponding batch of donations: 3.03%.
Effective Altruism FundsAlex Turner30,000.002019-04-07AI safety/agent foundationshttps://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions The comments on the post do not discuss this specific grant. Percentage of total donor spend in the corresponding batch of donations: 3.25%.
Effective Altruism FundsDavid Girardo30,000.002019-04-07AI safety/deconfusion researchhttps://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl Grantee is doing independent deconfusion research for AI safety. His angle of attack is to elucidate the ontological primitives for representing hierarchical abstractions, drawing from his experience with type theory, category theory, differential geometry, and theoretical neuroscience. Tsvi Benson-Tilsen, a MIRI researcher, has also recommended that the grantee get funding. The grant reasoning is written up by Alex Zhu and is also included in the cross-post of the grant decision to the Effective Altruism Forum at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions but the comments on the post do not discuss this specific grant. Percentage of total donor spend in the corresponding batch of donations: 3.25%.
Effective Altruism FundsOrpheus Lummis10,000.002019-04-07AI safety/upskillinghttps://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl Grant for upskilling in contemporary AI techniques, deep RL, and AI safety, before pursuing an ML PhD. The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions The comments on the post do not discuss this specific grant. Percentage of total donor spend in the corresponding batch of donations: 1.08%.
Effective Altruism FundsTegan McCaslin30,000.002019-04-07AI safety/forecastinghttps://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl Grant for independent research projects relevant to AI forecasting and strategy. The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions Habryka's reasons for the grant include: (1) It's easier to relocate someone who has already demonstrated trust and skills than to find someone completely new, (2) It's important to give good researchers runway while they find the right place. Habryka notes: "my brief assessment of Tegan’s work was not the reason why I recommended this grant, and if Tegan asks for a new grant in 6 months to focus on solo research, I will want to spend significantly more time reading her output and talking with her, to understand how these questions were chosen and what precise relation they have to forecasting technological progress in AI." The comments on the post do not discuss this specific grant, but a grant to Lauren Lee that includes somewhat similar reasoning (providing people runway after they leave their jobs, so they can explore better) attracts some criticism. Percentage of total donor spend in the corresponding batch of donations: 3.25%.
Effective Altruism FundsMetaculus70,000.002019-04-07Forecastinghttps://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl Grant to Anthony Aguirre to expand the Metaculus prediction platform along with its community. Metaculus.com is a fully functional prediction platform with ~10,000 registered users and more than 120,000 predictions made to date on more than 1,000 questions. The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions Part of Habryka's grant reasoning also applies to the grants made to Ozzie Gooen and Jacob Lagerros in the same round of grants. He also mentions that the requested grant amount was $150,000, but he did not feel comfortable granting the full amount. The comments discuss this and the other forecasting grants, and include the question "why are you acting as grant-givers here rather than as special interest investors?" It is also included in a list of potentially concerning grants in a portfolio evaluation comment https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions#d4YHzSJnNWmyxf6HM by Evan Gaensbauer. Earmark: Anthony Aguirre; Percentage of total donor spend in the corresponding batch of donations: 7.58%.
Effective Altruism FundsLauren Lee20,000.002019-04-07Rationality communityhttps://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl Grant for working to prevent burnout and boost productivity within the EA and X-risk communities. The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions Habryka describes his grant reasoning as follows: "In sum, this grant hopefully helps Lauren to recover from burning out, get the new rationality projects she is working on off the ground, potentially identify a good new niche for her to work in (alone or at an existing organization), and write up her ideas for the community." He also qualifies the likelihood of giving another grant as follows: "I think that she should probably aim to make whatever she does valuable enough that individuals and organizations in the community wish to pay her directly for her work. It’s unlikely that I would recommend renewing this grant for another 6 month period in the absence of a relatively exciting new research project/direction, and if Lauren were to reapply, I would want to have a much stronger sense that the projects she was working on were producing lots of value before I decided to recommend funding her again." The grant receives criticism in the comments, including 'This is ridiculous, I'm sure she's a great person but please don't use the gift you received to provide sinecures to people "in the community"'. Percentage of total donor spend in the corresponding batch of donations: 2.17%.
Effective Altruism FundsForetold70,000.002019-04-07Forecastinghttps://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl Grant to Ozzie Gooen mainly for Foretold, a forecasting application available at https://www.foretold.io/ (currently requires contacting Gooen for early access) that handles full probability distributions. The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions His reasons for the grant include: (1) confusion about what counts as progress and what specific problems need solving, (2) the need for many people to collaborate on solving problems and to document the progress they have made so far, (3) low-hanging fruit in designing better online platforms for making intellectual progress (the reason Habryka chose to work on LessWrong, the AI Alignment Forum, and the EA Forum), with Gooen having past experience in the space from building Guesstimate, (4) the promise and tractability of forecasting platforms in particular (for instance, work by Philip Tetlock and work by Robin Hanson), (5) even though some platforms, such as PredictionBook and Guesstimate, did not get the traction they expected, others like the Good Judgment Project have been successful, so one should not overgeneralize from a few failures, (6) Habryka has a positive impression of Gooen both from in-person interaction and from his online writing. The reasoning for the grant is similar to the reasoning for the grants to Anthony Aguirre (Metaculus) and Jacob Lagerros made in the same round. The comments discuss this and the other forecasting grants, and include the question "why are you acting as grant-givers here rather than as special interest investors?" It is also included in a list of potentially concerning grants in a portfolio evaluation comment https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions#d4YHzSJnNWmyxf6HM by Evan Gaensbauer. Earmark: Ozzie Gooen; Percentage of total donor spend in the corresponding batch of donations: 7.58%.
Effective Altruism FundsAI Safety Camp25,000.002019-04-07AI safetyhttps://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl Grantee in the grant document is listed as Johannes Heidecke, but the grant is for the AI Safety Camp. Grant is for supporting aspiring researchers of AI alignment to boost themselves into productivity. The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions The grant decision was coordinated with Effective Altruism Grants (specifically, Nicole Ross of CEA), which had also considered making a grant to the camp. Effective Altruism Grants ultimately decided against making the grant, and the Long Term Future Fund made it instead. Reasons for the grant include a positive impression of AI Safety Camp and the people involved, despite not knowing the organizers well. Another reason is a lack of opportunities for people in Europe to productively work on AI alignment related problems. Concerns include the difficulty of organizing long in-person events, and frictions between attendees and organizers driving people away from AI safety. Similar concerns were also expressed by Nicole Ross in the evaluation by EA Grants. Earmark: Johannes Heidecke; Percentage of total donor spend in the corresponding batch of donations: 2.71%.
Effective Altruism FundsKocherga50,000.002019-04-07Rationality communityhttps://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl Grant to Vyacheslav Matyuhin for Kocherga, an offline community hub for rationalists and EAs in Moscow. The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions Habryka notes that the Russian rationality community has been successful, with projects such as https://lesswrong.ru (a Russian translation of the LessWrong sequences), a kickstarter to distribute copies of HPMOR, and Kocherga, a financially self-sustaining anti-cafe in Moscow that hosts a variety of events for roughly 100 attendees per week. The grant is to the team working on Kocherga. The grant reasoning references the LessWrong post https://www.lesswrong.com/posts/WmfapdnpFfHWzkdXY/rationalist-community-hub-in-moscow-3-years-retrospective by Kocherga. The grant is being made by the Long Term Future Fund because the EA Meta Fund decided not to make it. Earmark: Vyacheslav Matyuhin; affected countries: Russia; Percentage of total donor spend in the corresponding batch of donations: 5.42%.
Effective Altruism FundsJacob Lagerros27,000.002019-04-07AI safety/forecastinghttps://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl Grant to build a private platform where AI safety and policy researchers have direct access to a base of superforecaster-equivalents. Lagerros previously received two grants to work on the project: a half-time salary from Effective Altruism Grants, and a grant for direct project expenses from Berkeley Existential Risk Initiative. The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions Habryka notes the same object-level reasons for the grant as for grants to Ozzie Gooen (for Foretold) and Anthony Aguirre (for Metaculus). He also mentions Lagerros having been around the community for 3 years, having done useful work, and having received other funding. Habryka mentions he did not assess the grant in detail; the main reason for granting from the Long Term Future Fund was logistical complications with other grantmakers (FHI and BERI), who had already vouched for the value of the project. Percentage of total donor spend in the corresponding batch of donations: 2.92%.
Effective Altruism FundsConnor Flexman20,000.002019-04-07AI safety/forecastinghttps://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl Grant to perform independent research in collaboration with John Salvatier. The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions Habryka was the primary person on whose recommendation the grant was made. He notes two reservations: a potential conflict of interest because he lives in the same house as the recipient, and a lack of concrete, externally verifiable evidence of competence. The grant was originally requested by John Salvatier, as a grant to Salvatier to hire Flexman to help him, but Habryka ultimately decided to give the money to Flexman directly, giving him more flexibility to switch if the work with Salvatier does not go well. Despite the reservations, Habryka considers significant negative consequences unlikely. Percentage of total donor spend in the corresponding batch of donations: 2.17%.
Effective Altruism FundsEli Tyre30,000.002019-04-07Rationality improvementhttps://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl Grant to support projects for rationality and community building interventions. The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions Example projects: facilitating conversations between top people in AI alignment, organizing advanced workshops on double crux, doing independent research projects such as https://www.lesswrong.com/posts/tj8QP2EFdP8p54z6i/historical-mathematicians-exhibit-a-birth-order-effect-too (evaluating birth order effects in mathematicians), providing new EAs and rationalists with advice and guidance on how to get traction on working on important problems, and helping John Salvatier develop techniques around skill transfer. Percentage of total donor spend in the corresponding batch of donations: 3.25%.
Effective Altruism FundsRobert Miles39,000.002019-04-07AI safety/content creation/videohttps://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl Grant to create video content on AI alignment. Grantee has a YouTube channel at https://www.youtube.com/channel/UCLB7AzTwc6VFZrBsO2ucBMg (average 20,000 views per video) and also creates videos for the Computerphile channel https://www.youtube.com/watch?v=3TYT1QfdfsM&t=2s (often more than 100,000 views per video). The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions Reasons Habryka favors the grant: (1) Grantee explains AI alignment as primarily a technical problem, not a moral or political problem, (2) Grantee does not politicize AI safety, (3) Grantee's goal is to create interest in these problems from future researchers, and not to simply get as large an audience as possible. Habryka notes that the grantee is the first skilled person in the X-risk community working full-time on producing video content, and that as the community's best person in this skill area he can help in a number of novel ways (for example, he is already helping existing organizations produce videos about their ideas). In the previous grant round, the grantee had requested funding for a collaboration with RAISE to produce videos for them, but Habryka felt it was better to fund the grantee directly and allow him to decide which organizations he wanted to help with his videos. Percentage of total donor spend in the corresponding batch of donations: 4.22%.
Effective Altruism FundsMachine Intelligence Research Institute50,000.002019-04-07AI safetyhttps://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions Habryka believes that MIRI is making real progress in its approach of "creating a fundamental piece of theory that helps humanity to understand a wide range of powerful phenomena". He notes that MIRI started work on the alignment problem long before it became cool, which gives him more confidence that they will do the right thing and that even their seemingly weird actions may be justified in ways that are not yet obvious. He also thinks that both the research team and the ops staff are quite competent. Despite this, Habryka recommends a relatively small grant to MIRI, because they are already relatively well-funded and are not heavily bottlenecked on funding. He nevertheless decides to grant some amount, and says he will think more about the appropriate funding level before the next funding round. Percentage of total donor spend in the corresponding batch of donations: 5.42%.
Effective Altruism FundsCenter for Applied Rationality150,000.002019-04-07Rationality improvementhttps://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions The grant is for general support and is the largest grant in this round of grants from the Long Term Future Fund. The grant fills a dire funding gap that CFAR faces because they did not run a fundraiser in 2018, a decision they made because they felt that running a fundraiser would be in bad taste after what they considered a mishandling on their part of the Brent Dill situation. Habryka also thinks that CFAR intro workshops have been very valuable, both in terms of improving epistemic norms in the EA and rationality community and in terms of helping grow interest in AI safety. Habryka also thinks that CFAR is in a tough spot right now, with many of its founding employees having left, difficulty attracting top talent, and a dilution of the original truth-seeking focus. Habryka concludes by saying: "I didn’t have enough time this grant round to understand how the future of CFAR will play out; the current grant amount seems sufficient to ensure that CFAR does not have to take any drastic action until our next grant round. By the next grant round, I plan to have spent more time learning and thinking about CFAR’s trajectory and future, and to have a more confident opinion about what the correct funding level for CFAR is." In the comments, Milan Griffes asks why such a large, unrestricted grant is being made to CFAR despite these concerns, and also what Habryka hopes to learn about CFAR before the next grant round. There are replies from Peter McCluskey and Habryka, with some further comment back-and-forth. Percentage of total donor spend in the corresponding batch of donations: 16.25%.
Effective Altruism Funds80,000 Hours170,000.002019-03-07Effective altruism/movement growth/career counselinghttps://app.effectivealtruism.org/funds/ea-community/payouts/1hVfcvrzRbpXUWYht4bu3b Reasons for grant: (1) High impact per dollar, (2) Highly impactful and cost-effective in the past, for same reasons as discussed in https://app.effectivealtruism.org/funds/ea-community/payouts/2dyBJqJBSIq6sAGU6gMYQW (previous grant), (3) Undergoing significant growth, (4) Not yet filled funding gap, (5) Big potential upside is reducing talent bottlenecks in cause areas that are crucial and highly technically challenging. Other than (5), all the other reasons are shared as stated with another simultaneous grant made to Founders Pledge. Percentage of total donor spend in the corresponding batch of donations: 33.20%.
Effective Altruism FundsFounders Pledge110,000.002019-03-07Effective altruism/movement growthhttps://app.effectivealtruism.org/funds/ea-community/payouts/1hVfcvrzRbpXUWYht4bu3b Reasons for grant: (1) High impact per dollar, (2) Highly impactful and cost-effective in the past, for same reasons as discussed in https://app.effectivealtruism.org/funds/ea-community/payouts/2dyBJqJBSIq6sAGU6gMYQW (previous grant), (3) Undergoing significant growth, (4) Not yet filled funding gap, (5) Big potential upside is long-lasting positive effect on the culture of smart major philanthropy. Other than (5), all the other reasons are shared as stated with another simultaneous grant made to 80,000 Hours. Percentage of total donor spend in the corresponding batch of donations: 21.48%.
Effective Altruism FundsLucius Caviola80,000.002019-03-07Effective altruism/talent pipelinehttps://app.effectivealtruism.org/funds/ea-community/payouts/1hVfcvrzRbpXUWYht4bu3b Reasons for grant: (1) Lucius Caviola is a well-respected PhD-level academic, with a focus on effective altruism and long-termism, (2) Lucius has been accepted to work for two years as a postdoc researcher in psychology with the highly renowned professor Joshua Greene at Harvard University, on the condition that he can bring his own research funding, (3) Donor believes that very high-quality academic research is a highly impactful activity in expectation, (4) Psychology at the higher levels has been very influential, (5) Donor believes that where foundational researchers focusing on effective altruism have displayed excellence, they should not be bottlenecked on funding considerations wherever possible, (6) This is a time-bounded, specific opportunity that requires funding to initiate and explore, and donor believes both that the value of information from this speculative grant is high, and that the project could have a large potential upside through increasing the quality and quantity of information available to address the world’s biggest problems. Percentage of total donor spend in the corresponding batch of donations: 15.63%.
Effective Altruism FundsForethought Foundation for Global Priorities Research64,000.002019-03-07Cause prioritizationhttps://app.effectivealtruism.org/funds/ea-community/payouts/1hVfcvrzRbpXUWYht4bu3b Donor believes that academic research is a high-impact but high-investment way to expand the frontiers of discussion and explore complex, nuanced topics. Percentage of total donor spend in the corresponding batch of donations: 12.50%.
Effective Altruism FundsEffective Altruism Norway18,000.002019-03-07Effective altruism/talent pipelinehttps://app.effectivealtruism.org/funds/ea-community/payouts/1hVfcvrzRbpXUWYht4bu3b Grant to run an Operations Camp in Oslo, with the aim of upskilling promising talent and providing opportunities to test fit in key operations roles. Percentage of total donor spend in the corresponding batch of donations: 3.52%.
Effective Altruism FundsEffective Altruism Geneva18,000.002019-03-07Cause prioritizationhttps://app.effectivealtruism.org/funds/ea-community/payouts/1hVfcvrzRbpXUWYht4bu3b Grant to run a policy research project to determine what prioritization models are appropriate to run under what circumstances. Percentage of total donor spend in the corresponding batch of donations: 3.52%.
Effective Altruism FundsEffective Altruism Netherlands17,000.002019-03-07Effective altruism/fundraisinghttps://app.effectivealtruism.org/funds/ea-community/payouts/1hVfcvrzRbpXUWYht4bu3b Grant to help 35 charities that are either GiveWell-recommended or have received grants from the Open Philanthropy Project to achieve tax-deductible status in the Netherlands. Affected countries: Netherlands; Percentage of total donor spend in the corresponding batch of donations: 3.32%.
Effective Altruism FundsOne for the World15,000.002019-03-07Effective altruism/fundraisinghttps://app.effectivealtruism.org/funds/ea-community/payouts/1hVfcvrzRbpXUWYht4bu3b Grant to support the creation and growth of chapters at undergraduate, MBA and law schools. Chapter leaders and student ambassadors encourage their classmates to commit to donating a percentage of their income upon graduation. Percentage of total donor spend in the corresponding batch of donations: 2.93%.
Effective Altruism FundsRethink Priorities10,000.002019-03-07Cause prioritizationhttps://app.effectivealtruism.org/funds/ea-community/payouts/1hVfcvrzRbpXUWYht4bu3b Grantee is currently focused on how to apply cost-effectiveness frameworks to uncertain domains, interventions aimed at animal welfare, and understanding EA movement growth. Percentage of total donor spend in the corresponding batch of donations: 1.95%.
Effective Altruism FundsThe Life You Can Save10,000.002019-03-07Effective altruism/movement growthhttps://app.effectivealtruism.org/funds/ea-community/payouts/1hVfcvrzRbpXUWYht4bu3b Grant to support the tenth anniversary relaunch of Peter Singer's book The Life You Can Save; grantee is the eponymous organization. Project has an immediate, one-off funding requirement and donor believes it has high upside, judging from the impact that the original book had. Percentage of total donor spend in the corresponding batch of donations: 1.95%.
Effective Altruism FundsMachine Intelligence Research Institute40,000.002018-11-29AI safetyhttps://app.effectivealtruism.org/funds/far-future/payouts/3JnNTzhJQsu4yQAYcKceSi Grant made from the Long-Term Future Fund. Donor believes that the new research directions outlined by donee at https://intelligence.org/2018/11/22/2018-update-our-new-research-directions/ are promising, and the donee's fundraising post suggests it could productively absorb additional funding. Percentage of total donor spend in the corresponding batch of donations: 41.88%.
Effective Altruism FundsOught10,000.002018-11-29AI safetyhttps://app.effectivealtruism.org/funds/far-future/payouts/3JnNTzhJQsu4yQAYcKceSi Grant made to implement AI alignment concepts in real-world applications. Donee seems more hiring-constrained than fundraising-constrained, hence only a small amount, but donor does believe that donee has a promising approach. Percentage of total donor spend in the corresponding batch of donations: 10.47%.
Effective Altruism FundsAI summer school21,000.002018-11-29AI safetyhttps://app.effectivealtruism.org/funds/far-future/payouts/3JnNTzhJQsu4yQAYcKceSi Grant to fund the second year of a summer school on AI safety, aiming to familiarize potential researchers with interesting technical problems in the field. Last year’s iteration of this event appears to have gone well, per https://www.lesswrong.com/posts/bXLi3n2jrfqRwoSTH/human-aligned-ai-summer-school-a-summary and private information available to donor. Donor believes that well-run education efforts of this kind are valuable (where “well-run” refers to the quality of the intellectual content, the participants, and the logistics of the event), and feels confident enough that this particular effort will be well-run. Percentage of total donor spend in the corresponding batch of donations: 21.99%.
Effective Altruism FundsForetold20,000.002018-11-29Forecastinghttps://app.effectivealtruism.org/funds/far-future/payouts/3JnNTzhJQsu4yQAYcKceSi Ozzie Gooen sought funding to build an online community of EA forecasters, researchers, and data scientists to predict variables of interest to the EA community. Ozzie proposed using the platform to answer a range of questions, including examples like “How many Google searches will there be for reinforcement learning in 2020?” or “How many plan changes will 80,000 hours cause in 2020?”, and using the results to help EA organizations and individuals to prioritize. The grant decision was made based on past success by Ozzie Gooen with Guesstimate https://www.getguesstimate.com/ as well as belief both in the broad value of the project and the specifics of the project plan. The project would later be named Foretold, and receive a followup grant of $70,000 in the April 2019 round of grants https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl from the Long Term Future Fund; see also https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions for more detail. Earmark: Ozzie Gooen; Percentage of total donor spend in the corresponding batch of donations: 20.94%.
Effective Altruism FundsAI Safety Unconference4,500.002018-11-29AI safetyhttps://app.effectivealtruism.org/funds/far-future/payouts/3JnNTzhJQsu4yQAYcKceSi Orpheus Lummis and Vaughn DiMarco are organizing an unconference on AI Alignment on the last day of the NeurIPS conference, with the goal of facilitating networking and research on AI Alignment among a diverse audience of AI researchers with and without safety backgrounds. Based on interaction with the organizers and some participants, the donor feels this project is worth funding. However, the donee is still not sure if the unconference will be held, so the grant is conditional on the donee deciding to proceed. The grant would fully fund the request. Percentage of total donor spend in the corresponding batch of donations: 4.71%.
Effective Altruism Funds80,000 Hours45,000.002018-11-29Effective altruism/movement growth/career counselinghttps://app.effectivealtruism.org/funds/ea-community/payouts/2dyBJqJBSIq6sAGU6gMYQW The grant write-up gives a number of factors that influence the grant decision. These include the excellent track record of 80,000 Hours, including cost-effectiveness and rapid growth, the planned 2019 expansion of 80,000 Hours, and the importance of the work 80,000 Hours is doing. The write-up concludes: "At the margin, this grant is expected to contribute towards the budget for their expanded team. A typical marginal hire could give one-on-one advice to around 300 additional people per year, which 80,000 Hours estimates will produce around 400 impact-adjusted plan changes over the following two years. This seems like a highly effective use of funding.". Percentage of total donor spend in the corresponding batch of donations: 34.88%.
Effective Altruism FundsFounders Pledge16,000.002018-11-29Effective altruism/movement growthhttps://app.effectivealtruism.org/funds/ea-community/payouts/2dyBJqJBSIq6sAGU6gMYQW The grant write-up talks about how Founders Pledge has a huge amount of money pledged and has successfully moved a lot of money, with a rapid growth rate. The grant money will support their 2019 plans to expand their US operations and further develop their methodology to evaluate and support high-impact interventions. Percentage of total donor spend in the corresponding batch of donations: 12.40%.
Effective Altruism FundsCentre for Effective Altruism15,000.002018-11-29Effective altruism/movement growthhttps://app.effectivealtruism.org/funds/ea-community/payouts/2dyBJqJBSIq6sAGU6gMYQW Grant write-up talks of the strong track record of the CEA and their success at introducing EA concepts to a large number of people. The write-up concludes: "CEA is aiming to raise $3 million for their 2019 budget, and currently has significant room for more funding due to expansion efforts. At the margin, this grant is expected to contribute towards CEA’s expansion of their events, technology and community health teams.". Percentage of total donor spend in the corresponding batch of donations: 11.63%.
Effective Altruism FundsGlobal Priorities Institute14,000.002018-11-29Cause prioritizationhttps://app.effectivealtruism.org/funds/ea-community/payouts/2dyBJqJBSIq6sAGU6gMYQW Grant write-up notes that donee (Global Priorities Institute) is solving an important problem of spreading EA-style ideas in the academic and policy worlds, and has shown impressive progress in its first year. The write-up concludes: "This grant is expected to contribute to GPI’s plans to grow its research team, particularly in economics, in order to publish a strong initial body of papers that defines their research focus. GPI also plans to sponsor DPhil students engaging in global priorities research at the University of Oxford through scholarships and prizes, and to give close support to early career academics with its summer visiting program.". Percentage of total donor spend in the corresponding batch of donations: 10.85%.
Effective Altruism FundsCharity Entrepreneurship14,000.002018-11-29Charity incubationhttps://app.effectivealtruism.org/funds/ea-community/payouts/2dyBJqJBSIq6sAGU6gMYQW Grant to support the incubation of new charities, which the donor believes to be highly valuable. The write-up concludes: "Marginal donations will allow Charity Entrepreneurship to expand the number of applicants they can accept to the program and give larger seed grants. The latter is likely to increase both the number of people applying and the level of stability at which the new charities start off, allowing them to focus more on long-term impact rather than short-term fundraising.". Percentage of total donor spend in the corresponding batch of donations: 10.85%.
Effective Altruism FundsRaising for Effective Giving10,000.002018-11-29Effective altruism/meta/fundraisinghttps://app.effectivealtruism.org/funds/ea-community/payouts/2dyBJqJBSIq6sAGU6gMYQW Grant write-up talks of the huge fundraising multiplier of the donee (which is a fundraising organization) as well as its significant growth in recent years. The donor was also impressed by the KPI tracking and estimation of the donee, as well as their directing significant funds to organizations focused on the long-term future. The write-up concludes by saying that the donee "will use this grant to expand their HNW advisory work in 2019. At the margin, this funding will likely enable them to build their research capacity and provide more effective advice to their HNW audience. This will allow them to better target their fundraising for effective organisations." (HNW refers to high-net-worth donors.) Percentage of total donor spend in the corresponding batch of donations: 7.75%.
Effective Altruism FundsLet's Fund10,000.002018-11-29Effective altruismhttps://app.effectivealtruism.org/funds/ea-community/payouts/2dyBJqJBSIq6sAGU6gMYQW Grant write-up notes that both the idea and the team are potentially promising, and the quality of initial work is good. It concludes: "Let’s Fund will use additional capital to increase their fundraising ratio through careful outreach and marketing. In the future, they aim to undertake more and better research, especially into global catastrophic risk reduction, and have it evaluated and peer-reviewed by other researchers.". Percentage of total donor spend in the corresponding batch of donations: 7.75%.
Effective Altruism FundsBerkeley REACH5,000.002018-11-29Rationality communityhttps://app.effectivealtruism.org/funds/ea-community/payouts/2dyBJqJBSIq6sAGU6gMYQW Grant is to provide funds to support the panel for dispute resolution, as the donor "would like the panel to succeed in becoming a trusted resource for the local community, and want to ensure they have the capacity to engage in training or seek professional advice where this could be useful." The grant write-up concludes: "This grant will provide the panel with readily available funding in the event that they wish to seek legal counsel, the services of an independent investigator, or other professional support, particularly if they are faced with a time-sensitive matter.". Percentage of total donor spend in the corresponding batch of donations: 3.88%.
Open Philanthropy ProjectCenter for a New American Security260,000.002017-08Global catastrophic riskshttps://www.openphilanthropy.org/focus/global-catastrophic-risks/miscellaneous/center-for-a-new-american-security-richard-danzig-outreach-on-technological-risk Grant awarded to support outreach by Richard Danzig, former Secretary of the Navy, on technological risks. Specifically, this funding will allow Danzig to revise and publish an already-drafted manuscript exploring and providing guidance on issues facing the US government related to potential risks from advanced technology (e.g., biosecurity, cybersecurity, and artificial intelligence risks). The funding would be used by Danzig to produce Technology Roulette https://www.cnas.org/publications/reports/technology-roulette a report intended for the national security community detailing the management of risks from losing control of advanced technology. Earmark: Richard Danzig; announced: 2017-10-16.
Open Philanthropy ProjectUCLA School of Law1,536,222.002017-05AI safetyhttps://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/ucla-artificial-intelligence-governance Grant to support work on governance related to AI risk led by Edward Parson and Richard Re. Earmark: Edward Parson,Richard Re; announced: 2017-07-27.
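
Note on the percentage figures above: the "Percentage of total donor spend in the corresponding batch of donations" is simply the donation amount divided by the sum of all donations from the same donor with the same donation date. For example, the 2019-04-07 Long-Term Future Fund batch listed above comprises 23 grants totaling $923,150, so the $17,900 grant to Effective Altruism Zürich is roughly 1.9% of that batch. The sketch below is illustrative only: it is not the portal's actual code, and the three-grant donation list in it is a hypothetical excerpt, so its printed percentages only approximately match the table when the full batch is supplied.

    # Illustrative sketch (not the portal's actual code) of how the
    # "Percentage of total donor spend in the corresponding batch of donations"
    # column can be reproduced: each amount is divided by the total of all
    # donations sharing the same donor and donation date.
    from collections import defaultdict

    # Hypothetical excerpt of the 2019-04-07 batch; the real batch has 23
    # grants totaling $923,150, so percentages below differ from the table.
    donations = [
        ("Effective Altruism Funds", "2019-04-07", "Effective Altruism Zürich", 17900.00),
        ("Effective Altruism Funds", "2019-04-07", "Metaculus", 70000.00),
        ("Effective Altruism Funds", "2019-04-07", "Center for Applied Rationality", 150000.00),
    ]

    batch_totals = defaultdict(float)
    for donor, date, _donee, amount in donations:
        batch_totals[(donor, date)] += amount

    for donor, date, donee, amount in donations:
        share = 100 * amount / batch_totals[(donor, date)]
        print(f"{donee}: {share:.2f}% of the {date} batch from {donor}")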

Donation amounts by donee and year

Donee Donors influenced Cause area Metadata Total 2019 2018 2017
UCLA School of Law Open Philanthropy Project (filter this donor) Tw WP Site 1,536,222.00 0.00 0.00 1,536,222.00
Center for a New American Security Open Philanthropy Project (filter this donor) 260,000.00 0.00 0.00 260,000.00
80,000 Hours Effective Altruism Funds (filter this donor) Career coaching/life guidance FB Tw WP Site 215,000.00 170,000.00 45,000.00 0.00
Center for Applied Rationality Effective Altruism Funds (filter this donor) Rationality FB Tw WP Site TW 150,000.00 150,000.00 0.00 0.00
Lucius Caviola Effective Altruism Funds (filter this donor) 130,000.00 130,000.00 0.00 0.00
Founders Pledge Effective Altruism Funds (filter this donor) Effective altruism/donor pledges FB Tw WP Site 126,000.00 110,000.00 16,000.00 0.00
Foretold Effective Altruism Funds (filter this donor) 90,000.00 70,000.00 20,000.00 0.00
Machine Intelligence Research Institute Effective Altruism Funds (filter this donor) AI safety FB Tw WP Site CN GS TW 90,000.00 50,000.00 40,000.00 0.00
Metaculus Effective Altruism Funds (filter this donor) 70,000.00 70,000.00 0.00 0.00
Forethought Foundation for Global Priorities Research Effective Altruism Funds (filter this donor) Cause prioritization Site 64,000.00 64,000.00 0.00 0.00
Ought Effective Altruism Funds (filter this donor) AI safety Site 60,000.00 50,000.00 10,000.00 0.00
Kocherga Effective Altruism Funds (filter this donor) 50,000.00 50,000.00 0.00 0.00
Shahar Avin Effective Altruism Funds (filter this donor) 40,000.00 40,000.00 0.00 0.00
Robert Miles Effective Altruism Funds (filter this donor) 39,000.00 39,000.00 0.00 0.00
Alex Turner Effective Altruism Funds (filter this donor) 30,000.00 30,000.00 0.00 0.00
Eli Tyre Effective Altruism Funds (filter this donor) 30,000.00 30,000.00 0.00 0.00
Nikhil Kunapuli Effective Altruism Funds (filter this donor) 30,000.00 30,000.00 0.00 0.00
David Girardo Effective Altruism Funds (filter this donor) 30,000.00 30,000.00 0.00 0.00
Tegan McCaslin Effective Altruism Funds (filter this donor) 30,000.00 30,000.00 0.00 0.00
Anand Srinivasan Effective Altruism Funds (filter this donor) 30,000.00 30,000.00 0.00 0.00
Effective Altruism Russia Effective Altruism Funds (filter this donor) 28,000.00 28,000.00 0.00 0.00
Jacob Lagerros Effective Altruism Funds (filter this donor) 27,000.00 27,000.00 0.00 0.00
Tessa Alexanian Effective Altruism Funds (filter this donor) 26,250.00 26,250.00 0.00 0.00
AI Safety Camp Effective Altruism Funds (filter this donor) 25,000.00 25,000.00 0.00 0.00
AI summer school Effective Altruism Funds (filter this donor) 21,000.00 0.00 21,000.00 0.00
Lauren Lee Effective Altruism Funds (filter this donor) 20,000.00 20,000.00 0.00 0.00
Connor Flexman Effective Altruism Funds (filter this donor) 20,000.00 20,000.00 0.00 0.00
Effective Altruism Norway Effective Altruism Funds (filter this donor) 18,000.00 18,000.00 0.00 0.00
Effective Altruism Geneva Effective Altruism Funds (filter this donor) 18,000.00 18,000.00 0.00 0.00
Effective Altruism Zürich Effective Altruism Funds (filter this donor) 17,900.00 17,900.00 0.00 0.00
Effective Altruism Netherlands Effective Altruism Funds (filter this donor) 17,000.00 17,000.00 0.00 0.00
One for the World Effective Altruism Funds (filter this donor) 15,000.00 15,000.00 0.00 0.00
Centre for Effective Altruism Effective Altruism Funds (filter this donor) Effective altruism/movement growth FB Site 15,000.00 0.00 15,000.00 0.00
Charity Entrepreneurship Effective Altruism Funds (filter this donor) 14,000.00 0.00 14,000.00 0.00
Global Priorities Institute Effective Altruism Funds (filter this donor) Cause prioritization Site 14,000.00 0.00 14,000.00 0.00
Orpheus Lummis Effective Altruism Funds (filter this donor) 10,000.00 10,000.00 0.00 0.00
Let's Fund Effective Altruism Funds (filter this donor) 10,000.00 0.00 10,000.00 0.00
Raising for Effective Giving Effective Altruism Funds (filter this donor) Effective altruism/Fundraising/Poker and sports FB Tw WP Site 10,000.00 0.00 10,000.00 0.00
Rethink Priorities Effective Altruism Funds (filter this donor) Cause prioritization Site 10,000.00 10,000.00 0.00 0.00
The Life You Can Save Effective Altruism Funds (filter this donor) Effective altruism/movement growth/fundraising FB Tw Site 10,000.00 10,000.00 0.00 0.00
Berkeley REACH Effective Altruism Funds (filter this donor) Rationality community FB Site 5,000.00 0.00 5,000.00 0.00
AI Safety Unconference Effective Altruism Funds (filter this donor) 4,500.00 0.00 4,500.00 0.00
Total ---- -- 3,455,872.00 1,435,150.00 224,500.00 1,796,222.00
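
As a quick arithmetic check, the year columns of the totals row above add up to the grand total, which also matches the donor-level table further down. A minimal illustrative sketch (not the portal's code), using only the figures from the totals row:

    # Illustrative consistency check using the totals row of the table above:
    # the per-year totals should sum to the overall total, which also matches
    # the grand total in the "by donor and year" table below.
    year_totals = {"2019": 1435150.00, "2018": 224500.00, "2017": 1796222.00}
    grand_total = 3455872.00

    assert abs(sum(year_totals.values()) - grand_total) < 0.01
    print(f"{sum(year_totals.values()):,.2f}")  # 3,455,872.00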

Graph of spending by donee and year (incremental, not cumulative)

Graph of spending should have loaded here

Graph of spending by donee and year (cumulative)

Graph of spending should have loaded here

Donation amounts by donor and year for influencer Oliver Habryka|Alex Zhu|Matt Wage|Helen Toner|Matt Fallshaw

Donor Donees Total 2019 2018 2017
Open Philanthropy Project (filter this donor) Center for a New American Security (filter this donee), UCLA School of Law (filter this donee) 1,796,222.00 0.00 0.00 1,796,222.00
Effective Altruism Funds (filter this donor) 80,000 Hours (filter this donee), AI Safety Camp (filter this donee), AI Safety Unconference (filter this donee), AI summer school (filter this donee), Alex Turner (filter this donee), Anand Srinivasan (filter this donee), Berkeley REACH (filter this donee), Center for Applied Rationality (filter this donee), Centre for Effective Altruism (filter this donee), Charity Entrepreneurship (filter this donee), Connor Flexman (filter this donee), David Girardo (filter this donee), Effective Altruism Geneva (filter this donee), Effective Altruism Netherlands (filter this donee), Effective Altruism Norway (filter this donee), Effective Altruism Russia (filter this donee), Effective Altruism Zürich (filter this donee), Eli Tyre (filter this donee), Forethought Foundation for Global Priorities Research (filter this donee), Foretold (filter this donee), Founders Pledge (filter this donee), Global Priorities Institute (filter this donee), Jacob Lagerros (filter this donee), Kocherga (filter this donee), Lauren Lee (filter this donee), Let's Fund (filter this donee), Lucius Caviola (filter this donee), Machine Intelligence Research Institute (filter this donee), Metaculus (filter this donee), Nikhil Kunapuli (filter this donee), One for the World (filter this donee), Orpheus Lummis (filter this donee), Ought (filter this donee), Raising for Effective Giving (filter this donee), Rethink Priorities (filter this donee), Robert Miles (filter this donee), Shahar Avin (filter this donee), Tegan McCaslin (filter this donee), Tessa Alexanian (filter this donee), The Life You Can Save (filter this donee) 1,659,650.00 1,435,150.00 224,500.00 0.00
Total -- 3,455,872.00 1,435,150.00 224,500.00 1,796,222.00

Graph of spending by donor and year (incremental, not cumulative)

Graph of spending should have loaded here

Graph of spending by donor and year (cumulative)

Graph of spending should have loaded here