This is an online portal with information on donations that were announced publicly (or shared with permission) and that are of interest to Vipul Naik. The git repository with the code for this portal, as well as all the underlying data, is available on GitHub. All payment amounts are in current United States dollars (USD). The repository of donations is being seeded with an initial collation by Issa Rice, along with continued contributions from him (see his commits and the contract work page listing all financially compensated contributions to the site), but all responsibility for errors and inaccuracies belongs to Vipul Naik.

Current data is preliminary and has not been completely vetted and normalized; if sharing a link to this site or any page on this site, please include the caveat that the data is preliminary (if you want to share without including caveats, please check with Vipul Naik). We expect to have completed the first round of development by the end of March 2022. See the about page for more details.

Also of interest: pageview data on analytics.vipulnaik.com, the tutorial in the README, and the request for feedback on the EA Forum.
|Donors list page||http://rationality.org/about/top-donors|
|Transparency and financials page||http://rationality.org/about/official-records|
|Donation case page||http://lesswrong.com/lw/n39/why_cfar_the_view_from_2015/|
|Open Philanthropy Project grant review||http://www.openphilanthropy.org/giving/grants/center-applied-rationality-general-support|
|Timelines wiki page||https://timelines.issarice.com/wiki/Timeline_of_Center_for_Applied_Rationality|
|Org Watch page||https://orgwatch.issarice.com/?organization=Center+for+Applied+Rationality|
|Key people||Julia Galef, Anna Salamon|
|Donor||Total||2020||2019||2018||2017||2016||2015||2014|
|Open Philanthropy Project||3,614,000.00||375,000.00||0.00||1,560,000.00||340,000.00||1,339,000.00||0.00||0.00|
|Berkeley Existential Risk Initiative||1,200,000.00||0.00||0.00||1,100,000.00||100,000.00||0.00||0.00||0.00|
|Effective Altruism Funds: Long-Term Future Fund||324,021.00||0.00||150,000.00||174,021.00||0.00||0.00||0.00||0.00|
|Future of Life Institute||111,757.00||0.00||0.00||0.00||0.00||0.00||111,757.00||0.00|
|Survival and Flourishing Fund||110,000.00||0.00||110,000.00||0.00||0.00||0.00||0.00||0.00|
|Blake Borgeson||90,000.00||0.00||0.00||0.00||0.00||90,000.00||0.00||0.00|
|Richard Schwall||75,000.00||0.00||0.00||0.00||0.00||75,000.00||0.00||0.00|
|Loren Merritt||40,000.00||0.00||0.00||0.00||0.00||0.00||0.00||40,000.00|
|Zvi Mowshowitz||4,000.00||0.00||0.00||0.00||4,000.00||0.00||0.00||0.00|
|Jacob Falkovich||2,000.00||0.00||0.00||0.00||0.00||2,000.00||0.00||0.00|
|Nick Brown||558.00||0.00||0.00||0.00||0.00||398.57||159.43||0.00|
|Aaron Gertler||150.00||0.00||0.00||0.00||0.00||100.00||0.00||50.00|
|Raymond Arnold||150.00||0.00||0.00||0.00||0.00||0.00||150.00||0.00|
|EA Giving Group||0.00||0.00||0.00||0.00||0.00||0.00||0.00||0.00|
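The donor totals in the first numeric column equal the sums of the per-year columns; for example, the Open Philanthropy Project total is 375,000 + 1,560,000 + 340,000 + 1,339,000 = 3,614,000. A minimal Python sketch of this aggregation follows, assuming a flat (donor, year, amount) record format; the record format and sample rows are illustrative, not the repository's actual storage format.

```python
from collections import defaultdict

# Hypothetical flat records: (donor, year, amount in current USD).
# The sample rows are drawn from donations listed further down this page.
donations = [
    ("Open Philanthropy Project", 2020, 375_000.00),
    ("Open Philanthropy Project", 2018, 1_000_000.00),
    ("Open Philanthropy Project", 2018, 560_000.00),
    ("Berkeley Existential Risk Initiative", 2018, 800_000.00),
    ("Berkeley Existential Risk Initiative", 2018, 300_000.00),
]

totals = defaultdict(float)                        # donor -> all-time total
by_year = defaultdict(lambda: defaultdict(float))  # donor -> year -> subtotal

for donor, year, amount in donations:
    totals[donor] += amount
    by_year[donor][year] += amount

# Print rows in the same layout as the table above (Total, then 2020..2014).
for donor in sorted(totals, key=totals.get, reverse=True):
    years = [f"{by_year[donor][y]:,.2f}" for y in range(2020, 2013, -1)]
    print(donor, f"{totals[donor]:,.2f}", *years)
```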
|Title (URL linked)||Publication date||Author||Publisher||Affected donors||Affected donees||Document scope||Cause area||Notes|
|CFAR: Progress Report & Future Plans (GW, IR)||2019-12-19||Adam Scholl||Center for Applied Rationality||Center for Applied Rationality||Donee periodic update||Rationality improvement||The blog post describes the progress in 2018 and 2019 of the Center for Applied Rationality (CFAR), including work on AI Risk for Computer Scientists (AIRCS) workshops conducted jointly with MIRI, European workshops, instructor training, and internal improvements. The post also describes plans for 2020, including efforts to mitigate employee burnout, a plan to publicly release the workshop handbook, and hiring and fundraising plans|
|Committee for Effective Altruism Support||2019-02-27||Open Philanthropy Project||Open Philanthropy Project||Centre for Effective Altruism Berkeley Existential Risk Initiative Center for Applied Rationality Machine Intelligence Research Institute Future of Humanity Institute||Broad donor strategy||Effective altruism|AI safety||The document announces a new approach to setting grant sizes for the largest grantees who are "in the effective altruism community", including both organizations explicitly focused on effective altruism and other organizations that are favorites of and deeply embedded in the community, including organizations working in AI safety. The committee comprises Open Philanthropy staff and trusted outside advisors who are knowledgeable about the relevant organizations. Committee members review materials submitted by the organizations; gather to discuss considerations, including room for more funding; and submit “votes” on how they would allocate a set budget between a number of grantees (they can also vote to save part of the budget for later giving). Votes of committee members are averaged to arrive at the final grant amounts (a toy sketch of this averaging appears after this table). Example grants whose sizes were determined by the committee include the two-year support to the Machine Intelligence Research Institute (MIRI) at https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support-2019 and the one-year support to the Centre for Effective Altruism (CEA) at https://www.openphilanthropy.org/giving/grants/centre-effective-altruism-general-support-2019|
|EA Giving Tuesday Donation Matching Initiative 2018 Retrospective (GW, IR)||2019-01-06||Avi Norowitz||Effective Altruism Forum||Avi Norowitz William Kiely||Against Malaria Foundation Malaria Consortium GiveWell Effective Altruism Funds: Meta Fund Effective Altruism Funds: Long-Term Future Fund Effective Altruism Funds: Animal Welfare Fund Effective Altruism Funds: Global Health and Development Fund Alliance to Feed the Earth in Disasters Effective Animal Advocacy Fund The Humane League The Good Food Institute Animal Charity Evaluators Machine Intelligence Research Institute Faunalytics Wild-Animal Suffering Research GiveDirectly Center for Applied Rationality Effective Altruism Foundation Cool Earth Schistosomiasis Control Initiative New Harvest Evidence Action Centre for Effective Altruism Animal Equality Compassion in World Farming USA Innovations for Poverty Action Global Catastrophic Risk Institute Future of Life Institute Animal Charity Evaluators Recommended Charity Fund Sightsavers The Life You Can Save One Step for Animals Helen Keller International 80,000 Hours Berkeley Existential Risk Initiative Vegan Outreach Encompass Iodine Global Network Otwarte Klatki Charity Science Mercy For Animals Coalition for Rainforest Nations Fistula Foundation Sentience Institute Better Eating International Forethought Foundation for Global Priorities Research Raising for Effective Giving Clean Air Task Force The END Fund||Miscellaneous commentary||The blog post describes an effort by a number of donors coordinated at https://2018.eagivingtuesday.org/donations to donate through Facebook right after the start of donation matching on Giving Tuesday. Based on timestamps of donations and matches, donations were matched until 14 seconds after the start of matching. Despite the very short time window of matching, the post estimates that $469,000 (65%) of the donations made were matched|
|CFAR 2017 Fundraiser||2017-12-20||Pete Michaud||Center for Applied Rationality||Center for Applied Rationality||Donee donation case||Rationality improvement||CFAR gives an update on its plans and hopes for 2018, and asks supporters to donate. CFAR hopes to drive much greater impact by focusing on its summer programs -- the AI Summer Fellows Program (AISFP) and Workshops for AI Safety Strategy (WAISS), as well as more experimental x-risk workshops. The main uncertain variable in its plan is whether and when it will start being able to use its own dedicated venue to conduct events|
|CFAR 2017 Impact Report||2017-12-20||Dan Keys||Center for Applied Rationality||Center for Applied Rationality||Donee periodic update||Rationality improvement||The document is an impact report for CFAR in 2017. It breaks the impact question into two parts: what impact does CFAR have on people, and what impact do these people have on the world relative to the counterfactual, particularly in the domain of AI risk?|
|2017 AI Safety Literature Review and Charity Comparison (GW, IR)||2017-12-20||Ben Hoskin||Effective Altruism Forum||Ben Hoskin||Machine Intelligence Research Institute Future of Humanity Institute Global Catastrophic Risk Institute Centre for the Study of Existential Risk AI Impacts Center for Human-Compatible AI Center for Applied Rationality Future of Life Institute 80,000 Hours||Review of current state of cause area||AI safety||The lengthy blog post covers all the published work of prominent organizations focused on AI risk. It is an annual refresh of https://forum.effectivealtruism.org/posts/nSot23sAjoZRgaEwa/2016-ai-risk-literature-review-and-charity-comparison (GW, IR) -- a similar post published a year before it. The conclusion: "Significant donations to the Machine Intelligence Research Institute and the Global Catastrophic Risks Institute. A much smaller one to AI Impacts."|
|Alright so I just finished the CFAR workshop||2017-05-08||Center for Applied Rationality||Evaluator review of donee||Rationality improvement||Review of CFAR workshop by somebody on Tumblr|
|Bay Area II: CFAR Workshop||2017-04-30||John Aslanides||Center for Applied Rationality||Evaluator review of donee||Rationality improvement||Write-up about experience at the Summer 2016 Workshop for AI Researchers run by CFAR|
|CFAR Workshop February 2017 Review||2017-02-28||Owen Shen||Center for Applied Rationality||Evaluator review of donee||Rationality improvement||Review of a CFAR workshop by somebody (unaffiliated with CFAR) who recently mentored at the workshop. Also mirrored at http://lesswrong.com/lw/ool/cfar_workshop_review_february_2017/ on LessWrong|
|Post-CFAR Workshop Compilation||2017-02-18||Owen Shen||Center for Applied Rationality||Evaluator review of donee||Rationality improvement||Review of the post-CFAR experience by somebody who recently mentored at a CFAR workshop. Five parts, all in one PDF; also accessible separately on the web starting with https://mindlevelup.wordpress.com/2017/02/18/ontologies-and-operating-systems-post-cfar-1/|
|2016 AI Risk Literature Review and Charity Comparison (GW, IR)||2016-12-13||Ben Hoskin||Effective Altruism Forum||Ben Hoskin||Machine Intelligence Research Institute Future of Humanity Institute OpenAI Center for Human-Compatible AI Future of Life Institute Centre for the Study of Existential Risk Leverhulme Centre for the Future of Intelligence Global Catastrophic Risk Institute Global Priorities Project AI Impacts Xrisks Institute X-Risks Net Center for Applied Rationality 80,000 Hours Raising for Effective Giving||Review of current state of cause area||AI safety||The lengthy blog post covers all the published work of prominent organizations focused on AI risk. References https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/machine-intelligence-research-institute-general-support#sources1007 for the MIRI part of it but notes the absence of information on the many other orgs. The conclusion: "Donate to both the Machine Intelligence Research Institute and the Future of Humanity Institute, but somewhat biased towards the former. I will also make a smaller donation to the Global Catastrophic Risks Institute."|
|Staff members’ personal donations for giving season 2016||2016-12-09||Natalie Crispin||GiveWell||Elie Hassenfeld Holden Karnofsky Natalie Crispin Alexander Berger Timothy Telleen-Lawton Josh Rosenberg Rebecca Raible Helen Toner Sophie Monahan Laura Muñoz Catherine Hollander Andrew Martin Lewis Bollard Chelsea Tabart Sarah Ward Chris Somerville Ajeya Cotra Chris Smith Isabel Arjmand||A political campaign GiveWell top charities International Genetically Engineered Machine Foundation UPMC Center for Health Security Donor lottery EA Giving Group GiveDirectly Center for Applied Rationality Malaria Consortium Animal Charity Evaluators Northwest Health Law Advocates StrongMinds Against Malaria Foundation Schistosomiasis Control Initiative The Humane Society of the United States The Humane League Mercy For Animals Humane Society International Compassion in World Farming USA The Good Food Institute Citizens for Farm Animal Protection END Fund Causa Justa Planned Parenthood International Refugee Assistance Project||Periodic donation list documentation||GiveWell and Open Philanthropy Project staff describe their annual donation plans for 2016. Some of these are tentative and get superseded by further events. Also, not all employees are present in the document (participation is optional). Amounts donated are not included, per a decision by GiveWell|
|EAs write about where they give||2016-12-09||Julia Wise||Effective Altruism Forum||Blake Borgeson Eva Vivalt Ben Kuhn Alexander Gordon-Brown and Denise Melchin Elizabeth Van Nostrand||Machine Intelligence Research Institute Center for Applied Rationality AidGrade Charity Science: Health 80,000 Hours Centre for Effective Altruism Tostan||Periodic donation list documentation||Global health and development, AI risk||Julia Wise got submissions from multiple donors about their donation plans and put them together in a single post. The goal was to cover people outside of organizations that publish such posts for their employees|
|Considerations against pledging donations for the rest of your life||2016-12-07||Andrew Critch||Center for Applied Rationality||Miscellaneous commentary||In this post about earning to give, Andrew Critch argues against pledging to donate in perpetuity, even though he encourages people to donate during years when it makes sense. He writes: "If you either (1) make less than $100k/year, or (2) might be willing to make less than that at some future time in order to work directly on something the world needs you to do (besides giving), I would not be surprised to find myself recommending against you pledging to always donate 10% of your income every year." He talks about his own time spent on setting up the Center for Applied Rationality, and feels that donating money instead of spending resources creating CFAR would have been worse. He writes later: "Expecting variance + respecting your judgement = not pledging"|
|An EA at a CFAR Rationality Workshop: Thoughts and Review (GW, IR)||2016-06-06||Gleb Tsipursky||Center for Applied Rationality||Evaluator review of donee||Rationality improvement||Review of CFAR workshop attended. See http://lesswrong.com/lw/nog/review_and_thoughts_on_current_version_of_cfar/ for a cross-post to LessWrong|
|Field notes from the CFAR workshop — Part 1 — Personal mythology and the spinning plates||2016-05-26||Tetiana Ivanova||Center for Applied Rationality||Evaluator review of donee||Rationality improvement||Part 1 of review of CFAR workshop and of personal takeaway and impact. See https://medium.com/@tanyaivanova/field-notes-from-the-cfar-workshop-part-2-fifty-shades-of-focus-319429d73aa1 for part 2|
|Field notes from the CFAR workshop — Part 2 — fifty shades of focus||2016-05-26||Tetiana Ivanova||Center for Applied Rationality||Evaluator review of donee||Rationality improvement||Part 2 of review of CFAR workshop and of personal takeaway and impact. See https://medium.com/@tanyaivanova/field-notes-from-the-cfar-workshop-part-1-personal-mythology-and-the-spinning-plates-489002354cf7 for part 1|
|CFAR 3 year retrospective||2016-01-28||Malcolm Ocean||Center for Applied Rationality||Evaluator review of donee||Rationality improvement||Three-year retrospective from CFAR workshop attendee, who has applied techniques learned extensively and attended Alumni Workshops|
|Where should you donate to have the most impact during giving season 2015?||2015-12-24||Robert Wiblin||80,000 Hours||Against Malaria Foundation Giving What We Can GiveWell AidGrade Effective Altruism Outreach Animal Charity Evaluators Machine Intelligence Research Institute Raising for Effective Giving Center for Applied Rationality Johns Hopkins Center for Health Security Ploughshares Fund Future of Humanity Institute Future of Life Institute Centre for the Study of Existential Risk Charity Science Deworm the World Initiative Schistosomiasis Control Initiative GiveDirectly||Evaluator consolidated recommendation list||Global health and development, Effective altruism/movement growth, Rationality improvement, Biosecurity and pandemic preparedness, AI risk, Global catastrophic risks||Robert Wiblin draws on GiveWell recommendations, Animal Charity Evaluators recommendations, Open Philanthropy Project writeups, staff donation writeups and suggestions, as well as other sources (including personal knowledge and intuitions) to come up with a list of places to donate|
|Why CFAR? The view from 2015 (GW, IR)||2015-12-23||Pete Michaud||Center for Applied Rationality||Center for Applied Rationality||Donee donation case||Rationality improvement||Reports 2015 as a year of consolidation and streamlining. Fewer workshops, more efforts to streamline both workshops and continued research on improving rationality. Consolidated original three goals (Competence, Epistemic Rationality, and Do-Gooding) into single framework of applying epistemic rationality. Emphasis on concept of double crux|
|Peter McCluskey's favorite charities||2015-12-06||Peter McCluskey||Peter McCluskey||Center for Applied Rationality Future of Humanity Institute AI Impacts GiveWell GiveWell top charities Future of Life Institute Centre for Effective Altruism Brain Preservation Foundation Multidisciplinary Association for Psychedelic Studies Electronic Frontier Foundation Methuselah Mouse Prize SENS Research Foundation Foresight Institute||Evaluator consolidated recommendation list||The page discusses the favorite charities of Peter McCluskey and his opinion on their current room for more funding in light of their financial situation and expansion plans|
|Unquantified Self||2015-12-03||Elizabeth Van Nostrand||Center for Applied Rationality||Evaluator review of donee||Rationality improvement||Review of experience at CFAR workshop and personal takeaways|
|The Boston CFAR alumni workshop||2015-07-08||Alex Altair||Center for Applied Rationality||Evaluator review of donee||Rationality improvement||Review of CFAR alumni workshop|
|Hamming questions and bottlenecks||2015-05-17||Victoria Krakovna||Center for Applied Rationality||Evaluator review of donee||Rationality improvement||Discussion of a CFAR workshop and the Hamming question (about the most important things one is working on)|
|CFAR in 2014: Continuing to climb out of the startup pit, heading toward a full prototype (GW, IR)||2014-12-26||Anna Salamon||Center for Applied Rationality||Center for Applied Rationality||Donee donation case||Rationality improvement||CFAR reports better finances than 2013 (more stability), improvements to its curriculum, attempts to do a full prototype. There is a long discussion of how it redefined the way epistemic rationality fit into the curriculum (moving away from a separate low-rating module to something that is incorporated throughout the course through practical application). Comments are supportive, and include the first appearance of Gleb Tsipursky advertising Intentional Insights|
|CFAR workshop review||2014-12-08||Topher Brennan||Center for Applied Rationality||Evaluator review of donee||Rationality improvement||Delayed review of CFAR workshop experience, mostly linking to and endorsing reviews http://kajsotala.fi/2014/11/event-report-cfars-rationality-workshop-england/ by Kaj Sotala and http://jesswhittlestone.com/blog/2014/11/25/becoming-more-rational-what-i-got-out-of-the-cfar-workshop by Jess Whittlestone|
|Becoming More Rational: What I got out of attending a CFAR workshop||2014-11-27||Jess Whittlestone||Center for Applied Rationality||Evaluator review of donee||Rationality improvement||Review of key lessons learned from a recently attended CFAR workshop|
|Event report: CFAR’s rationality workshop, England||2014-11-26||Kaj Sotala||Center for Applied Rationality||Evaluator review of donee||Rationality improvement||Review of key lessons learned from a recently attended CFAR workshop|
|Review: CFAR Workshop||2014-05-09||Aaron Gertler||Center for Applied Rationality||Evaluator review of donee||Rationality improvement||Review of CFAR workshop attended|
|Why CFAR? (GW, IR)||2013-12-28||Anna Salamon||Center for Applied Rationality||Center for Applied Rationality||Donee donation case||Rationality improvement||Salamon, the head of CFAR, explains the long-term goal (Competence, Epistemic Rationality, and Do-Gooding), plan, and progress to date. This is the first year for CFAR as a standalone organization, after its spinoff from MIRI. The funding situation is dire, with an emergency loan currently taken out that is due in three months. Comments include a $40K donation and a number of other donations|
|CFAR - Second Impression and a Bleg||2013-12-26||Benjamin Hoffman||Center for Applied Rationality||Evaluator review of donee||Rationality improvement||Review of CFAR along with a suggestion that people donate to it for its fundraiser. Also links to the official CFAR funding ask https://www.lesswrong.com/posts/JjGs6mDZxeCWkg3ii/why-cfar (GW, IR) (published after the original version of this post)|
|A First Impression Review of the CFAR NY Workshop||2013-11-04||Benjamin Hoffman||Center for Applied Rationality||Evaluator review of donee||Rationality improvement||Initial impressions of CFAR workshop. See http://benjaminrosshoffman.com/cfar-second-thoughts-and-a-bleg/ for updated thoughts|
|CFAR workshop review||2013-07-01||Ben Kuhn||Center for Applied Rationality||Evaluator review of donee||Rationality improvement||Review of workshop attended; credence in workshop being actually valuable rose significantly|
|The Centre for Applied Rationality: a year later from a (somewhat) outside perspective (GW, IR)||2013-05-27||Miranda Dixon-Luinenberg||Center for Applied Rationality||Evaluator review of donee||Rationality improvement||Review of CFAR workshop attended, one year later|
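The "Committee for Effective Altruism Support" row above describes grant sizes being set by averaging committee members' proposed budget allocations. A toy sketch of that averaging follows; the votes and organization names are hypothetical, since Open Philanthropy has not published its vote data or a concrete format.

```python
# Hypothetical committee votes: each member allocates a fixed budget across
# grantees, with an optional "save for later" bucket.
budget = 1_000_000.00
votes = [
    {"Org A": 600_000, "Org B": 300_000, "save for later": 100_000},
    {"Org A": 500_000, "Org B": 500_000, "save for later": 0},
    {"Org A": 700_000, "Org B": 200_000, "save for later": 100_000},
]

# Each member's allocation should exhaust the budget.
assert all(sum(vote.values()) == budget for vote in votes)

# Final grant amounts are the per-grantee averages across members' votes.
grantees = {g for vote in votes for g in vote}
final = {g: sum(vote.get(g, 0) for vote in votes) / len(votes) for g in grantees}

for grantee, amount in sorted(final.items()):
    print(f"{grantee}: ${amount:,.2f}")
# Org A: $600,000.00; Org B: $333,333.33; save for later: $66,666.67
```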
|Donor||Amount (current USD)||Amount rank (out of 25)||Cause area||URL||Influencer||Notes|
|Open Philanthropy Project||375,000.00||5||Rationality improvement||https://www.openphilanthropy.org/giving/grants/center-applied-rationality-general-support-2020||Claire Zabel||Intended use of funds (category): Organizational general support
Intended use of funds: The grant page says: "CFAR is an adult education nonprofit that seeks to find and develop cognitive tools and to deliver these tools to promising individuals, groups, organizations, and networks focused on solving large and pressing problems. [...] They introduce people to and/or strengthen their connections with the effective altruism (EA) community and way of thinking, which we hope results in people with outstanding potential pursuing more impactful career trajectories. CFAR is particularly interested in growing the talent pipeline for work on potential risks from advanced artificial intelligence (AI)."
Donor reason for selecting the donee: The grant page says: "Our primary interest in these workshops is that we believe they introduce people to and/or strengthen their connections with the effective altruism (EA) community and way of thinking, which we hope results in people with outstanding potential pursuing more impactful career trajectories." Also: "CFAR is particularly interested in growing the talent pipeline for work on potential risks from advanced artificial intelligence (AI). More on our interest in supporting work [...]"
Donor reason for donating that amount (rather than a bigger or smaller amount): Amount chosen to provide one year of operating support
Donor reason for donating at this time (rather than earlier or later): Timing determined by the end of the funding timeframe of the previous two-year grant https://www.openphilanthropy.org/giving/grants/center-applied-rationality-general-support-2018 made in January 2018
Intended funding timeframe in months: 12
Donor thoughts on making further donations to the donee: This is an exit grant, so Open Phil does not plan to make further grants to CFAR. Announced: 2020-04-20.
|Survival and Flourishing Fund||110,000.00||12||Rationality improvement||http://survivalandflourishing.org/||Alex Flint Andrew Critch Eric Rogstad||Donation process: Part of the founding batch of grants for the Survival and Flourishing Fund made in August 2019. The fund is partly a successor to part of the grants program of the Berkeley Existential Risk Initiative (BERI) that handled grantmaking by Jaan Tallinn (see http://existence.org/tallinn-grants-future/). As such, this grant to CFAR may represent a followup to past grants by BERI to CFAR
Intended use of funds (category): Organizational general support
Donor reason for selecting the donee: The grant may represent a followup to past grants by BERI to CFAR
Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round; the Survival and Flourishing Fund is making its first round of grants in August 2019
Percentage of total donor spend in the corresponding batch of donations: 100.00%; announced: 2019-08-29.
|Effective Altruism Funds: Long-Term Future Fund||150,000.00||10||Rationality improvement||https://app.effectivealtruism.org/funds/far-future/payouts/6vDsjtUyDdvBa3sNeoNVvl||Oliver Habryka Alex Zhu Matt Wage Helen Toner Matt Fallshaw||Donation process: Donee submitted grant application through the application form for the April 2019 round of grants from the Long-Term Future Fund, and was selected as a grant recipient (23 out of almost 100 applications were accepted)
Intended use of funds (category): Organizational general support
Intended use of funds: The grant is to help the Center for Applied Rationality (CFAR) survive as an organization for the next few months (i.e., till the next grant round, which is 3 months later) without having to scale down operations. CFAR is low on finances because they did not run a 2018 fundraiser, feeling that running a fundraiser would be in bad taste after what they considered a messup on their part in the Brent Dill situation
Donor reason for selecting the donee: Grant investigator and main influencer Oliver Habryka thinks CFAR intro workshops have had positive impact in 3 ways: (1) establishing epistemic norms, (2) training, and (3) recruitment into the X-risk network (especially AI safety). He also thinks CFAR faces many challenges, including the departure of many key employees, the difficulty of attracting top talent, and a dilution of its truth-seeking focus. However, he is enthusiastic about joint CFAR/MIRI workshops for programmers, where CFAR provides instructors. His final reason for donating is to avoid CFAR having to scale down due to its funding shortfall because it didn't run the 2018 fundraiser
Donor reason for donating that amount (rather than a bigger or smaller amount): The grant amount, which is the largest in this grant round from the EA Long-Term Future Fund, is chosen to be sufficient for CFAR to continue operating as usual till the next grant round from the EA Long-Term Future Fund (in about 3 months). Habryka further elaborates in https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-recommendations#uhH4ioNbdaFrwGt4e (GW, IR) in reply to Milan Griffes, explaining why the grant is large and unrestricted
Percentage of total donor spend in the corresponding batch of donations: 16.25%
Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of grant round, as well as by CFAR's time-sensitive financial situation; the grant round is a few months after the end of 2018, so the shortfall of funds raised because of not conducting the 2018 fundraiser is starting to hit on the finances
Intended funding timeframe in months: 3
Donor thoughts on making further donations to the donee: Grant investigator and main influencer Oliver Habryka writes: "I didn’t have enough time this grant round to understand how the future of CFAR will play out; the current grant amount seems sufficient to ensure that CFAR does not have to take any drastic action until our next grant round. By the next grant round, I plan to have spent more time learning and thinking about CFAR’s trajectory and future, and to have a more confident opinion about what the correct funding level for CFAR is."
Other notes: The grant reasoning is written up by Oliver Habryka and is available at https://forum.effectivealtruism.org/posts/CJJDwgyqT4gXktq6g/long-term-future-fund-april-2019-grant-decisions (GW, IR). In the comments, Milan Griffes asks why such a large, unrestricted grant is being made to CFAR despite these concerns, and also what Habryka hopes to learn about CFAR before the next grant round. There are replies from Peter McCluskey and Habryka, with some further comment back-and-forth.
|Effective Altruism Funds: Long-Term Future Fund||174,021.00||9||Rationality improvement||https://app.effectivealtruism.org/funds/far-future/payouts/6g4f7iae5Ok6K6YOaAiyK0||Nick Beckstead||Donation process: The grant from the EA Long-Term Future Fund is part of a final set of grant decisions being made by Nick Beckstead (granting $526,000 from the EA Meta Fund and $917,000 from the EA Long-Term Future Fund) as he transitions out of managing both funds. Due to time constraints, Beckstead primarily relies on investigation of the organization done by the Open Philanthropy Project when making its 2018 grant https://www.openphilanthropy.org/giving/grants/center-applied-rationality-general-support-2018
Intended use of funds (category): Organizational general support
Intended use of funds: Beckstead writes "I recommended these grants with the suggestion that these grantees look for ways to use funding to trade money for saving the time or increasing the productivity of their employees (e.g. subsidizing electronics upgrades or childcare), due to a sense that (i) their work is otherwise much less funding constrained than it used to be, and (ii) spending like this would better reflect the value of staff time and increase staff satisfaction. However, I also told them that I was open to them using these funds to accomplish this objective indirectly (e.g. through salary increases) or using the funds for another purpose if that seemed better to them."
Donor reason for selecting the donee: The grant page references https://www.openphilanthropy.org/giving/grants/center-applied-rationality-general-support-2018 for Beckstead's opinion of the donee.
Donor reason for donating that amount (rather than a bigger or smaller amount): The grant page says "The amounts I’m granting out to different organizations are roughly proportional to the number of staff they have, with some skew towards MIRI that reflects greater EA Funds donor interest in the Long-Term Future Fund." Also: "I think a number of these organizations could qualify for the criteria of either the Long-Term Future Fund or the EA Community Fund because of their dual focus on EA and longtermism, which is part of the reason that 80,000 Hours is receiving a grant from each fund."
Percentage of total donor spend in the corresponding batch of donations: 18.98%
Donor reason for donating at this time (rather than earlier or later): Timing determined by the timing of this round of grants, which is in turn determined by the need for Beckstead to grant out the money before handing over management of the fund
Donor retrospective of the donation: Even after fund management moved to a new team, the EA Long-Term Future Fund would continue making grants to CFAR.
|Berkeley Existential Risk Initiative||300,000.00||8||AI safety||https://web.archive.org/web/20180921215949/http://existence.org/organization-grants/||--||General support. Announced at http://existence.org/2018/09/18/activity-update-july-and-august-2018.html and at http://www.rationality.org/resources/updates/2018/august-newsletter (CFAR lists $600,000 over two years as the amount).|
|Berkeley Existential Risk Initiative||800,000.00||3||AI safety||https://web.archive.org/web/20180731180958/http://existence.org:80/grants https://web.archive.org/web/20180921215949/http://existence.org/organization-grants/||--||General support for the purpose of a permanent venue for CFAR workshops. See announcement at http://existence.org/2018/02/08/activity-update-january-2018.html.|
|Open Philanthropy Project||560,000.00||4||Rationality improvement||https://www.openphilanthropy.org/giving/grants/center-applied-rationality-sparc-2018||Nick Beckstead Nicole Ross||Intended use of funds (category): Direct project expenses
Intended use of funds: Grant "to support the Summer Program on Applied Rationality and Cognition (SPARC). SPARC is a two-week summer program for top high school students to further develop skills related to applied reasoning, with a broad-ranging curriculum."
Donor reason for selecting the donee: The grant page says: "We expect that this program will expand the horizons of some students with extremely high potential and, hopefully, increase their positive impact on the world. We are especially interested in the possibility that participation in SPARC leads to greater awareness of effective altruism and issues important to the effective altruism community."
Donor reason for donating at this time (rather than earlier or later): Open Phil had previously funded SPARC for 2016 and 2017 with the grant https://www.openphilanthropy.org/giving/grants/center-applied-rationality-sparc; this grant continues the funding to 2018 (and possibly to later years). Announced: 2018-02.
|Open Philanthropy Project||1,000,000.00||2||Rationality improvement||https://www.openphilanthropy.org/giving/grants/center-applied-rationality-general-support-2018||Nick Beckstead||Intended use of funds (category): Organizational general support
Intended use of funds: The grant page says: "CFAR is an adult education nonprofit that seeks to find and develop cognitive tools and to deliver these tools to promising individuals, groups, organizations, and networks focused on solving large and pressing problems."
Donor reason for selecting the donee: The grant page says: "Our primary interest in [CFAR] workshops is that we believe they introduce people to and/or strengthen their connections with the effective altruism (EA) community and way of thinking, which we hope results in people with outstanding potential pursuing more impactful career trajectories. CFAR is particularly interested in growing the talent pipeline for work on potential risks from advanced artificial intelligence (AI). More on our interest in supporting work along these lines is here."
Donor thoughts on making further donations to the donee: The grant page says: "CFAR’s performance on this grant will be judged primarily in terms of whether it provides adequate evidence of its programs resulting in improved career trajectories of the sort described above."
Donor retrospective of the donation: The followup grant https://www.openphilanthropy.org/giving/grants/center-applied-rationality-general-support-2020 would be an exit grant, suggesting that Open Phil would revise downward its assessment of the value of continued support for CFAR, while still valuing CFAR enough to help it exit smoothly.
Intended funding timeframe in months: 1; announced: 2018-02-28.
|Berkeley Existential Risk Initiative||100,000.00||13||AI safety||https://web.archive.org/web/20180731180958/http://existence.org:80/grants https://web.archive.org/web/20180921215949/http://existence.org/organization-grants/||--||See announcement at http://existence.org/2018/01/11/activity-update-december-2017.html.|
|Zvi Mowshowitz||4,000.00||17||Rationality improvement||https://thezvi.wordpress.com/2017/12/25/the-story-cfar/||--||The document explains the motivation to donate to CFAR in addition to the $10,000 donation to MIRI documented at https://thezvi.wordpress.com/2017/12/17/i-vouch-for-miri/ The reasons for donating to CFAR include: endorsement of rationality improvement, endorsement of CFAR as an organization and its approach to the problem, direct personal benefits, and an especially important time for CFAR because they need enough funding for a venue.|
|Open Philanthropy Project||340,000.00||6||Rationality improvement||https://www.openphilanthropy.org/giving/grants/center-applied-rationality-european-summer-program-rationality||Nick Beckstead||Intended use of funds (category): Direct project expenses
Intended use of funds: Grant to support the "European Summer Program on Rationality (ESPR), a two-week summer workshop for about 40 mathematically gifted students aged 16-19."
Donor reason for selecting the donee: The grant page says: "We are excited about this grant because we expect that ESPR will orient participants to problems that we believe to be high impact, and may lead them to increase their positive impact on the world."
Donor reason for donating at this time (rather than earlier or later): Donation made in time to fund the event for 2017. Announced: 2017-09-27.
|Open Philanthropy Project||1,035,000.00||1||Rationality improvement/effective altruism/movement growth||https://www.openphilanthropy.org/giving/grants/center-applied-rationality-general-support||Nick Beckstead||Donation process: According to https://www.openphilanthropy.org/giving/grants/center-applied-rationality-general-support#Our_process "While investigating this grant, we had several conversations with Anna Salamon, as well as with various other contacts of ours in the EA community. Nick Beckstead was the primary investigator for this grant."
Intended use of funds (category): Organizational general support
Intended use of funds: The grant page says: "$915,000 of this grant will support CFAR workshops and organizational improvements. $120,000 of this grant will fund a pilot version of EuroSPARC, an eight-day summer program in Europe run by CFAR for mathematically gifted high school students, modeled on the San Francisco-based Summer Program in Applied Rationality and Cognition (SPARC), which CFAR has helped run for the past three years."
Donor reason for selecting the donee: Stated reasons for the grant include value-alignment, success attracting and cultivating talented people to work on effective altruist causes, and funding being a substantial constraint at present
Donor reason for donating that amount (rather than a bigger or smaller amount): Amount tied to a budget proposed by CFAR, described at https://www.openphilanthropy.org/giving/grants/center-applied-rationality-general-support#Budget_and_room_for_more_funding. First year: $360,000 for organizational improvements, $100,000 for scholarships for CFAR workshops, $120,000 for EuroSPARC 2016, and $47,500 for half the salary and benefits of a new staff member splitting time between CFAR operations and SPARC. Second year: $360,000 for organizational improvements and $47,500 for half the salary and benefits of the new staff member. (These line items sum to $627,500 and $407,500 respectively, matching the $1,035,000 grant total; see the arithmetic check at the end of this page.)
Donor thoughts on making further donations to the donee: https://www.openphilanthropy.org/giving/grants/center-applied-rationality-general-support#Key_questions_for_follow-up lists key follow-up questions
Donor retrospective of the donation: The grant page for a followup January 2018 grant https://www.openphilanthropy.org/giving/grants/center-applied-rationality-general-support-2018 would say: "Since our 2016 funding recommendation, CFAR has largely met its milestones for organizational improvement." This statement, along with the fact that the followup grant would be of comparable size ($1,000,000), suggests that Open Phil was satisfied with the results of the grant.
Intended funding timeframe in months: 1; announced: 2016-09-06.
|Open Philanthropy Project||304,000.00||7||Rationality improvement/effective altruism/movement growth||https://www.openphilanthropy.org/giving/grants/center-applied-rationality-sparc||Nick Beckstead||Donation process: According to https://www.openphilanthropy.org/giving/grants/center-applied-rationality-sparc#Our_process "Nick Beckstead, our Program Officer for Scientific Research, spoke with members of SPARC’s staff regarding its program, finances, and future plans."
Intended use of funds (category): Direct project expenses
Intended use of funds: Grant of $137,000 in 2016 and $167,000 in 2017 to support the Summer Program on Applied Rationality and Cognition (SPARC). "SPARC is a two-week summer program for high school students. Students selected to participate in the program typically show exceptional ability in mathematics, with many scoring highly among US participants in national or international math competitions."
Donor reason for selecting the donee: According to https://www.openphilanthropy.org/giving/grants/center-applied-rationality-sparc#Case_for_the_grant "we believe the program is strong, with the potential to have a substantial impact. [...] SPARC attracts unusually talented students. [...] we think very highly of several of the instructors who work at SPARC, some of whom also show strong interest in effective altruism."
Donor reason for donating that amount (rather than a bigger or smaller amount): According to https://www.openphilanthropy.org/giving/grants/center-applied-rationality-sparc#Budget_and_room_for_more_funding "SPARC’s total budget was approximately $90,000 in 2015. This grant will allow it to cover alumni events, travel reimbursement, unexpected contingencies, and some of the expenses associated with hiring a full-time logistics manager, as well as half of the salary and benefits for the new logistics manager, with the other half paid out of CFAR’s general budget. Our understanding is that the two years of support provided by this grant will be sufficient to enable SPARC to hire the new logistics manager and that a third year of support would not materially affect SPARC’s planning."
Donor reason for donating at this time (rather than earlier or later): Grant made shortly before SPARC 2016, and timing likely chosen so that the grant could be used for SPARC 2016
Intended funding timeframe in months: 24
Donor thoughts on making further donations to the donee: The grant page section https://www.openphilanthropy.org/giving/grants/center-applied-rationality-sparc#Key_questions_for_follow-up lists follow-up questions that Open Phil is interested in understanding better for the future. Announced: 2016-07-07.
|Aaron Gertler||100.00||22||Rationality improvement||https://aarongertler.net/donations-all-years/||--||This was 10% of the donor's donations that day; the other 90% went to GiveWell. Percentage of total donor spend in the corresponding batch of donations: 100.00%.|
|Blake Borgeson||90,000.00||14||--||http://effective-altruism.com/ea/14u/eas_write_about_where_they_give/||--||Borgeson gave percentages rather than numerical amounts; the amount here is estimated by combining those percentages with the $300,000 donation to MIRI reported at https://intelligence.org/2016/08/05/miri-strategy-update-2016/. Percentage of total donor spend in the corresponding batch of donations: 100.00%.|
|EA Giving Group||--||--||Rationality improvement||https://docs.google.com/spreadsheets/d/1H2hF3SaO0_QViYq2j1E7mwoz3sDjdf9pdtuBcrq7pRU/edit||Nick Beckstead||Actual date range: December 2015 to February 2016. Exact date, amount, or fraction not known, but it is the donee with the fifth highest amount donated out of six donees in this period.|
|Future of Life Institute||111,757.00||11||AI safety||https://futureoflife.org/AI/2015awardees#Salamon||--||A conference and education grant. Project title: Anna Salamon.|
|Aaron Gertler||50.00||23||Rationality improvement||https://aarongertler.net/donations-all-years/||--||Annual matching fundraiser.|
|Loren Merritt||40,000.00||16||Rationality improvement||http://lesswrong.com/lw/jej/why_cfar/ab0o||--||Donation is announced in response to the post http://lesswrong.com/lw/jej/why_cfar/ by Anna Salamon of CFAR, making the case for donating to it.|
|EA Giving Group||--||--||Rationality improvement||https://docs.google.com/spreadsheets/d/1H2hF3SaO0_QViYq2j1E7mwoz3sDjdf9pdtuBcrq7pRU/edit||Nick Beckstead||Actual date range: December 2013 to December 2014. Exact date, amount, or fraction not known, but it is the donee with the second highest amount donated out of five donees in this period.|
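Finally, the arithmetic check promised in the $1,035,000 Open Philanthropy row above: the budget line items quoted from the grant page sum exactly to the grant amount. The amounts are from the grant page; the labels are shortened here for readability.

```python
# Budget from the September 2016 Open Philanthropy general-support grant page.
first_year = {
    "organizational improvements": 360_000,
    "scholarships for CFAR workshops": 100_000,
    "EuroSPARC 2016 pilot": 120_000,
    "half salary and benefits of a new staff member": 47_500,
}
second_year = {
    "organizational improvements": 360_000,
    "half salary and benefits of a new staff member": 47_500,
}

assert sum(first_year.values()) == 627_500
assert sum(second_year.values()) == 407_500
assert sum(first_year.values()) + sum(second_year.values()) == 1_035_000
print("Line items sum to the $1,035,000 grant amount.")
```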