This is an online portal with information on donations of interest to Vipul Naik that were announced publicly or shared with permission. The git repository with the code for this portal, as well as all the underlying data, is available on GitHub. All payment amounts are in current United States dollars (USD).

The repository of donations is being seeded with an initial collation by Issa Rice, along with continued contributions from him (see his commits and the contract work page listing all financially compensated contributions to the site), but all responsibility for errors and inaccuracies belongs to Vipul Naik.

Current data is preliminary and has not been completely vetted and normalized. If sharing a link to this site or any page on it, please include the caveat that the data is preliminary (if you want to share without including caveats, please check with Vipul Naik). We expect to complete the first round of development by the end of December 2019. See the about page for more details.

Also of interest: pageview data on analytics.vipulnaik.com, the tutorial in the README, and the request for feedback on the EA Forum.
We do not have any donor information for Oliver Habryka in our system.
|Cause area||Count||Median||Mean||Minimum||10th percentile||20th percentile||30th percentile||40th percentile||50th percentile||60th percentile||70th percentile||80th percentile||90th percentile||Maximum|
If you hover over a cell for a given cause area and year, you will get a tooltip with the number of donees and the number of donations.
Note: Cause area classification used here may not match that used by donor for all cases.
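The summary table above reports, per cause area, the count, mean, median, minimum, maximum, and deciles of donation amounts. A minimal sketch of how such statistics could be computed from a plain list of donation amounts (the function name and data are illustrative assumptions, not the portal's actual code):

```python
# Hypothetical sketch: computing the per-cause-area summary statistics
# shown in the table above from a list of donation amounts in USD.
import statistics

def donation_summary(amounts):
    """Return count, median, mean, min, max, and the 10th-90th
    percentiles for a list of donation amounts, mirroring the
    table's columns."""
    ordered = sorted(amounts)
    # quantiles(n=10) yields 9 cut points: the 10th through 90th percentiles.
    deciles = statistics.quantiles(ordered, n=10, method="inclusive")
    return {
        "count": len(ordered),
        "median": statistics.median(ordered),
        "mean": statistics.mean(ordered),
        "minimum": ordered[0],
        **{f"{10 * (i + 1)}th percentile": q for i, q in enumerate(deciles)},
        "maximum": ordered[-1],
    }
```

Note that the 50th percentile entry coincides with the median, as in the table's column layout.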
|Cause area||Number of donations||Number of donees||Total|
Skipping spending graph as there is less than one year's worth of donations.
Sorry, we couldn't find any subcause area information.
Skipping spending graph as there is less than one year's worth of donations.
Sorry, we couldn't find any influencer information.
Sorry, we couldn't find any disclosures information.
Sorry, we couldn't find any country information.
|Title (URL linked)||Publication date||Author||Publisher||Affected donors||Affected donees||Document scope||Cause area||Notes|
|Long Term Future Fund and EA Meta Fund applications open until June 28th||2019-06-10||Oliver Habryka||Effective Altruism Forum||Effective Altruism Funds: Meta Fund, Effective Altruism Funds: Long-Term Future Fund||Request for proposals||AI safety|Global catastrophic risks|Effective altruism||The blog post announces that two of the funds under Effective Altruism Funds, namely the Long-Term Future Fund and the EA Meta Fund, are open for rolling applications. The application window for the current round ends on June 28. Response times will be 3-4 months (i.e., after the end of the corresponding application cycle). In rare cases, grants may be made out-of-cycle. Grant amounts must be at least $10,000, and will generally be under $100,000. The blog post gives guidelines on the kinds of applications that each fund will accept.|
|Thoughts on the EA Hotel||2019-04-25||Oliver Habryka||Effective Altruism Forum||Effective Altruism Funds: Long-Term Future Fund||EA Hotel||Evaluator review of donee||Effective altruism/housing||With permission from Greg Colbourn of the EA Hotel, Habryka publicly posts the feedback he sent to the EA Hotel, which was rejected in the April 2019 funding round of the Long Term Future Fund. Habryka first lists three reasons he is excited about the Hotel: (a) providing a safety net, (b) acting on historical interest, (c) building high-dedication cultures. He then articulates three concrete models of concern: (1) initial overeagerness to publicize the EA Hotel (a point he now believes is mostly false, based on Greg Colbourn's response), (2) a significant chance of the EA Hotel culture becoming actively harmful for residents, (3) no good candidate to take charge of the long-term logistics of running the hotel. Habryka concludes by saying he thinks all his concerns can be overcome. At the moment, he thinks the hotel should be funded for the next year, but he is unsure of whether it should be given money to buy the hotel next door. The comment replies include one by Greg Colbourn, giving his backstory on the media attention (re: (1)) and discussing the situation with (2) and (3). There are also other replies, including one from casebash, who stayed at the hotel for a significant time.|
|Major Donation: Long Term Future Fund Application Extended 1 Week||2019-02-16||Oliver Habryka||Effective Altruism Forum||Effective Altruism Funds: Long-Term Future Fund||Effective Altruism Funds: Long-Term Future Fund||Request for proposals||AI safety|Global catastrophic risks||The blog post announces that the EA Long-Term Future Fund has received a large donation, which doubles the amount of money available for granting to ~$1.2 million. It extends the deadline for applications at https://docs.google.com/forms/d/e/1FAIpQLSeDTbCDbnIN11vcgHM3DKq6M0cZ3itAy5GIPK17uvTXcz8ZFA/viewform?usp=sf_link by 1 week, to 2019-02-24 midnight PST. The application form was previously announced at https://forum.effectivealtruism.org/posts/oFeGLaJ5bZBBRbjC9/ea-funds-long-term-future-fund-is-open-to-applications-until and was supposed to be open until 2019-02-07 for the February 2019 round of grants. Cross-posted to LessWrong at https://www.lesswrong.com/posts/ZKsSuxHWNGiXJBJ9Z/major-donation-long-term-future-fund-application-extended-1|
|EA Funds: Long-Term Future fund is open to applications until Feb. 7th||2019-01-17||Oliver Habryka||Effective Altruism Forum||Effective Altruism Funds: Long-Term Future Fund||Request for proposals||AI safety|Global catastrophic risks||The post seeks proposals for the Long-Term Future Fund. Proposals must be submitted by 2019-02-07 at https://docs.google.com/forms/d/e/1FAIpQLSeDTbCDbnIN11vcgHM3DKq6M0cZ3itAy5GIPK17uvTXcz8ZFA/viewform?usp=sf_link to be considered for the round of grants being announced mid-February. From the application, excerpted in the post: "We are particularly interested in small teams and individuals that are trying to get projects off the ground, or that need less money than existing grant-making institutions are likely to give out (i.e. less than ~$100k, but more than $10k). Here are a few examples of project types that we're open to funding an individual or group for (note that this list is not exhaustive)" Cross-posted to LessWrong at https://www.lesswrong.com/posts/dvGE8JSeFHtmHC6Gb/ea-funds-long-term-future-fund-is-open-to-applications-until|
|EA Funds: Long-Term Future fund is open to applications until November 24th (this Saturday)||2018-11-20||Oliver Habryka||Effective Altruism Forum||Effective Altruism Funds: Long-Term Future Fund||Request for proposals||AI safety|Global catastrophic risks||The post seeks proposals for the CEA Long-Term Future Fund. Proposals must be submitted by 2018-11-24 at https://docs.google.com/forms/d/e/1FAIpQLSf46ZTOIlv6puMxkEGm6G1FADe5w5fCO3ro-RK6xFJWt7SfaQ/viewform in order to be considered for the round of grants to be announced by the end of November 2018|
|LW 2.0 Strategic Overview||2017-09-14||Oliver Habryka||LessWrong||Centre for Effective Altruism, Effective Altruism Grants, Eric Rogstad||LessWrong 2.0||Miscellaneous commentary||Rationality improvement||Habryka describes his plans for LessWrong 2.0 as its primary developer.|
|Welcome to Lesswrong 2.0||2017-06-18||Oliver Habryka||LessWrong||LessWrong 2.0||Launch||Rationality improvement||The post outlines the thinking behind LessWrong 2.0, covering in part changes to the codebase, moderation, and discourse norms.|
|Donee||Amount (current USD)||Amount rank (out of 0)||Donation date||Cause area||URL||Influencer||Notes|
Sorry, we couldn't find any similar donors.