Brian Christian donations received

This is an online portal with information on donations that were announced publicly (or shared with permission) and that are of interest to Vipul Naik. The git repository with the code for this portal, as well as all the underlying data, is available on GitHub. All payment amounts are in current United States dollars (USD).

The repository of donations is being seeded with an initial collation by Issa Rice, along with his continued contributions (see his commits and the contract work page listing all financially compensated contributions to the site), but all responsibility for errors and inaccuracies belongs to Vipul Naik. Current data is preliminary and has not been completely vetted and normalized; if you share a link to this site or to any page on it, please include the caveat that the data is preliminary (if you want to share without including caveats, please check with Vipul Naik). We expect to complete the first round of development by the end of July 2025. See the about page for more details.

Also of interest: pageview data on analytics.vipulnaik.com, the tutorial in the README, and the request for feedback on the EA Forum.

Basic donee information

We do not have any donee information for Brian Christian in our system.

Donee donation statistics

Cause area | Count | Median | Mean | Minimum | 10th percentile | 20th percentile | 30th percentile | 40th percentile | 50th percentile | 60th percentile | 70th percentile | 80th percentile | 90th percentile | Maximum
Overall | 3 | 66,000 | 134,634 | 37,903 | 37,903 | 37,903 | 37,903 | 66,000 | 66,000 | 66,000 | 300,000 | 300,000 | 300,000 | 300,000
AI safety | 3 | 66,000 | 134,634 | 37,903 | 37,903 | 37,903 | 37,903 | 66,000 | 66,000 | 66,000 | 300,000 | 300,000 | 300,000 | 300,000
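The figures in this table can be reproduced from the three donation amounts listed further down the page. The sketch below is illustrative only (not the portal's actual code); it assumes the nearest-rank method for percentiles, which matches the displayed values.

```python
# Minimal sketch: recompute the summary statistics above from the three
# listed donation amounts, using the nearest-rank percentile method
# (an assumption; it reproduces the values shown in the table).
import math

amounts = sorted([37_903, 66_000, 300_000])  # the three AI safety donations

count = len(amounts)
mean = round(sum(amounts) / count)   # 134,634
median = amounts[count // 2]         # 66,000 (valid for an odd count)

def nearest_rank_percentile(values, p):
    """Return the p-th percentile of sorted `values` using the nearest-rank method."""
    rank = math.ceil(p / 100 * len(values))  # 1-based rank
    return values[max(rank, 1) - 1]

for p in range(10, 100, 10):
    print(f"{p}th percentile: {nearest_rank_percentile(amounts, p):,}")
print(f"count={count}, mean={mean:,}, median={median:,}, "
      f"min={amounts[0]:,}, max={amounts[-1]:,}")
```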

Donation amounts by donor and year for donee Brian Christian

Donor | Total | 2023 | 2022 | 2021
FTX Future Fund | 300,000.00 | 0.00 | 300,000.00 | 0.00
Open Philanthropy | 103,903.00 | 37,903.00 | 0.00 | 66,000.00
Total | 403,903.00 | 37,903.00 | 300,000.00 | 66,000.00

Full list of documents in reverse chronological order (0 documents)

There are no documents associated with this donee.

Full list of donations in reverse chronological order (3 donations)

[Graph not reproduced here: top 10 donors (for donations with known year of donation) by amount, showing the timeframe of donations.]
Each entry below lists the donor, amount (current USD), amount rank (out of 3), donation date, cause area, URL, influencer, and notes.

Donor: Open Philanthropy
Amount (current USD): 37,903.00
Amount rank (out of 3): 3
Donation date: 2023-02
Cause area: AI safety/strategy
URL: https://www.openphilanthropy.org/grants/brian-christian-psychology-research/
Influencer: --

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant "to support a DPhil in psychology at the University of Oxford. His research will focus on human preferences, with the goal of informing efforts to align AI systems with human values."

Other notes: Currency info: donation given as 29,700.00 GBP (conversion done via donor calculation); see the sketch after this donation list for the implied exchange rate.
Donor: FTX Future Fund
Amount (current USD): 300,000.00
Amount rank (out of 3): 1
Donation date: 2022-05
Cause area: AI safety
URL: https://ftxfuturefund.org/our-grants/?_funding_stream=open-call
Influencer: --

Donation process: This grant is a result of the Future Fund's open call for applications originally announced on 2022-02-28 at https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) with a deadline of 2022-03-21.

Intended use of funds (category): Direct project expenses

Intended use of funds: Grant to "support the completion of a book which explores the nature of human values and the implications for aligning AI with human preferences."

Donor reason for donating at this time (rather than earlier or later): Timing determined by timing of the open call https://forum.effectivealtruism.org/posts/2mx6xrDrwiEKzfgks/announcing-the-future-fund-1 (GW, IR) for applications; the grant is made shortly after the application window for the open call (2022-02-28 to 2022-03-21).
Donor: Open Philanthropy
Amount (current USD): 66,000.00
Amount rank (out of 3): 2
Donation date: 2021-03
Cause area: AI safety/movement growth
URL: https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/brian-christian-alignment-book-promotion
Influencer: Nick Beckstead

Intended use of funds (category): Direct project expenses

Intended use of funds: Contractor agreement "with Brian Christian to support the promotion of his book The Alignment Problem: Machine Learning and Human Values."

Donor reason for selecting the donee: The grant page says: "Our potential risks from advanced artificial intelligence team hopes that the book will generate interest in AI alignment among academics and others."
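For the 2023-02 Open Philanthropy grant above, the USD amount was derived by the donor from a GBP figure. The snippet below is an illustrative check only, not the donor's calculation; the exchange rate it prints is implied by the two listed figures rather than independently sourced.

```python
# Illustrative check of the GBP-to-USD conversion noted for the 2023-02 grant.
# The rate below is implied by the two listed figures (an assumption), not an
# independently sourced exchange rate.
gbp_amount = 29_700.00
usd_amount = 37_903.00
implied_rate = usd_amount / gbp_amount  # roughly 1.276 USD per GBP
print(f"Implied conversion rate: {implied_rate:.4f} USD/GBP")
```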