University of Tübingen donations received

This is an online portal with information on donations of interest to Vipul Naik that were announced publicly (or have been shared with permission). The git repository with the code for this portal, as well as all the underlying data, is available on GitHub. All payment amounts are in current United States dollars (USD). The repository of donations is being seeded with an initial collation by Issa Rice, along with his continued contributions (see his commits and the contract work page listing all financially compensated contributions to the site), but all responsibility for errors and inaccuracies belongs to Vipul Naik. Current data is preliminary and has not been completely vetted and normalized; if sharing a link to this site or any page on it, please include the caveat that the data is preliminary (if you want to share without caveats, please check with Vipul Naik). We expect to complete the first round of development by the end of July 2024. See the about page for more details. Also of interest: pageview data on analytics.vipulnaik.com, the tutorial in the README, and the request for feedback on the EA Forum.

Table of contents

Basic donee information

We do not have any donee information for the donee University of Tübingen in our system.

Donee donation statistics

Cause area | Count | Median | Mean | Minimum | 10th percentile | 20th percentile | 30th percentile | 40th percentile | 50th percentile | 60th percentile | 70th percentile | 80th percentile | 90th percentile | Maximum
Overall | 4 | 300,000 | 422,481 | 224,923 | 224,923 | 224,923 | 300,000 | 300,000 | 300,000 | 575,000 | 575,000 | 590,000 | 590,000 | 590,000
ACT Fellowship | 1 | 224,923 | 224,923 | 224,923 | 224,923 | 224,923 | 224,923 | 224,923 | 224,923 | 224,923 | 224,923 | 224,923 | 224,923 | 224,923
AI safety | 3 | 575,000 | 488,333 | 300,000 | 300,000 | 300,000 | 300,000 | 575,000 | 575,000 | 575,000 | 590,000 | 590,000 | 590,000 | 590,000
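
For readers who want to reproduce the summary statistics, here is a minimal sketch (not the portal's actual code) showing how the "Overall" row can be derived from the four donation amounts listed further down. The median convention (lower of the two middle values for an even-sized list) is an assumption inferred from the table, not something the portal documents.

```python
import statistics

# The four donation amounts (current USD) from the full list of donations below.
donations = [224_923, 300_000, 575_000, 590_000]

# Mean: 1,689,923 / 4 = 422,480.75, which rounds to the 422,481 shown above.
mean = sum(donations) / len(donations)
print(round(mean))  # 422481

# The table reports a median of 300,000 for this even-sized list, which matches
# the lower-middle convention (statistics.median_low) rather than interpolation.
print(statistics.median_low(donations))  # 300000
```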

Donation amounts by donor and year for donee University of Tübingen

Donor | Total | 2023 | 2021 | 2017
Open Philanthropy (filter this donee) | 1,465,000.00 | 575,000.00 | 890,000.00 | 0.00
John Templeton Foundation (filter this donee) | 224,923.00 | 0.00 | 0.00 | 224,923.00
Total | 1,689,923.00 | 575,000.00 | 890,000.00 | 224,923.00
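
As a quick consistency check, a sketch like the following (with the figures transcribed by hand from the table above rather than pulled from the portal's database) confirms that the per-donor totals, per-year totals, and grand total agree:

```python
# Donation amounts by donor and year, transcribed from the table above (current USD).
by_donor_year = {
    "Open Philanthropy": {2023: 575_000.00, 2021: 890_000.00, 2017: 0.00},
    "John Templeton Foundation": {2023: 0.00, 2021: 0.00, 2017: 224_923.00},
}

donor_totals = {donor: sum(years.values()) for donor, years in by_donor_year.items()}
year_totals = {year: sum(years[year] for years in by_donor_year.values())
               for year in (2023, 2021, 2017)}

assert donor_totals == {"Open Philanthropy": 1_465_000.00,
                        "John Templeton Foundation": 224_923.00}
assert year_totals == {2023: 575_000.00, 2021: 890_000.00, 2017: 224_923.00}
assert sum(donor_totals.values()) == 1_689_923.00  # grand total
```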

Full list of documents in reverse chronological order (0 documents)

There are no documents associated with this donee.

Full list of donations in reverse chronological order (4 donations)

[Graph of donations and their timeframes: top 10 donors (for donations with known year of donation) by amount, showing the timeframe of donations]
Each donation entry below lists: Donor; Amount (current USD); Amount rank (out of 4); Donation date; Cause area; URL; Influencer; Notes.
Donor: Open Philanthropy
Amount (current USD): 575,000.00
Amount rank (out of 4): 2
Donation date: 2023-02
Cause area: AI safety/technical research
URL: https://www.openphilanthropy.org/grants/university-of-tuebingen-adversarial-robustness-research/
Influencer: --
Notes: Intended use of funds (category): Direct project expenses

Intended use of funds: Grant "to support research led by Professor Matthias Bethge on adversarial robustness as a means to improve AI safety."
Donor: Open Philanthropy
Amount (current USD): 590,000.00
Amount rank (out of 4): 1
Donation date: 2021-02
Cause area: AI safety/technical research
URL: https://www.openphilanthropy.org/grants/university-of-tubingen-robustness-research-wieland-brendel/
Influencer: Catherine Olsson, Nick Beckstead
Notes: Intended use of funds (category): Direct project expenses

Intended use of funds: The grant page says the grant is "to support early-career research by Wieland Brendel on robustness as a means to improve AI safety."

Donor reason for selecting the donee: Open Phil made five grants for "adversarial robustness research" in January and February 2021, around the time of this grant:
https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/university-of-tuebingen-adversarial-robustness-hein
https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-santa-cruz-xie-adversarial-robustness
https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-berkeley-adversarial-robustness-wagner
https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-berkeley-adversarial-robustness-song
https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/mit-adversarial-robustness-research
It looks like the donor became interested in funding this research topic at this time.

Donor reason for donating at this time (rather than earlier or later): Open Phil made five grants for "adversarial robustness research" in January and February 2021, around the time of this grant:
https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/university-of-tuebingen-adversarial-robustness-hein
https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-santa-cruz-xie-adversarial-robustness
https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-berkeley-adversarial-robustness-wagner
https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-berkeley-adversarial-robustness-song
https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/mit-adversarial-robustness-research
It looks like the donor became interested in funding this research topic at this time.
Intended funding timeframe in months: 36
Donor: Open Philanthropy
Amount (current USD): 300,000.00
Amount rank (out of 4): 3
Donation date: 2021-02
Cause area: AI safety/technical research
URL: https://www.openphilanthropy.org/grants/university-of-tubingen-adversarial-robustness-research-matthias-hein/
Influencer: Catherine Olsson, Nick Beckstead
Notes: Intended use of funds (category): Direct project expenses

Intended use of funds: Grant "to support research by Professor Matthias Hein on adversarial robustness as a means to improve AI safety."

Donor reason for selecting the donee: This is one of five grants made by the donor for "adversarial robustness research" in January and February 2021, all with the same grant investigators (Catherine Olsson and Daniel Dewey) except the Santa Cruz grant, which had Olsson and Nick Beckstead. The four other grants are:
https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-santa-cruz-xie-adversarial-robustness
https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/mit-adversarial-robustness-research
https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-berkeley-adversarial-robustness-wagner
https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-berkeley-adversarial-robustness-song
It looks like the donor became interested in funding this research topic at this time.

Donor reason for donating that amount (rather than a bigger or smaller amount): No explicit reasons for the amount are given, but the amount is similar to the amounts of other grants from Open Philanthropy to early-stage researchers in adversarial robustness research. This includes three other grants made at the same time:
https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-santa-cruz-xie-adversarial-robustness
https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-berkeley-adversarial-robustness-wagner
https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-berkeley-adversarial-robustness-song
as well as grants later in the year to early-stage researchers at Carnegie Mellon University, Stanford University, and the University of Southern California.

Donor reason for donating at this time (rather than earlier or later): This is one of five grants made by the donor for "adversarial robustness research" in January and February 2021, all with the same grant investigators (Catherine Olsson and Daniel Dewey) except the Santa Cruz grant, which had Olsson and Nick Beckstead. The four other grants are:
https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-santa-cruz-xie-adversarial-robustness
https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/mit-adversarial-robustness-research
https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-berkeley-adversarial-robustness-wagner
https://www.openphilanthropy.org/focus/global-catastrophic-risks/potential-risks-advanced-artificial-intelligence/uc-berkeley-adversarial-robustness-song
It looks like the donor became interested in funding this research topic at this time.
Intended funding timeframe in months: 36
Donor: John Templeton Foundation
Amount (current USD): 224,923.00
Amount rank (out of 4): 4
Donation date: 2017
Cause area: ACT Fellowship
URL: https://templeton.org/grants/grant-database
Influencer: --
Notes: For project "Hong Yu Wong - ACT Fellowship Application – The New Concept of Mind"; project leaders: Hong Yu Wong. Affected regions: Europe; affected countries: FIXME.