
November 24, 2025 • 69 mins

Philosopher Dr. David Thorstad tears into one of AI safety's most influential arguments: the singularity hypothesis. We discuss why the idea of recursive self-improvement leading to superintelligence doesn't hold up under scrutiny, how these arguments have redirected hundreds of millions in funding away from proven interventions, and why people keep backpedaling to weaker versions when challenged.

David walks through the actual structure of singularity arguments, explains why similar patterns show up in other longtermist claims, and makes the case for focusing on concrete problems happening right now, such as poverty, disease, and the rise of authoritarianism, instead of speculative far-future scenarios.

Chapters

  • (00:00) - Intro
  • (02:13) - David's background
  • (08:00) - (Against) The Singularity Hypothesis
  • (29:46) - Beyond the Singularity
  • (39:56) - What We Should Actually Be Worried About
  • (49:00) - Philanthropic Funding

Links

The Singularity Hypothesis

  • David's Philosophical Studies article - Against the singularity hypothesis
  • Time "AI Dictionary" page - Singularity
  • EA Forum blogpost - Summary: Against the singularity hypothesis
  • Journal of Consciousness Studies article - The Singularity: A Philosophical Analysis
  • Interim Report from the Panel Chairs: AAAI Presidential Panel on Long-Term AI Futures
  • Epoch AI blogpost - Do the returns to software R&D point towards a singularity?
  • Epoch AI report - Estimating Idea Production: A Methodological Survey

Funding References

  • LessWrong blogpost - An Overview of the AI Safety Funding Situation
  • AISafety.com funding page
  • Report - Stanford AI Index 2025, Chapter 4.3
  • Forbes article - AI Spending To Exceed A Quarter Trillion Next Year
  • AI Panic article - The “AI Existential Risk” Industrial Complex
  • GiveWell webpage - How Much Does It Cost To Save a Life?
  • Wikipedia article - Purchasing power parity

Pascal's Mugging and the St. Petersburg Paradox

  • Wikipedia article - St. Petersburg Paradox
  • Conjecture Magazine article - Pascal’s Mugging and Bad Explanations
  • neurabites explainer - Ergodicity: the Most Over-Looked Assumption
  • Wikipedia article - Extraordinary claims require extraordinary evidence

The Time of Perils

