
November 16, 2018 · 47 min

An artificial intelligence capable of improving itself runs the risk of growing more intelligent than any human and slipping beyond our control. Josh explains why a superintelligent AI that we haven't planned for would be extremely bad for humankind. (Original score by Point Lobo.)

Interviewees: Nick Bostrom, Oxford University philosopher and founder of the Future of Humanity Institute; David Pearce, philosopher and co-founder of the World Transhumanist Association (Humanity+); Sebastian Farquhar, Oxford University philosopher.

The End of the World with Josh Clark
