This is a nonfiction audiobook narrated by Matt Arnold with the permission of the author, David Chapman. Full text at:

You can support the podcast and get episodes a week early by supporting the Patreon:

Original music by Kevin MacLeod.

Artwork on this webpage is by Barry Gohn.

Search for "Fluidity" on Apple Podcasts, Spotify, Amazon Music, Deezer, Gaana, Player.FM, the mobile app, and RadioPublic.

Dec 3, 2023

This concludes the "Apocalypse Now" section of Better Without AI.
AI systems may cause near-term disasters through their proven ability to shatter societies and cultures. These disasters could conceivably cause human extinction, but are more likely to scale up to the level of the twentieth-century dictatorships, genocides, and...

Nov 12, 2023

Who is in control of AI? - It may already be too late to shut down the existing AI systems that could destroy civilization.

What an AI apocalypse may look like - Scenarios in which artificial intelligence systems degrade critical institutions to the point of collapse seem...

Nov 6, 2023

Apocalypse now - Current AI systems are already harmful. They pose apocalyptic risks even without further technology development. This chapter explains why, explores a possible path to near-term human extinction via AI, and sketches...

Oct 22, 2023

Superintelligence should scare us only insofar as it grants superpowers. Protecting against the specific harms of specific plausible powers may be our best strategy for preventing catastrophes.
For much of the AI safety community, the central question has been "when will it...

Oct 15, 2023

Many people call the future threat "artificial general intelligence," but all three of those words are misleading when trying to understand the risks.
AI may radically accelerate technology development. That might be extremely good or extremely bad. There are...