Content provided by Nonzero. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Nonzero or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://ppacc.player.fm/legal.
The case for alarm about artificial intelligence (Robert Wright & Holly Elmore)
This is a free preview of a paid episode. To hear more, visit www.nonzero.org
0:28 Holly’s newsletter and her work with PauseAI
3:45 The state of AI safety
10:54 Why Holly started PauseAI US
14:55 Is AI acceleration… accelerating?
23:47 What rationalists got wrong about the singularity
29:43 Mechanize Inc. and the danger of AI safety sellouts
40:10 Holly: Here’s why AI should alarm you
48:25 Heading to Overtime
Robert Wright (Nonzero, The Evolution of God, Why Buddhism Is True) and Holly Elmore (PauseAI US). Recorded April 29, 2025.
Twitter: https://twitter.com/NonzeroPods
847 episodes