#3: Extropic - Why Thermodynamic Computing is the Future of AI (PUBLIC DEBUT)

Duration: 1:12:59
 

Episode 3: Extropic is building a new kind of computer – neither classical bits nor quantum qubits, but a secret, more complex third thing. They call it a Thermodynamic Computer, and it might be many orders of magnitude more powerful than even today's most powerful supercomputers.

Check out their “litepaper” to learn more: https://www.extropic.ai/future.

======

(00:00) - Intro

(00:41) - Guillaume's Background

(02:40) - Trevor's Background

(04:02) - What is Extropic Building? High-Level Explanation

(07:07) - Frustrations with Quantum Computing and Noise

(10:08) - Scaling Digital Computers and Thermal Noise Challenges

(13:20) - How Digital Computers Run Sampling Algorithms Inefficiently

(17:27) - Limitations of Gaussian Distributions in ML

(20:12) - Why GPUs are Good at Deep Learning but Not Sampling

(23:05) - Extropic's Approach: Harnessing Noise with Thermodynamic Computers

(28:37) - Bounding the Noise: Not Too Noisy, Not Too Pristine

(31:10) - How Thermodynamic Computers Work: Inputs, Parameters, Outputs

(37:14) - No Quantum Coherence in Thermodynamic Computers

(41:37) - Gaining Confidence in the Idea Over Time

(44:49) - Using Superconductors and Scaling to Silicon

(47:53) - Thermodynamic Computing vs Neuromorphic Computing

(50:51) - Disrupting Computing and AI from First Principles

(52:52) - Early Applications in Low Data, Probabilistic Domains

(54:49) - Vast Potential for New Devices and Algorithms in AI's Early Days

(57:22) - Building the Next S-Curve to Extend Moore's Law for AI

(59:34) - The Meaning and Purpose Behind Extropic's Mission

(01:04:54) - Call for Talented Builders to Join Extropic

(01:09:34) - Putting Ideas Out There and Creating Value for the Universe

(01:11:35) - Conclusion and Wrap-Up

======

Links:

First Principles:

======

Production and marketing by The Deep View (https://thedeepview.co). For inquiries about sponsoring the podcast, email [email protected]

======

Check out the video version here → http://tinyurl.com/4fh497n9

🔔 Follow to stay updated with new uploads
