AI vs Energy: Can Smarter Chips and Local Clouds Save the Planet?

Duration: 34:30
 
Content provided by WebsEdge. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by WebsEdge or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://ppacc.player.fm/legal.

In this episode of Agents of Tech, hosts Autria Godfrey and Stephen Horn dive deep into one of AI’s most pressing challenges: energy consumption. With the release of DeepSeek and rising concerns over compute power and costs, the race to build efficient AI is heating up.

We speak to two pioneering researchers:

Dr. Shreyas Sen (Purdue University), who’s developing nervous-system-inspired chips that connect wearables with ultra-low energy use.
Dr. Hongyin Luo (MIT CSAIL / BitEnergy AI), whose work on Linear-Complexity Multiplication (L-Mul) may drastically cut compute cost and energy usage (a rough sketch of the idea appears after the chapter list below).

Is the future of AI massive centralized data centers, or decentralized personal clouds and localized compute? And what happens when we run out of training data?

👉 Don’t forget to like, comment, and subscribe to support meaningful tech discussions!

⏱️ Timestamps / Chapters:
00:00 - Introduction: Welcome to Agents of Tech
00:20 - Why AI energy usage is an urgent issue
01:00 - DeepSeek’s $6M run and the energy debate
02:00 - The promise of mixture-of-experts and energy savings
02:45 - Interview intro: Dr. Hongyin Luo (BitEnergy AI)
03:25 - What is L-Mul and why it matters
06:00 - Floating-point vs integer math in AI
08:30 - Shifting compute from datacenters to the edge
10:00 - Barriers to L-Mul adoption and FPGA innovation
12:00 - The case for local family clouds
13:30 - Moonshot idea: Stop pretraining to save energy
15:00 - Interview intro: Dr. Shreyas Sen (Purdue University)
15:45 - Wearable brains and nervous-system-inspired design
17:30 - Conductive human body as an AI network
19:00 - Brain-to-device communication breakthroughs
21:30 - Data layers: cloud, edge, and leaf devices
23:00 - Real-world use and commercialization of Wi-R tech
24:00 - Future implications and distributed AI potential
26:00 - Panel discussion: What does efficient AI really mean?
28:30 - The end of training and rise of true intelligence?
30:00 - From megawatt datacenters to household AI hubs
32:00 - Wrap-up and reflections
33:55 - Credits and thanks

#ArtificialIntelligence #AI #EnergyEfficiency #GreenAI #EdgeComputing #BrainInspiredTech #WearableTech #NeuralNetworks #BodyPoweredAI #FutureOfAI #SustainableTech #AIChips #AIInnovation #SmartWearables
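For readers who want a feel for the L-Mul topic mentioned above, here is a rough, illustrative Python sketch only; it is not the published BitEnergy AI algorithm and not code from the episode. It shows the general idea usually described for linear-complexity multiplication: write each float as (1 + mantissa) * 2**exponent, then replace the mantissa-by-mantissa multiplication in the product with additions plus a small constant correction. The function name lmul_approx and the offset_bits parameter are assumptions made for this sketch.

import math

def lmul_approx(x: float, y: float, offset_bits: int = 4) -> float:
    """Approximate x * y using additions on (mantissa, exponent) parts.

    Illustrative sketch only: real L-Mul operates on low-precision
    floating-point bit fields in hardware, not on Python floats.
    """
    if x == 0.0 or y == 0.0:
        return 0.0
    sign = math.copysign(1.0, x) * math.copysign(1.0, y)
    # Decompose |x| = (1 + xm) * 2**xe with xm in [0, 1); likewise for |y|.
    m, e = math.frexp(abs(x))          # frexp returns m in [0.5, 1)
    xm, xe = 2.0 * m - 1.0, e - 1
    m, e = math.frexp(abs(y))
    ym, ye = 2.0 * m - 1.0, e - 1
    # Exact product mantissa is 1 + xm + ym + xm*ym.
    # The L-Mul idea: drop the xm*ym multiplication and add a small
    # constant (2**-offset_bits) in its place.
    mantissa = 1.0 + xm + ym + 2.0 ** (-offset_bits)
    return sign * mantissa * 2.0 ** (xe + ye)

# Quick comparison of the approximation against exact multiplication.
print(lmul_approx(1.5, 2.25), 1.5 * 2.25)

The pitch discussed in the episode is that additions cost far less energy in silicon than floating-point multiplications, which is why techniques like this are aimed at inference on edge and local hardware.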
