#252 Sid Sheth: How d-Matrix is Disrupting AI Inference in 2025

54:32
 
Content provided by Craig S. Smith. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Craig S. Smith or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://ppacc.player.fm/legal.

This episode is sponsored by the DFINITY Foundation.

DFINITY Foundation's mission is to develop and contribute technology that enables the Internet Computer (ICP) blockchain and its ecosystem, aiming to shift cloud computing into a fully decentralized state.

Find out more at https://internetcomputer.org/

In this episode of Eye on AI, we sit down with Sid Sheth, CEO and Co-Founder of d-Matrix, to explore how his company is revolutionizing AI inference hardware and taking on industry giants like NVIDIA.

Sid shares his journey from building multi-billion-dollar businesses in semiconductors to founding d-Matrix—a startup focused on generative AI inference, chiplet-based architecture, and ultra-low latency AI acceleration.

We break down:

  • Why the future of AI lies in inference, not training

  • How d-Matrix’s Corsair PCIe accelerator outperforms NVIDIA's H200

  • The role of in-memory compute and high bandwidth memory in next-gen AI chips

  • How d-Matrix integrates seamlessly into hyperscaler and enterprise cloud environments

  • Why AI infrastructure is becoming heterogeneous and what that means for developers

  • The global outlook on inference chips—from the US to APAC and beyond

  • How Sid plans to build the next NVIDIA-level company from the ground up

Whether you're building in AI infrastructure, investing in semiconductors, or just curious about the future of generative AI at scale, this episode is packed with value.

Stay Updated:

Craig Smith on X: https://x.com/craigss

Eye on A.I. on X: https://x.com/EyeOn_AI

(00:00) Intro

(02:46) Introducing Sid Sheth

(05:27) Why He Started d-Matrix

(07:28) Lessons from Building a $2.5B Chip Business

(11:52) How d-Matrix Prototypes New Chips

(15:06) Working with Hyperscalers Like Google & Amazon

(17:27) What’s Inside the Corsair AI Accelerator

(21:12) How d-Matrix Beats NVIDIA on Chip Efficiency

(24:10) The Memory Bandwidth Advantage Explained

(26:27) Running Massive AI Models at High Speed

(30:20) Why Inference Isn’t One-Size-Fits-All

(32:40) The Future of AI Hardware

(36:28) Supporting Llama 3 and Other Open Models

(40:16) Is the Inference Market Big Enough?

(43:21) Why the US Is Still the Key Market

(46:39) Can India Compete in the AI Chip Race?

(49:09) Will China Catch Up on AI Hardware?
