Content provided by Tech Field Day. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Tech Field Day or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://ppacc.player.fm/legal.

62. A Different Type of Datacenter is Needed for AI

Duration: 22:08
 
Learn More from the AI Infrastructure Field Day Presentations

AI demands a new type of data center infrastructure because of its unique hardware utilization and networking needs. This Tech Field Day Podcast episode features Denise Donohue, Karen Lopez, Lino Telera, and Alastair Cooke. Network design has been a consistent part of the AI infrastructure discussions at Tech Field Day events. The need for a dedicated network to interconnect GPUs sets AI training and fine-tuning apart from general-purpose computing. The vast power demand of high-density GPU servers points to a further need for different data centers with liquid cooling and massive power distribution. Model training is only one part of the AI pipeline; business value is delivered by AI inference, which has a different set of needs and a closer eye on financial management. Inference will likely require servers with GPUs and high-speed local storage, but not the same networking density as training and fine-tuning. Inference servers will also need to sit adjacent to the general-purpose infrastructure that runs existing business applications. Some businesses may be able to fit their AI applications into their existing data centers, but many will need to build or rent new infrastructure.

Host:

Alastair Cooke, Tech Field Day Event Lead

Panelists:

Karen Lopez

Lino Telera

Denise Donohue

Follow the Tech Field Day Podcast on X/Twitter or on Bluesky and use the hashtag #TFDPodcast to join the discussion. Listen to more episodes on the podcast page of the website.

Follow Tech Field Day for more information on upcoming and current event coverage on X/Twitter, on Bluesky, and on LinkedIn, or visit our website.
