62. A Different Type of Datacenter is Needed for AI
Manage episode 489234159 series 1204875
Learn More from the AI Infrastructure Field Day Presentations
AI demands specialized data center designs because its hardware utilization and networking needs differ fundamentally from general-purpose computing. This Tech Field Day Podcast episode features Denise Donohue, Karen Lopez, Lino Telera, and Alastair Cooke. Network design has been a consistent part of the AI infrastructure discussions at Tech Field Day events. The need for a dedicated network to interconnect GPUs differentiates AI training and fine-tuning networks from general-purpose computing. The vast power demands of high-density GPU servers point to a further need for different data centers with liquid cooling and massive power distribution. Model training is only one part of the AI pipeline; business value is delivered by AI inference, which has a different set of needs and a closer eye on financial management. Inference will likely require servers with GPUs and high-speed local storage, but not the same networking density as training and fine-tuning. Inference will also need servers adjacent to the existing general-purpose infrastructure running current business applications. Some businesses may be able to fit their AI applications into their existing data centers, but many will need to build or rent new infrastructure.
Host:
Alastair Cooke, Tech Field Day Event Lead
Panelists:
Denise Donohue, Karen Lopez, Lino Telera
Follow the Tech Field Day Podcast on X/Twitter or on Bluesky and use the hashtag #TFDPodcast to join the discussion. Listen to more episodes on the podcast page of the website.
Follow Tech Field Day for more information on upcoming and current event coverage on X/Twitter, on Bluesky, and on LinkedIn, or visit our website.