SE Radio 661: Sunil Mallya on Small Language Models

59:28

Sunil Mallya, co-founder and CTO of Flip AI, discusses small language models with host Brijesh Ammanath. They begin by considering the technical distinctions between SLMs and large language models.

LLMs excel at generating complex outputs across a wide range of natural language processing tasks, leveraging extensive training datasets and massive GPU clusters. However, this capability comes with high computational costs and efficiency concerns, particularly in applications that are specific to a given enterprise. To address this, many enterprises are turning to SLMs fine-tuned on domain-specific datasets. Their lower computational and memory requirements make SLMs suitable for real-time applications, and by focusing on a specific domain they can achieve greater accuracy and closer alignment with specialized terminology.
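
As a rough illustration of what domain-specific fine-tuning of an SLM can look like in practice (a sketch, not a workflow described in the episode), the example below fine-tunes a small causal language model with the Hugging Face Transformers Trainer. The checkpoint name and the incident_logs.txt corpus are illustrative placeholders.

# Minimal sketch: fine-tune a small causal language model on a
# domain-specific text corpus. Model name and data file are placeholders.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

model_name = "HuggingFaceTB/SmolLM-135M"  # illustrative ~135M-parameter model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Many causal LM tokenizers ship without a padding token; reuse EOS.
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token

# Hypothetical domain corpus: one text sample per line.
dataset = load_dataset("text", data_files={"train": "incident_logs.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

# Causal LM objective: the collator builds labels from the input tokens.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="slm-finetuned",
    per_device_train_batch_size=8,
    num_train_epochs=3,
    learning_rate=5e-5,
    logging_steps=50,
)

Trainer(model=model, args=args, train_dataset=tokenized,
        data_collator=collator).train()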

The selection of SLMs depends on specific application requirements. Additional influencing factors include the availability of training data, implementation complexity, and adaptability to changing information, allowing organizations to align their choices with operational needs and constraints.

This episode is sponsored by Codegate.
