Content provided by ReversingLabs Inc. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by ReversingLabs Inc. or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://ppacc.player.fm/legal.

The Threat of Package Hallucinations

43:26
 
Manage episode 491895528 series 3393145
In this episode of ConversingLabs, host Paul Roberts interviews Major Joe Spracklen, a PhD student at the University of Texas at San Antonio, who recently published a paper with his peers on the threat that code-generating Large Language Models (LLMs) pose to software supply chains. The paper, “We Have a Package for You! A Comprehensive Analysis of Package Hallucinations by Code Generating LLMs” (PDF), examines how these LLMs can produce package hallucinations, fact-conflicting errors that represent a novel form of package confusion attack.
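The attack pattern behind package hallucinations is simple: an LLM suggests a plausible-sounding but nonexistent dependency, and an attacker pre-registers that name on a public registry so that anyone who copies the generated code installs malicious software. One common mitigation is to cross-check LLM-suggested dependencies against a team-vetted allowlist before installing. The sketch below illustrates that idea; the package names and allowlist are hypothetical examples, not findings from the paper.

```python
def flag_unvetted(suggested, allowlist):
    """Return suggested package names that are not in the vetted allowlist."""
    vetted = {name.lower() for name in allowlist}
    return [pkg for pkg in suggested if pkg.lower() not in vetted]

# A team-maintained allowlist of approved dependencies (hypothetical).
ALLOWLIST = ["requests", "numpy", "flask"]

# Dependencies extracted from LLM-generated code. "flask-auth-utils" is a
# made-up name standing in for a hallucinated package an attacker could register.
suggested = ["requests", "flask-auth-utils"]

print(flag_unvetted(suggested, ALLOWLIST))  # → ['flask-auth-utils']
```

Any flagged name should be verified against the registry (and the team's policy) by a human before it is ever installed.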

47 episodes


