

SPONSORED
Can your AI assistant become a silent data leak? In this episode of Cyberside Chats, Sherri Davidoff and Matt Durrin break down EchoLeak, a zero-click exploit in Microsoft 365 Copilot that shows how attackers can manipulate AI systems using nothing more than an email. No clicks. No downloads. Just a cleverly crafted message that turns your AI into an unintentional insider threat.
They also share a real-world discovery from LMG Security’s pen testing team: how prompt injection was used to extract system prompts and override behavior in a live web application. With examples ranging from corporate chatbots to real-world misfires at Samsung and Chevrolet, this episode unpacks what happens when AI is left untested—and why your security strategy must adapt.
Key Takeaways
Resources
#EchoLeak #Cybersecurity #Cyberaware #CISO #Microsoft #Microsoft365 #Copilot #AI #GenAI #AIsecurity #RiskManagement
27 episodes
Can your AI assistant become a silent data leak? In this episode of Cyberside Chats, Sherri Davidoff and Matt Durrin break down EchoLeak, a zero-click exploit in Microsoft 365 Copilot that shows how attackers can manipulate AI systems using nothing more than an email. No clicks. No downloads. Just a cleverly crafted message that turns your AI into an unintentional insider threat.
They also share a real-world discovery from LMG Security’s pen testing team: how prompt injection was used to extract system prompts and override behavior in a live web application. With examples ranging from corporate chatbots to real-world misfires at Samsung and Chevrolet, this episode unpacks what happens when AI is left untested—and why your security strategy must adapt.
Key Takeaways
Resources
#EchoLeak #Cybersecurity #Cyberaware #CISO #Microsoft #Microsoft365 #Copilot #AI #GenAI #AIsecurity #RiskManagement
27 episodes
Player FM is scanning the web for high-quality podcasts for you to enjoy right now. It's the best podcast app and works on Android, iPhone, and the web. Signup to sync subscriptions across devices.