Content provided by Jeremy Chapman and Microsoft Mechanics. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Jeremy Chapman and Microsoft Mechanics or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://ppacc.player.fm/legal.

Microsoft Purview protections for Copilot

Use Microsoft Purview and Microsoft 365 Copilot together to build a secure, enterprise-ready foundation for generative AI. Apply existing data protection and compliance controls, gain visibility into AI usage, and reduce risk from oversharing or insider threats.

Classify, restrict, and monitor sensitive data used in Copilot interactions. Investigate risky behavior, enforce dynamic policies, and block inappropriate use—all from within your Microsoft 365 environment.

Erica Toelle, Microsoft Purview Senior Product Manager, shares how to implement these controls and proactively manage data risks in Copilot deployments.

► QUICK LINKS:

00:00 - Microsoft Purview controls for Microsoft 365 Copilot

00:32 - Copilot security and privacy basics

01:47 - Built-in activity logging

02:24 - Discover and prevent data loss with DSPM for AI

04:18 - Protect sensitive data in AI interactions

05:08 - Insider Risk Management

05:12 - Monitor and act on inappropriate AI use

07:14 - Wrap up

► Link References

Check out https://aka.ms/M365CopilotwithPurview

Watch our show on oversharing at https://aka.ms/OversharingMechanics

► Unfamiliar with Microsoft Mechanics?

Microsoft Mechanics is Microsoft's official video series for IT. Watch and share valuable content and demos of current and upcoming tech from the people who build it at Microsoft.

• Subscribe to our YouTube: https://www.youtube.com/c/MicrosoftMechanicsSeries

• Talk with other IT Pros, join us on the Microsoft Tech Community: https://techcommunity.microsoft.com/t5/microsoft-mechanics-blog/bg-p/MicrosoftMechanicsBlog

• Watch or listen from anywhere, subscribe to our podcast: https://microsoftmechanics.libsyn.com/podcast

► Keep getting this insider knowledge, join us on social:

• Follow us on Twitter: https://twitter.com/MSFTMechanics

• Share knowledge on LinkedIn: https://www.linkedin.com/company/microsoft-mechanics/

• Enjoy us on Instagram: https://www.instagram.com/msftmechanics/

• Loosen up with us on TikTok: https://www.tiktok.com/@msftmechanics
