Content provided by Intel Corporation. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Intel Corporation or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://ppacc.player.fm/legal.
Algorithmic Fairness with Alice Xiang – Intel on AI – Season 2, Episode 12

35:45
Manage episode 321488109 series 3321523
In this episode of Intel on AI, guest Alice Xiang, Head of Fairness, Transparency, and Accountability Research at the Partnership on AI, talks with host Abigail Hing Wen, Intel AI Tech Evangelist and New York Times best-selling author, about algorithmic fairness—the study of how algorithms might systematically perform better or worse for certain groups of people, and the ways in which historical biases or other systemic inequities might be perpetuated by algorithmic systems.

The two discuss the lofty goals of the Partnership on AI, why being able to explain how a model arrived at a specific decision is important for the future of AI adoption, and the proliferation of criminal justice risk assessment tools.

Follow Alice on Twitter: twitter.com/alicexiang
Follow Abigail on Twitter: twitter.com/abigailhingwen
Learn more about Intel’s work in AI: intel.com/ai

122 episodes
