Content provided by Tehya N. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Tehya N. or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined at https://ppacc.player.fm/legal.

How Big Tech Inherited Eugenics: Anita Say Chan on Algorithmic Bias, Data Colonialism & Techno-Eugenics

Duration: 1:11:51
 

In this powerful episode of For the Love of History, host TC is joined by scholar and author Dr. Anita Say Chan to explore the unsettling historical roots of modern data science and artificial intelligence. Drawing from her groundbreaking book Predatory Data: Eugenics in Big Tech and Our Fight for an Independent Future, Anita uncovers how today's predictive algorithms trace back to 19th-century eugenics. Yes, really. Statistical regression—the backbone of online recommendation engines—was developed by a eugenicist. And that’s just the beginning.

We unpack how algorithmic bias, data colonialism, and techno-eugenics operate in today’s platforms—from Facebook’s role in global violence to the AI industry’s resistance to regulation. If you’re curious about the intersections of technology, race, gender, and power, this is the episode you’ve been waiting for.

📌 Key Topics Covered:

  • The hidden eugenic origins of data science and regression analysis

  • How algorithms are modern tools of social control

  • The racist, classist history of “fitness” in academic institutions

  • What “techno-eugenics” looks like today—from content moderation failures to AI bias

  • Case studies: Facebook’s role in violence in India and Myanmar

  • Why Big Tech underinvests in safety protocols outside the West

  • How tech elites bypass democratic institutions for unchecked influence

📚 About Our Guest:
Dr. Anita Say Chan is an Associate Professor at the University of Illinois Urbana-Champaign and founder of the Community Data Clinic. Her research focuses on feminist, decolonial approaches to tech and global information justice.

📖 Featured Book:
Predatory Data: Eugenics in Big Tech and Our Fight for an Independent Future by Anita Say Chan

📍 Timestamps:

  • 00:01 — Meet Dr. Anita Say Chan

  • 04:00 — Eugenics and the invention of data prediction

  • 10:15 — U.S. universities and the rise of eugenic policy

  • 17:45 — Techno-eugenics: What it means today

  • 24:30 — Case study: Facebook in India and Myanmar

  • 30:00 — Tech elites, lobbying, and the erosion of democracy

  • 35:00 — The fight for global data justice

🎯 Call to Action:
Enjoyed this deep dive into the dark roots of data? Don’t forget to subscribe, rate, and leave a review on your favorite podcast platform. Share this episode with a friend who's still not questioning the algorithm—and grab a copy of Predatory Data to keep the conversation going.

✨ Want more delightful brain food? Support the pod and get bonus goodies over on Patreon.

👉 Don’t forget to rate, review, and tell your cat about us. It helps more history nerds find us! 🐈‍⬛

🧠 SEO Keywords Integrated:

  • Big Tech and eugenics

  • history of data science

  • algorithmic bias in tech

  • techno-eugenics explained

  • AI and racial bias

  • Anita Say Chan interview

  • Predictive algorithms and inequality

  • data colonialism

  • podcast on tech ethics

  • podcast on eugenics history

Learn more about your ad choices. Visit megaphone.fm/adchoices


