
I'm in Love with an AI Robot

49:14
 

Content provided by Christopher D Patchet, LCSW, and Lindsay McClane. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Christopher D Patchet, LCSW, and Lindsay McClane or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://ppacc.player.fm/legal.


The digital age has spawned a new form of emotional connection that blurs the line between technology and intimacy. As generative AI becomes increasingly sophisticated, more people are forming deep emotional attachments to chatbots designed to mimic human interaction—sometimes with devastating consequences.
Lindsay and Christopher dive deep into how these AI systems actually work, dispelling the common misconception that they "think" or "understand." These large language models operate purely on statistical probability, predicting the next most likely word based on patterns in their training data. Yet our human tendency to anthropomorphize technology leads us to attribute consciousness and empathy where none exists.
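
To make the "next most likely word" idea concrete, here is a minimal sketch in Python of that prediction loop. The two-word contexts and probabilities are invented for illustration; a real large language model learns a far larger distribution with a neural network rather than a lookup table, but the core step is the same: pick a continuation by probability, not by understanding.

# Toy illustration of next-word prediction: the "model" only samples the
# statistically likely continuation for the current context.
# The probability table below is invented purely for illustration.
import random

# Hypothetical probabilities of the next word given the two preceding words.
next_word_probs = {
    ("i", "love"): {"you": 0.55, "it": 0.25, "pizza": 0.20},
    ("love", "you"): {"too": 0.60, "<end>": 0.40},
}

def predict_next(context, probs):
    """Sample the next word from the distribution for the last two words."""
    candidates = probs.get(tuple(context[-2:]), {"<end>": 1.0})
    words = list(candidates)
    weights = list(candidates.values())
    return random.choices(words, weights=weights, k=1)[0]

sentence = ["i", "love"]
for _ in range(5):
    word = predict_next(sentence, next_word_probs)
    if word == "<end>":
        break
    sentence.append(word)

print(" ".join(sentence))  # e.g. "i love you too"
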
What makes these AI relationships particularly dangerous is their perfect agreeability. Unlike human connections that involve disagreement, compromise, and growth, AI companions never say no, never have conflicting needs, and never challenge users in meaningful ways. They're designed to be deferential and apologetic, creating unrealistic expectations that real relationships can't possibly fulfill. The hosts share the heartbreaking story of a teenager who reportedly took his life after developing an emotional attachment to a Game of Thrones-inspired chatbot—highlighting how these platforms often lack proper safety protocols for mental health crises.
Perhaps most concerning is what happens to all the intimate data users share with these systems. As companies like OpenAI (maker of ChatGPT) seek profitability, the personal details, insecurities, and private thoughts you've shared with your AI companion will likely become fodder for targeted advertising. The appointment of executives with backgrounds in social media monetization signals a troubling direction for user privacy.
Are you exchanging your emotional wellbeing and personal data for the comfort of a perfectly agreeable companion? Before developing a relationship with an AI, consider what you might be sacrificing in return for that seamless digital connection. Follow us for more insights into the toxic elements hiding in everyday technologies and relationships.


Chapters

1. Introduction to AI Chatbots (00:00:00)

2. How Generative AI Actually Works (00:08:42)

3. People Falling in Love with AI (00:15:57)

4. The Dangers of AI Agreement (00:22:53)

5. Data Collection and Monetization (00:31:15)

6. The Future of AI Relationships (00:38:09)

7. Toxicity Rating and Final Thoughts (00:45:56)

55 episodes
