AI lab TL;DR | Paul Keller - A Vocabulary for Opting Out of AI Training and TDM

🔍 In this TL;DR episode, Paul Keller (The Open Future Foundation) outlines a proposal for a common opt-out vocabulary to improve how EU copyright rules apply to AI training. The discussion introduces three clear use cases—TDM, AI training, and generative AI training—to help rights holders express their preferences more precisely. By standardizing terminology across the value chain, the proposal aims to bring legal clarity, promote interoperability, and support responsible AI development.

📌 TL;DR Highlights

⏲️[00:00] Intro

⏲️[00:41] Q1-Why is this vocabulary needed for AI training opt-outs?

⏲️[04:17] Q2-How does it help creators, AI developers, and policymakers, and what are some of the key concepts?

⏲️[11:55] Q3-What are its limitations, and how could it evolve?

⏲️[14:35] Wrap-up & Outro

💭 Q1 - Why is this vocabulary needed for AI training opt-outs?

🗣️ "At the core of the EU copyright framework is... the TDM exceptions – the exceptions for text and data mining that were introduced in the 2019 Copyright Directive."

🗣️ "It ensures that rights holders have some level of control over their works, and it makes sure that the majority of publicly available works are available to innovate on top of, to build new things."

🗣️ "The purpose of such a vocabulary is to provide a common language for expressing rights reservations and opt-outs that are understood in the same way along the entire value chain."

🗣️ "This vocabulary proposal is the outcome of discussions that we had with many stakeholders, including rights holders, AI companies, policymakers, academics, and public interest technologists."

💭 Q2 - How does it help creators, AI developers, and policymakers, and what are some of the key concepts?

🗣️ "At the very core, the idea of vocabulary is that you have some common understanding of language... that terms you use mean the same to other people that you deal with."

🗣️ "We offer these three use cases for people to target their opt-outs from... like sort of the Russian dolls: the wide TDM category that is AI training, and in that is generative AI training."

🗣️ "If all of these technologies sort of use the same definition of what they are opting out, it becomes interoperable and it becomes also relatively simple to understand on the rights holder side."

💭 Q3 - What are its limitations, and how could it evolve?

🗣️ "The biggest limitation is... we need to see if this lands in reality and stakeholders start working with this."

🗣️ "These information intermediaries... essentially convey the information from rights holders to model providers—then it has a chance to become something that structures this field."

🗣️ "It is designed as a sort of very simple, relatively flexible approach that makes it expandable."

📌 About Our Guest

🎙️ Paul Keller | The Open Future Foundation

🌐 Article | A Vocabulary for opting out of AI training and other forms of TDM

https://openfuture.eu/wp-content/uploads/2025/03/250307_Vocabulary_for_opting_out_of_AI_training_and_other_forms_of_TDM.pdf

🌐 Paul Keller

https://www.linkedin.com/in/paulkeller/

Paul Keller is the co-founder and Director of Policy at the Open Future Foundation, a European nonprofit organization. He has extensive experience as a media activist, open policy advocate, and systems architect striving to improve access to knowledge and culture.

#AI #ArtificialIntelligence #GenerativeAI
