Holding AI to a Double Standard: The Moral Costs of Inaction
We often demand near-perfection from AI, while we tolerate a much higher error rate when humans do the same tasks. We expect machine systems, such as self-driving cars and AI medical diagnostics, to be flawless.
In this episode we ask, "Why do we hold them to a near-impossible standard of zero errors?" and we explore the history behind the regulatory systems and biases we have today.
Follow us on X