Yi Zeng - Exploring 'Virtue' and Goodness Through Posthuman Minds [AI Safety Connect, Episode 2]

Episode length: 1:14:19
This is an interview with Yi Zeng, Professor at the Chinese Academy of Sciences, a member of the United Nations High-Level Advisory Body on AI, and leader of the Beijing Institute for AI Safety and Governance (among many other roles).

Over a year ago, I asked Jaan Tallinn, "Who within the UN advisory group on AI has good ideas about AGI and governance?" He mentioned Yi immediately. Jaan was right.

See the full article from this episode: https://danfaggella.com/zeng1

Watch the full episode on YouTube: https://youtu.be/jNfnYUcBlmM

This episode refers to the following essays and resources:

-- AI Safety Connect - https://aisafetyconnect.com
-- Yi's profile at the Chinese Academy of Sciences - https://braincog.ai/~yizeng/

...
There are three main questions we cover here on the Trajectory:
1. Who are the power players in AGI and what are their incentives?
2. What kind of posthuman future are we moving towards, or should we be moving towards?
3. What should we do about it?
If this sounds like it's up your alley, then be sure to stick around and connect:
-- Blog: danfaggella.com/trajectory
-- X: x.com/danfaggella
-- LinkedIn: linkedin.com/in/danfaggella
-- Newsletter: bit.ly/TrajectoryTw
-- YouTube: https://www.youtube.com/@trajectoryai
