Steven Parton & Connor Leahy , The Feedback Loop by Singularity

FBL91: Connor Leahy - The Existential Risk of AI Alignment

20 Feb 2023 • 53 min • EN

This week our guest is AI researcher and founder of Conjecture, Connor Leahy, who is dedicated to studying AI alignment. Alignment research focuses on gaining an increased understanding of how to build advanced AI systems that pursue the goals they were designed for instead of engaging in undesired behavior. Sometimes this means simply ensuring they share human values and ethics, so that our machines don't cause serious harm to humanity.

In this episode, Connor provides candid insights into the current state of the field, including the very concerning lack of funding and human resources currently going into alignment research. Amongst many other things, we discuss how the research is conducted, the lessons we can learn from animals, and the kind of policies and processes humans need to put into place if we are to prevent what Connor currently sees as a highly plausible existential threat.

Find out more about Conjecture at conjecture.dev or follow Connor and his work at twitter.com/NPCollapse

Apply for registration to our exclusive South By Southwest event on March 14th @ www.su.org/basecamp-sxsw

Apply for an Executive Program Scholarship at su.org/executive-program/ep-scholarship

Learn more about Singularity: su.org

Host: Steven Parton - LinkedIn / Twitter

Music by: Amine el Filali

