Phil Torres, XRiskology Interview, Part 2: Superintelligence

Published: Dec. 1, 2017, 9:20 a.m.

This is the much-anticipated second part of the Phil Torres Tapes! We have a guest on the show today – Phil Torres. Phil Torres is an author, Affiliate Scholar at the Institute for Ethics and Emerging Technologies, former contributor at the Future of Life Institute, and founding Director of the X-Risks Institute. He has published in Bulletin of the Atomic Scientists, Skeptic, Free Inquiry, The Humanist, Journal of Future Studies, Bioethics, Journal of Evolution and Technology, Foresight, Erkenntnis, and Metaphilosophy, as well as popular media like Time, Motherboard, Salon, Common Dreams, Counterpunch, Alternet, The Progressive, and Truthout.

I was absolutely delighted that he agreed to be interviewed for a show like ours, and so I urge you to seek out his website – risksandreligion.org – and buy one of his books. There’s “The End – What Science and Religion Have to Tell Us About the Apocalypse”, which is on my shelf already, and, more recently, Morality, Foresight, and Human Flourishing, which is an introduction to the whole field of existential risks. So I would urge you all, if you’re interested in this topic – that of risks to the entire human species, which I think we can agree affects us all – to buy one of those books.

This is the second part of our conversation, which focuses on AI, superintelligence, and the control problem. How can we deal with AI, how will it impact our lives, and have we any hope of controlling a superintelligent AI? There's plenty more general discussion about existential risks, too.

Follow Phil @xriskology and the show @physicspod.