Phil Torres, XRiskology Interview, Part 2: Superintelligence

Published: Dec. 1, 2017, 9:20 a.m.

This is the much-anticipated second part of the Phil Torres Tapes! We have a guest on the show today – Phil Torres. Phil Torres is an author, Affiliate Scholar at the Institute for Ethics and Emerging Technologies, former contributor at the Future of Life Institute, and founding Director of the X-Risks Institute. He has published in Bulletin of the Atomic Scientists, Skeptic, Free Inquiry, The Humanist, Journal of Future Studies, Bioethics, Journal of Evolution and Technology, Foresight, Erkenntnis, and Metaphilosophy, as well as popular media like Time, Motherboard, Salon, Common Dreams, Counterpunch, Alternet, The Progressive, and Truthout.

I was absolutely delighted that he agreed to be interviewed for a show like ours, and so I urge you to seek out his website – risksandreligion.org – and buy one of his books. There's "The End – What Science and Religion Have to Tell Us about the Apocalypse", which is on my shelf already, and, recently, we have Morality, Foresight, and Human Flourishing, which is an introduction to the whole field of existential risks. So I would urge you all, if you're interested in this topic – that of risks to the entire human species, which I think we can agree affects us all – to buy one of those books.

This is the second part of our conversation, which focuses on AI, superintelligence, and the control problem. How can we deal with AI? How will it impact our lives? And have we any hope of controlling a superintelligent AI? There's plenty more general discussion about existential risks, too.

Follow Phil @xriskology and the show @physicspod.