XRiskology: Existential Risks with Phil Torres, Part I

Published: Sept. 24, 2017, 3:55 p.m.

We have a guest on the show today – Phil Torres. Phil Torres is an author, Affiliate Scholar at the Institute for Ethics and Emerging Technologies, former contributor at the Future of Life Institute, and founding Director of the X-Risks Institute. He has published in Bulletin of the Atomic Scientists, Skeptic, Free Inquiry, The Humanist, Journal of Future Studies, Bioethics, Journal of Evolution and Technology, Foresight, Erkenntnis, and Metaphilosophy, as well as popular media like Time, Motherboard, Salon, Common Dreams, Counterpunch, Alternet, The Progressive, and Truthout.

I was absolutely delighted that he agreed to be interviewed for a show like ours, and so I urge you to seek out his website – risksandreligion.org – and buy one of his books. There's "The End – What Science and Religion Have to Tell Us about the Apocalypse", which is on my shelf already, and, forthcoming, we have Morality, Foresight, and Human Flourishing, which is going to act as an introduction to the whole field of existential risks, which people have been thinking about for a good deal of time now. So I would urge you all, if you're interested in this topic – that of risks to the entire human species, which I think we can agree affects us all – to buy one of those books.

This is the first part of our conversation, which touches on what is meant by an existential risk; some specific examples from the modern world, including nuclear proliferation and nuclear accidents; transhumanism; and how our societies and institutions can deal with existential risks more effectively. We talk about the field in general and how we can hope to think more constructively about the end of the world – without waving a 'The End is Nigh' sign! The second part, which focuses on AI, will be released shortly.

Follow Phil @xriskology and the show @physicspod.