AI Art Is Powered by ISIS Executions and Non-Consensual Porn

Published: Sept. 23, 2022, 4:09 p.m.


AI art has gotten wildly popular over the past year. Programs like Midjourney and DALL-E are generating incredible images and incredible controversy. But these programs don’t exist in a vacuum. AIs require billions of images to learn how and what to draw. Where are they getting those pictures? They’re hoovering them up on the internet, a place full of child porn, ISIS execution videos, and non-consensual adult images. With AI, it’s all garbage in, garbage out. So who controls this data, and is there anything we can do about it?


On this episode of Cyber, Motherboard writer Chloe Xiang walks us through the ins and outs of the AI trained on ISIS execution images.


Stories discussed in this episode:


ISIS Executions and Non-Consensual Porn Are Powering AI Art


Amazon Driver Fired for Posting Photo of Customer’s Dildo to Reddit


‘The Silence of My Critics Speaks for Itself’: Hans Niemann Says He Is Being Unfairly Attacked in Chess Scandal


We\\u2019re recording CYBER live on Twitch. Watch live during the week. Follow us there to get alerts when we go live. We take questions from the audience and yours might just end up on the show.


Subscribe to CYBER on Apple Podcasts or wherever you listen to your podcasts.


Sign up for Motherboard’s daily newsletter for a regular dose of our original reporting, plus behind-the-scenes content about our biggest stories.



Hosted on Acast. See acast.com/privacy for more information.
