Scott Nover
Contributing Writer
https://x.com/ScottNover
https://www.linkedin.com/in/scottnover/
Scott Nover is the lead writer for GZERO AI. He's a contributing writer for Slate and was previously a staff writer at Quartz and Adweek. His writing has appeared in The Atlantic, Fast Company, Vox.com, and The Washington Post, among other outlets. He currently lives near Washington, DC, with his wife and pup.
May 28, 2024
This month, the US Department of Justice charged Steven Anderegg, a 42-year-old Wisconsin man, with creating and distributing AI-generated child pornography. If convicted of all four counts brought by federal prosecutors, Anderegg faces up to 70 years in prison.
The case is novel: It’s the first time the federal government has brought charges over child pornography generated entirely by AI. Prosecutors allege that Anderegg created a trove of 13,000 fake images using the text-to-image generator Stable Diffusion, made by the company Stability AI, along with certain add-ons to the technology. This isn’t the first controversy involving Stable Diffusion, though. In December, Stanford University researchers found that LAION-5B, a dataset used to train Stable Diffusion, included 1,679 illegal images of child sexual abuse material.
The case could set a precedent on an open legal question: Is AI-generated child pornography, for all intents and purposes under the law, child pornography?