This month, the US Department of Justice charged Steven Anderegg, a 42-year-old Wisconsin man, with crimes related to creating and distributing AI-generated child pornography. If convicted on all four counts brought by federal prosecutors, Anderegg faces up to 70 years in prison.
The case is novel: it is the first time the federal government has brought charges over child pornography generated entirely by AI. Prosecutors say Anderegg created a trove of 13,000 fake images using the text-to-image generator Stable Diffusion, made by the company Stability AI, along with certain add-ons to the technology. This isn’t the first controversy involving Stable Diffusion, though. In December, Stanford University researchers found that LAION-5B, a dataset used to train Stable Diffusion, included 1,679 illegal images of child sexual abuse material.
This case could set a new precedent for an open question: Is AI-generated child pornography — for all intents and purposes under the law — child pornography?