If you contact the police to report someone for expressing interest in child sexual abuse material (CSAM), it is probably not the best idea to have the same material on your own devices, or to consent to a search so law enforcement can gather more information. But that is allegedly what one Alaska man did, and he is now in police custody.
404 Media reported earlier this week that the man, Anthaney O’Connor, ended up getting himself arrested after a search of his devices allegedly revealed AI-generated child sexual abuse material (CSAM).
From 404 Media:
According to newly filed charging documents, Anthaney O’Connor reached out to law enforcement in August to alert them to an unidentified airman who had shared child sexual abuse material (CSAM) with O’Connor. While investigating the crime, and with O’Connor’s consent, federal authorities searched his phone for additional information. A review of the electronics revealed that O’Connor allegedly offered to create virtual reality CSAM for the airman, according to the criminal complaint.
According to police, the unidentified airman shared with O’Connor a picture he had taken of a child at a grocery store, and the two discussed how they could put the minor into an explicit virtual reality world.
Law enforcement claims to have found at least six explicit, AI-generated CSAM images on O’Connor’s devices, which he said he had deliberately downloaded, along with several “real” images that had been accidentally mixed in. A search of O’Connor’s house turned up a computer and several hard drives hidden in a hole in the home; a review of the computer allegedly revealed a 41-second video of a child being raped.
In an interview with authorities, O’Connor said he regularly reported CSAM to internet service providers “but was still sexually satisfied from the images and videos.” It’s unclear why he decided to report the airman to law enforcement. Perhaps he felt guilty, or perhaps he genuinely believed his AI-generated CSAM did not break the law.
AI image generators are typically trained on real photos, which means that pictures of children “generated” by AI are fundamentally based on real images. There is no way to separate the two. In that sense, AI-generated CSAM is not a victimless crime.
The first known arrest for possession of AI-generated CSAM came back in May, when the FBI arrested a man for using Stable Diffusion to create “thousands of realistic images of underage minors.”
AI advocates will say it has always been possible to create explicit images of minors using Photoshop, but AI tools make it far easier for anyone to do so. A recent report found that one in six congresswomen have been targeted by AI-generated deepfake porn. Many products include guardrails against the worst uses, much the way printers refuse to photocopy money. Enforcing such barriers would at least prevent some of this behavior.