If you report someone for child pornography… don’t also have it on your computer

by Andrea

An American living in Alaska reported another individual for possession of child pornography generated by Artificial Intelligence. He was later arrested for the same crime.

If you contact the police to report someone who has expressed an interest in child sexual abuse material, it may not be the best idea to have the same type of material on your own devices.

Or to consent to a search so the police can gather more information.

But that is apparently what Anthony O'Connor, an American living in Alaska, did, and he ended up arrested for possession of child pornography.

O'Connor was arrested earlier this week after a police search of his computer devices allegedly revealed child sexual abuse material generated by Artificial Intelligence.

According to prosecutors, O'Connor contacted law enforcement authorities in August to alert them about an unidentified individual, a professional pilot, who had shared child pornography with him.

During the investigation, and with O'Connor's consent, federal authorities searched his cell phone for additional information.

An analysis of the devices revealed that O'Connor had apparently offered to provide the pilot with child sexual abuse material generated in virtual reality with the help of Artificial Intelligence.

According to police, the unidentified pilot shared with O'Connor a photo he had taken of a child in a grocery store, and the two discussed how they could superimpose the minor into an explicit virtual reality scene.

Police claim to have found at least six explicit, AI-generated images on O'Connor's devices, which he admitted to downloading intentionally, along with several "real" images that he said had been mixed in unintentionally.

After searching O’Connor’s home, authorities discovered a computer and several hidden hard drives; a more detailed analysis of the computer revealed a 41-second video of child rape.

When questioned by authorities, O'Connor admitted that he regularly reported child sexual abuse material to Internet service providers, "but continued to feel sexually gratified with the images and the videos."

It is not known why O'Connor decided to report the pilot to authorities, who speculate that he may have had a guilty conscience, or that he truly believed his images did not violate the law because they were generated by AI.

AI image generators are typically trained on real images, which means that images of children "generated" by AI are fundamentally based on real images.

So there is no way to separate the two, and AI-based child pornography is not a victimless crime. In May, the FBI carried out its first arrest for possession of AI-generated child pornography; the detainee had thousands of explicit images generated by Stable Diffusion on his computer.

Most AI platforms have safeguards to prevent misuse of their features, much like printers in the United States, which refuse to photocopy money.

But obviously, anyone who really wants to generate explicit images, or to print bills with Benjamin Franklin's face on them, can find a way to do it.
