Artificial intelligence could help ‘normalize’ child sexual abuse as graphic images erupt online: experts

Artificial intelligence is opening the door to a disturbing trend of people creating realistic images of children in sexual settings, which could increase the number of cases of sex crimes against kids in real life, experts warn. 

AI platforms that can mimic human conversation or create realistic images exploded in popularity late last year into 2023 following the release of chatbot ChatGPT, which served as a watershed moment for the use of artificial intelligence. As the curiosity of people across the world was piqued by the technology for work or school tasks, others have embraced the platforms for more nefarious purposes.

The National Crime Agency (NCA), the U.K.’s lead agency for combating organized crime, warned this week that the proliferation of machine-generated explicit images of children is having a “radicalizing” effect, “normalizing” pedophilia and disturbing behavior against kids. 

“We assess that the viewing of these images – whether real or AI-generated – materially increases the risk of offenders moving on to sexually abusing children themselves,” NCA Director General Graeme Biggar said in a recent report.  

AI ‘DEEPFAKES’ OF INNOCENT IMAGES FUEL SPIKE IN SEXTORTION SCAMS, FBI WARNS

Graeme Biggar

National Crime Agency Director General Graeme Biggar during a Northern Ireland Policing Board meeting at James House, Belfast June 1, 2023. (Photo by Liam McBurney/PA Images via Getty Images)

The agency estimates that up to 830,000 adults, or 1.6% of the adult population in the U.K., pose some type of sexual danger to children. That figure is 10 times greater than the U.K.’s prison population, according to Biggar. 

The majority of child sexual abuse cases involve viewing explicit images, according to Biggar, and with the help of AI, creating and viewing sexual images could “normalize” abusing children in the real world. 

ARTIFICIAL INTELLIGENCE CAN DETECT ‘SEXTORTION’ BEFORE IT HAPPENS AND HELP FBI: EXPERT

“[The estimated figures] partly reflect a better understanding of a threat that has historically been underestimated, and partly a real increase caused by the radicalising effect of the internet, where the widespread availability of videos and images of children being abused and raped, and groups sharing and discussing the images, has normalised such behaviour,” Biggar said. 

AI computer

Artificial intelligence illustrations are seen on a laptop with books in the background in this illustration photo on July 18, 2023. (Photo by Jaap Arriens/NurPhoto via Getty Images)

Stateside, a similar explosion in the use of AI to create sexual images of children is unfolding. 

“Children’s images, including the content of known victims, are being repurposed for this really evil output,” Rebecca Portnoff, the director of data science at Thorn, a nonprofit that works to protect kids, told the Washington Post last month. 

CANADIAN MAN SENTENCED TO PRISON OVER AI-GENERATED CHILD PORNOGRAPHY: REPORT

“Victim identification is already a needle-in-a-haystack problem, where law enforcement is trying to find a child in harm’s way,” she said. “The ease of using these tools is a significant shift, as well as the realism. It just makes everything more of a challenge.”

Popular AI sites that can create images based on simple prompts often have community guidelines preventing the creation of disturbing photos. 

Teenaged girl

Teenaged girl in dark room. (Getty Images)

Such platforms are trained on millions of images from across the internet that serve as building blocks for AI to create convincing depictions of people or locations that do not actually exist. 

LAWYERS BRACE FOR AI’S POTENTIAL TO UPEND COURT CASES WITH PHONY EVIDENCE

Midjourney, for example, calls for PG-13 content that avoids “nudity, sexual organs, fixation on naked breasts, people in showers or on toilets, sexual imagery, fetishes.” DALL-E, OpenAI’s image creation platform, allows only G-rated content, prohibiting images that show “nudity, sexual acts, sexual services, or content otherwise meant to arouse sexual excitement.” However, dark web forums of people with ill intentions discuss workarounds to create disturbing images, according to various reports on AI and sex crimes. 

Police car

Police car with 911 sign. (Getty Images)

Biggar noted that AI-generated images of children also throw police and law enforcement into a maze of deciphering fake images from those of real victims who need assistance. 

“The use of AI for child sexual abuse will make it harder for us to identify real children who need protecting, and further normalise abuse,” the NCA director general said. 

AI-generated images can also be used in sextortion scams, with the FBI issuing a warning on the crimes last month. 

Deepfakes, which typically involve editing videos or photos of people to make them look like someone else using deep-learning AI, have been used to harass victims or extort money from them, including kids. 

FBI WARNS OF AI DEEPFAKES BEING USED TO CREATE ‘SEXTORTION’ SCHEMES

“Malicious actors use content manipulation technologies and services to exploit photos and videos—typically captured from an individual’s social media account, open internet, or requested from the victim—into sexually-themed images that appear true-to-life in likeness to a victim, then circulate them on social media, public forums, or pornographic websites,” the FBI said in June. 

CLICK HERE TO GET THE FOX NEWS APP

“Many victims, which have included minors, are unaware their images were copied, manipulated, and circulated until it was brought to their attention by someone else.”
