
Artificial intelligence is opening the door to a disturbing trend of people creating realistic images of children in sexual situations that could increase the number of real-life sex crimes against children, experts have warned.
The release of the chatbot ChatGPT in late 2022 marked a watershed in the use of artificial intelligence, as platforms that can imitate human conversation or create realistic images rapidly gained popularity into 2023. While the technology piqued the curiosity of people around the world for work or school tasks, others began exploiting these platforms for more nefarious purposes.
The National Crime Agency, the UK’s lead agency for fighting organized crime, warned this week that a surge in machine-generated sexually explicit images of children was having a “radicalizing” effect, “normalizing” pedophilia and disturbing behavior against children.
NCA director general Graeme Biggar said in a recent report: “We assess that viewing these images, whether real or artificially generated, materially increases the risk of child sexual abuse.”
National Crime Agency (NCA) Director General Graeme Biggar at a meeting of the Northern Ireland Magistrates’ Commission at James House in Belfast, Thursday, June 1, 2023. (Liam McBurney/PA Images via Getty Images)
The agency estimates that as many as 830,000 adults (1.6% of the UK adult population) pose some type of sexual risk to children. Biggar said the figure was ten times the size of the UK prison population.
Most child sexual abuse cases involve viewing explicit imagery, and with the help of artificial intelligence, creating and viewing sexual imagery could “normalize” child abuse in the real world, Biggar said.
“[The estimated figures reflect] in part a better awareness of a historically underestimated threat, and in part a real increase driven by the radicalizing effect of the internet, where videos and images of children being abused and raped are ubiquitous, and where groups sharing and discussing the images have normalized that behavior,” Biggar said.

An artificial intelligence illustration is seen on a laptop with books in the background in this illustration photo from July 18, 2023. (Jaap Arriens/NurPhoto via Getty Images)
A similar explosion in the use of artificial intelligence to produce sexual imagery of children is underway in the United States.
“Imagery of children, including content of known victims, is being repurposed for this truly nefarious output,” Rebecca Portnoff, director of data science at Thorn, a nonprofit dedicated to protecting children, told The Washington Post last month.
“Victim identification is already a needle-in-a-haystack problem, and law enforcement is struggling to find children who have been harmed,” she said. “The ease of use and realism of these tools is a major shift. It just makes everything more challenging.”
Popular AI sites that can create images from simple prompts often have community guidelines that prohibit the creation of disturbing photos.

Teenage girl in a dark room. (Getty Images)
Such platforms are trained on millions of images from the internet, which serve as building blocks for artificial intelligence that can create convincing depictions of people or places that don’t actually exist.
For example, Midjourney requires content to be PG-13 and prohibits “nudity, sexual organs, focus on bare breasts, people in the shower or toilet, sexual imagery, fetishes.” OpenAI’s image creation platform, DALL-E, allows only G-rated content and prohibits images showing “nudity, sexual acts, sexual services, or other content intended to arouse sexual excitement.” However, according to various reports on artificial intelligence and sex crimes, bad actors on dark web forums discuss workarounds for producing disturbing images.

Police car with 911 sign. (Getty Images)
AI-generated images of children also leave police and law enforcement struggling to distinguish fake images from images of real victims in need of assistance, Biggar noted.
“The use of artificial intelligence for child sexual abuse will make it harder for us to identify children who really need protection and further normalize abuse,” the NCA director general said.
AI-generated images can also be used in sextortion scams, a crime the FBI warned about last month.
Deepfakes typically involve using deep-learning artificial intelligence to edit videos or photos of people to make them look like other people, and are used to harass or extort money from victims, including children.
“Malicious actors use content manipulation techniques and services to take photos and videos—often captured from an individual’s social media accounts, the open internet, or requested from the victim—and turn them into lifelike, sexually themed images of the victim, which they then circulate on social media, public forums, or pornographic websites,” the FBI said in June.
“Many victims, including minors, are unaware that their images are being copied, manipulated and distributed until someone else brings them to their attention.”