Children in the UK are using artificial intelligence image generators to create indecent images of other children, an internet safety group has warned, describing the trend as both worrying and illegal.
The UK Safer Internet Centre (UKSIC) said it had received reports from teachers that pupils are using artificial intelligence to “create images that are legally child sexual abuse material.”
The centre said “urgent action” was needed to prevent the misuse of the technology in schools and to help children understand the dangers of creating such images.
The creation, possession and distribution of child sexual abuse images are illegal in the UK. This applies to both photographic and AI-generated content, including cartoons and other less realistic depictions.
According to UKSIC director David Wright, children may be exposed to AI image generators without fully realizing the harm such images can cause or the risks of sharing them online.
“Young people don’t always realize the seriousness of what they are doing, but such harmful actions are to be expected when new technologies such as AI generators become more accessible to the public,” Wright said.
He said schools need to nip the problem in the bud by ensuring their filtering and monitoring systems are effective and by seeking support when incidents and safeguarding concerns arise.
“While the number of incidents is currently low, we are at the very beginning of the journey and need to see action taken now before schools become overwhelmed and the problem grows,” Wright said.
Teachers and parents are being urged to talk to children about the risks of such behavior, warning that the images could be used for further abuse or blackmail if they leak onto the open web.
There is a “real risk” that sex offenders could use fake images to shame and silence their victims, warned Victoria Green, chief executive of the Marie Collins Foundation, a charity that supports children affected by technology-facilitated abuse.
“The images may not have been created by children to cause harm, but once shared, these materials can fall into the wrong hands and end up on specialized abuse sites,” Green said.
In October, the Internet Watch Foundation, a British organization, warned that AI-generated images of child sexual abuse had become so realistic that many were indistinguishable from real imagery, even to trained analysts.
The foundation said it had discovered “thousands” of such images online and warned that more must be done to prevent their widespread production.