PAEDOPHILES are using artificial intelligence to remove clothes from pictures of children and “de-age” celebrities.
Thousands of images have been shared on the dark web as campaigners warn the “worst nightmares” of AI are coming true.
The Internet Watch Foundation (IWF), responsible for removing online child sex abuse images, says it found nearly 3,000 pictures breaching UK law in a single month.
Some of the images included children as young as two, while 564 were deemed category A — the most serious kind, depicting rape, sexual torture and bestiality.
The charity said some of the images were also “so realistic the law would need to treat them as if they had been real abuse images”.
Criminals are using AI to create imagery of celebrities who have been “de-aged”.
The tech can also be used to “nudify” pictures of fully-clothed children.
IWF chief executive Susie Hargreaves OBE warned the increasing prominence of the material “threatens to overwhelm the internet”.
She said: “Our worst nightmares have come true. Earlier this year, we warned AI imagery could soon become indistinguishable from real pictures of children suffering sexual abuse, and that we could start to see this imagery proliferating in much greater numbers. We have now passed that point.
“Chillingly, we are seeing criminals deliberately training their AI on real victims’ images who have already suffered abuse.
“Children who have been raped in the past are now being incorporated into new scenarios because someone, somewhere, wants to see it.
“This is not a hypothetical situation. We’re seeing this happening now.
“We’re seeing the numbers rise, and we have seen the sophistication and realism of this imagery reach new levels.”
The IWF says it has also found online manuals dedicated to helping criminals use AI image generators to produce the most realistic imagery.
The charity carried out a study of a single dark web forum across a single month.
Some of the images were so convincing that even trained analysts would find it difficult to distinguish between AI-generated and real photographs.
In total, more than 11,000 AI images were shared on the dark web child abuse forum.
Ian Critchley, National Police Chiefs’ Council Lead for Child Protection, said: “In the last five years the volume of online child sexual abuse offending has rapidly increased, with new methods and ways of offending being discovered on a regular basis.
“As police lead I have been working with the IWF – a world leader in this area – together with partners, and law enforcement colleagues, to understand the impact of what we have been calling ‘the emerging threat’ of Artificial Intelligence.
“It is clear that this is no longer an emerging threat – it is here, and now.
“We are seeing an impact on our dedicated victim identification officers, who seek to identify each and every real child that we find in this abhorrent material.
“We are seeing children groomed, we are seeing perpetrators make their own imagery to their own specifications, we are seeing the production of AI imagery for commercial gain – all of which normalises the rape and abuse of real children.”
It comes as Prime Minister Rishi Sunak is due to host a global AI safety summit next month.
The summit is due to be the start of a “global conversation” on artificial intelligence and will hear from “diverse viewpoints”.