SAFEGUARDING Minister Jess Phillips says it’s time to “scare the s**t” out of Brits on child sexual abuse.
She promised to get tough on tech companies if they fail to shut down sick content — both real and AI-generated fakes.
Jess Phillips says it’s time to ‘scare the s**t’ out of Brits on child sexual abuse (Picture: Darren Fletcher)
Heathrow Border Force staff run a pilot scheme to detect child sexual abuse material digitally (Picture: Darren Fletcher)
The Sun on Sunday joined her on a visit to Heathrow where Border Force is trying to catch paedophiles travelling into the country.
“It keeps me awake at night,” she said.
“That’s how worried I am about the growing availability and access that people have to child sexual abuse material, both real and synthetic — so AI deepfakes.”
In a fresh barb at tech boss and X owner Elon Musk, she said concerns about privacy online were “for the birds” — as everyone has already handed their data over online anyway.
The US billionaire has previously rowed with Mrs Phillips over grooming gangs.
And she has told how she has been a victim of “deepfake” pornographic images herself.
Now, under a pilot scheme at Heathrow called Operation Excalibur, officials are using intelligence to target travellers suspected of having child abuse images.
This includes profiling people and working with international police forces.
One way to check for the images is by searching electronic devices for photos already known to law enforcement that have been “hashed” — meaning each one carries a unique digital fingerprint that can be matched automatically.
Illegal content
Border Force does this by connecting a cable to the suspect’s phone or laptop and scanning it at Customs.
New laws will soon give officers powers to make suspects unlock their devices to be scanned.
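In outline, that kind of known-image check works by hashing each file on a device and looking the result up in a database of hashes supplied by law enforcement. The sketch below is purely illustrative and not Border Force’s actual system: real tools use perceptual hashes (such as Microsoft’s PhotoDNA) that still match after resizing or re-encoding, whereas the cryptographic SHA-256 hash used here only catches exact copies. The `KNOWN_HASHES` set and function names are hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical database of hashes of images already known to law enforcement.
# (A real system would use perceptual hashes, not SHA-256.)
KNOWN_HASHES = {
    # SHA-256 of the bytes b"test", used here as a stand-in entry
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def scan_device(paths: list[Path]) -> list[Path]:
    """Return only those files whose hash appears in the known-image database."""
    return [p for p in paths if sha256_of(p) in KNOWN_HASHES]
```

The matching itself never requires officers to view the images: a hit in the database is enough to flag a file for investigation.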
Trials have taken place at Heathrow, Gatwick, Manchester and Birmingham airports since July 2024 — and 65 per cent of scans revealed evidence of child sex abuse images.
It comes after charity the Internet Watch Foundation found online child sexual abuse, or links to it, a record-breaking 291,273 times last year — an 830 per cent increase since the charity began this work in 2014.
Mrs Phillips added: “I don’t think we’ve scared the s**t out of people about this enough in the country.
“We need to do quite a lot of work to take the public on a journey with us. If people realised some of the stuff their kids could access they would be horrified.”
Reports of AI-generated child sexual abuse imagery have risen by 380 per cent in just one year.
But now Britain is set to become the first country in the world with specific AI sex abuse offences.
And the new Online Safety Act compels firms to show they are committed to removing illegal content from their platforms.
Companies failing to act face fines of up to £18million or ten per cent of their qualifying worldwide revenue.
But Mrs Phillips said Labour will go further if tech companies fail to rid their apps of child sexual abuse.
Charities are calling for more to be done to stop material being shared in private communications such as messaging app Telegram, which can be harder for police to access.
Mrs Phillips said: “When I talk to law enforcement, they will tell me very honestly about some of the limitations of what they can and can’t look at, and how they work with tech companies to get access to that.
“If the Online Safety Act isn’t enough and we still find that we have fears that people are sharing child abuse — this is not some kink, this is abusing children here and around the world — then we will absolutely go further.”
Some 38,000 child sexual abuse image crimes were logged by police in England and Wales in the past year (Picture: Getty)
She said it was also the tech firms’ responsibility to build safety into their designs, adding: “There needs to be a much greater push for the market to provide safety.
“At the moment that’s absolutely not the case. That’s because I don’t think there’s a great enough demand from the public for that.
“It has to be safety by design. Things get created faster than legislation can ever be passed. You need culture and policy.”
She warned that “nudification” apps — which use AI to create fake naked images of a victim — keep being created.
Mrs Phillips added: “The amount of effort companies put into algorithms — if they put the same amount of effort into keeping us safe, we’d be safe.
“What I want is for parents and the public to be much more demanding about safety, like you would be about a car.
“You wouldn’t put your kids in a car without a seatbelt. I want parent power.”
She said the public cared far more about keeping kids safe than about concerns over privacy.
“The idea that we have any privacy left is literally for the birds,” the minister added.
“Elon Musk literally knows when I have my period. I cannot believe that the argument about privacy persists when every single person in this room has definitely given it away to somebody who was not democratically elected.”
Mrs Phillips also told of how she herself fell victim to fake pornographic images online, and added: “There has been for too long a laissez-faire attitude to the idea of images as harm. And when you have had fake images made of you, which I have had, the idea that you don’t know who has seen it, you don’t know if you’re in the company of somebody who is using it, is terrifying for people to live with.”
Mrs Phillips was first alerted to the images by a journalist, but members of the public have also contacted her about deepfake images on porn sites.
SHOCKING STATISTICS
38,000 child sexual abuse image crimes logged by police in England and Wales in the past year
380% rise in reports of AI-generated child sexual abuse material between 2023 and 2024
9,215 defendants in court for child sexual abuse offences in year to December 2023
80% of convictions for viewing indecent images did not result in a jail sentence in 2023
She added: “I’ve not seen any of these deepfakes, I am not looking in the places where they might be.
“I had no idea they existed. But the thing about it is you don’t know who has seen it. You walk into a room and you think of people seeing these images of me.
“People think it’s a victimless crime, but I’ve got teenage sons, the idea there is images of me on porn sites their friends might see — that is horrendous.
“It is everyone’s responsibility to tackle these heinous crimes and safeguard children from all forms of sexual abuse.”
She also described her horror at how young many of the perpetrators of child sexual abuse are.
Mrs Phillips said: “The statistic that keeps me awake at night, and this is largely because I am the mother of teenage boys, is that in 53 per cent of the reported child sexual abuse incidents last year, the average age of a perpetrator was 14.
“That is a terrifying finding, that our abusers are trending young.”
BEAST CAGED
A BRITISH music teacher who plotted to abuse Filipina girls as young as four was caught at the UK border as he returned from the Far East.
James Alexander was arrested at Manchester Airport on June 30, 2018 after arriving from Thailand, where he had lived from 2017.
National Crime Agency officials seized his electronic devices and found he had sent money transfers to the facilitators of abuse, and used Skype and WhatsApp to try to arrange a trip to the Philippines to abuse girls.
Leeds Crown Court heard he asked for images of girls aged nine and six “posing in a certain way”, asked what the youngest “would do with him” and said he wanted sexual relations with a four-year-old.
Alexander, of Beeston, Leeds, admitted arranging the commission of a child sex offence, three counts of attempting to cause a girl under 13 to engage in sexual activity, and one of making indecent images of a child.
He was jailed for five years in 2019, given a five-year sexual harm prevention order banning foreign travel and was made to sign the sex offenders register for life.