Social media companies must ‘get a grip’ on removing child abuse images

13 September 2021, 11:04

Record levels of internet grooming spark calls for stronger Online Safety Bill.

The All-Party Parliamentary Group (APPG) on Social Media said there are ‘very real concerns’ about the impact of encryption on child protection.

Social media companies must not encrypt messages unless they can guarantee they can keep platforms free of illegal content, an inquiry has warned.

The All-Party Parliamentary Group (APPG) on Social Media is calling for companies to step up and do more to protect children from online grooming and sexual abuse.

It launched its inquiry into the “disturbing” rise of so-called “self-generated” child sexual abuse material last November.

The cross-party MPs say the Home Office must review legislation to ensure it is as easy as possible for children to have their images removed from the internet.

Self-generated content can include material filmed using webcams, very often in the child’s own room, and then shared online.

In some cases, children are groomed, deceived or extorted into producing and sharing a sexual image or video of themselves.

The report, Selfie Generation – What’s Behind The Rise Of Self-Generated Indecent Images Of Children?, says the trend “seems to have been exacerbated by the Covid-19 crisis”.

The amount of “self-generated” child sex abuse material recorded by the Internet Watch Foundation (IWF) between January and April this year was more than double that detected during the same period in 2020.

Experts believe an increase in the number of offenders exchanging child sex abuse material during lockdowns may stimulate demand beyond the pandemic.

The MPs say many witnesses “raised very real concerns” about the impact of encryption on child protection, saying it could “cripple” the ability of programmes to detect illegal imagery.

They write: “The APPG believes it is completely unacceptable for a company to encrypt a service that has many child users.

“Doing this would do so much damage to child protection.

“We recommend that technology companies do not encrypt their services until a workable solution can be found that ensures equivalency with the current arrangements for the detection of this imagery.”

Labour MP Chris Elmore, chairman of the APPG, said social media companies must be more proactive in rooting out abusive images, and make clear to young users how they can report them.

He said: “It’s high time that we take meaningful action to fix this unacceptable mess.

“Children are daily at real risk of unimaginable cruelty, abuse and, in some instances, death.

“Social media companies are fundamentally failing to discharge their duties, and simply ignoring what should be an obvious moral obligation to keep young users safe.

“They need to get a grip, with institutional re-design, including the introduction of a duty-of-care on the part of companies toward their young users.”

The term “self-generated” should “not be taken to imply that such children have any share in the moral responsibility for their abuse”, he added.

Among its 10 recommendations, the report says the term should be replaced with “first person produced imagery” to avoid inadvertent victim blaming.

Susie Hargreaves, director of the UK Safer Internet Centre, said: “We see the fallout of abuse and, when children are targeted and made to abuse themselves on camera by criminal adult predators, it has a heart-breaking effect on children and their families.

“There is hope, and there are ways for children and young people to fight back.

“The Report Remove tool we launched this year with Childline empowers young people to have illegal images and videos of themselves removed.”

She added: “New legislation will also help make a difference, and the forthcoming Online Safety Bill is a unique opportunity to make the UK a safer place to be online, particularly for children.”

A Home Office spokeswoman said: “Keeping children safe is one of our highest priorities and the strongest measures contained in the Online Safety Bill are designed to protect children.

“If social media companies do not properly assess or take action against the risks their sites pose to children, they will face heavy fines or have their sites blocked. The Bill will further make tech companies accountable to an independent regulator.

“We are clear that companies must continue to take responsibility for stopping the intolerable level of harmful material on their platforms and embed public safety in their system designs, which is why the Bill will also compel them to consider the risks associated with all elements of their services and take robust action to keep their users safe.”

By Press Association
