Protect children or pay the price, Culture Sec warns social media giants
29 November 2022, 07:45 | Updated: 29 November 2022, 08:14
Social media giants will face "severe punishments", including major fines, if they fail to stop young children using their platforms, Culture Secretary Michelle Donelan has warned.
Fresh changes to the proposed internet safety laws will tackle the "absurd situation" surrounding the enforcement of age limits on social media platforms.
The updates will require tech firms to show how they enforce user age limits, publish summaries of risk assessments regarding potential harm to children on their sites, and declare details of enforcement action taken against them by Ofcom, the new regulator for the tech sector.
Companies could also be fined by Ofcom up to 10% of annual turnover if they fail to fulfil their policies for tackling racist, homophobic or other content harmful to children on their platforms.
It comes in a bid to "rebalance" the Online Safety Bill with "common sense approaches", Ms Donelan told LBC's Nick Ferrari at Breakfast.
"We are removing the legal but harmful (duties), which would have led to unintended consequences and an erosion of free speech.
"Whereas we're rebalancing this for some common-sense approaches."
The new version of the bill better reflects its "original purpose" to "protect young people", Ms Donelan previously wrote in the Telegraph.
"Protecting children is the fundamental reason why the Online Safety Bill was created, and so the changes I have made strengthen the child protection elements of the bill significantly," she wrote.
"Some platforms claim they don't allow anyone under 13 - any parent will tell you that is nonsense.
"Some platforms claim not to allow children, but simultaneously have adverts targeting children.
"The legislation now compels companies to be much clearer about how they enforce their own age limits."
Along with the changes to boost child safety online, controversial measures that would have forced social media sites to take down material designated "legal but harmful" are to be removed.
Under the original plans, the biggest platforms would have been compelled to not only remove illegal content, but also any material which had been named in the legislation as legal but potentially harmful.
There will now be a greater requirement for firms to provide adults with tools to hide certain content they do not wish to see - including types of content that do not meet the criminal threshold but could be harmful to see, such as the glorification of eating disorders, misogyny and some other forms of abuse.
The Government is calling this approach a "triple shield" of online protection which also allows for freedom of speech.
However, shadow culture secretary Lucy Powell said it was a "major weakening" of the bill, adding: "Replacing the prevention of harm with an emphasis on free speech undermines the very purpose of this bill, and will embolden abusers, Covid deniers, hoaxers, who will feel encouraged to thrive online."
The latest changes follow other updates to the bill, including criminalising the encouragement of self-harm, as well as "downblousing" and the sharing of pornographic deepfakes.
The Government also confirmed further amendments will be tabled shortly aimed at boosting protections for women and girls online.