Social Media Firms To Be Legally Required To Protect Users
8 April 2019, 10:38
Internet sites and social media firms could face being fined or blocked in the UK under new regulations aimed at protecting internet users.
The Government has announced it will bring in "world first" internet safety laws, designed to make the UK the safest place in the world to be online.
The Prime Minister says that online companies should start taking responsibility for their platforms, and that this will help to restore public trust in online technology.
The move is part of a joint proposal between the Department for Digital, Culture, Media and Sport and the Home Office, which could see a new independent regulator introduced to monitor online firms and ensure they meet their responsibilities.
The proposal will also include a mandatory ‘duty of care’, which will require companies to take "reasonable steps" to keep their users safe and tackle illegal and harmful activity on their services.
Jeremy Wright, the Digital Secretary, said that "the era of self-regulation for online companies is over."
We want the UK to be the safest place in the world to be online, and the best place to start and grow a digital business. That’s why our proposals announced today will help make sure everyone in our country can enjoy the Internet safely #OnlineSafety https://t.co/wssYpNTDE6
— Jeremy Wright MP (@DCMS_SecOfState) April 8, 2019
"Tech can be an incredible force for good and we want the sector to be part of the solution in protecting their users. However those that fail to do this will face tough action.” Mr Wright said.
In the first online safety laws of their kind, social media companies and tech firms will be legally required to protect their users and face tough penalties if they do not comply.
Children's charities have welcomed the move. Javed Khan, the Chief Executive of Barnardo’s, said that children in the UK are facing growing risks online, from cyber-bullying to sexual grooming to gaming addiction, and that the proposed move was a "very important step in the right direction."
Measures in the White Paper include:
A new statutory ‘duty of care’ to make companies take more responsibility for the safety of their users and tackle harm caused by content or activity on their services.
Further stringent requirements on tech companies to ensure child abuse and terrorist content is not disseminated online.
Giving a regulator the power to force social media platforms and others to publish annual transparency reports on the amount of harmful content on their platforms and what they are doing to address this.
Making companies respond to users’ complaints, and act to address them quickly.
Codes of practice, issued by the regulator, which could include measures such as requirements to minimise the spread of misleading and harmful disinformation with dedicated fact checkers, particularly during election periods.
A new “Safety by Design” framework to help companies incorporate online safety features in new apps and platforms from the start.
A media literacy strategy to equip people with the knowledge to recognise and deal with a range of deceptive and malicious behaviours online, including catfishing, grooming and extremism.