Tech giants given 12 months to meet new child privacy protection rules
2 September 2020, 10:34
Code forces firms to make young people a priority from the design up or face enforcement action, including fines.
Tech firms have 12 months to ensure their platforms adhere to new child privacy protection measures.
The Age Appropriate Design Code sets out 15 standards that companies must build into any online services used by children, making data protection of young people a priority from the design up.
These services range from apps and connected toys to social media sites and online games, and even educational websites and streaming services.
Organisations that fail to follow the code after the transition period ends on September 2, 2021 could face enforcement action from the data regulator, the Information Commissioner's Office (ICO), including compulsory audits, orders to stop processing and fines of up to 4% of global turnover.
Under the code, privacy settings must be set to high by default, and nudge techniques should not be used to encourage children to weaken their settings.
Location settings that allow the world to see where a child is should also be switched off by default.
Data collection and sharing should be minimised, and profiling that can allow children to be served up targeted content should be switched off by default too.
“A generation from now we will all be astonished that there was ever a time when there wasn’t specific regulation to protect kids online,” said Elizabeth Denham, Information Commissioner.
“This code makes clear that kids are not like adults online, and their data needs greater protections.
“We want children to be online, learning and playing and experiencing the world, but with the right protections in place.
“We do understand that companies, particularly small businesses, will need support to comply with the code, and that’s why we have taken the decision to give businesses a year to prepare, and why we’re offering help and support.”
Ian Russell, father of 14-year-old schoolgirl Molly Russell who took her own life in 2017 after viewing harmful images on Instagram, told the PA news agency the code is “an important step in reclaiming the web as a safe place for young users”.
“The tech companies now have 12 months to make their services safe for use by anyone under the age of 18,” he said.
“In one year’s time, tech providers will have to adhere to the AADC (Age Appropriate Design Code) as young people’s data will no longer be allowed to be used to cause them harm.
“Our currently risky platforms will have to become safer by design.
“After the transition period, vulnerable young people will be less likely to be lured to the web’s darkest content, which in some cases, as we all too painfully know, can cost lives.
“I look forward to the day the UK introduces world-beating internet regulation; it’s long overdue.”
Andy Burrows, head of child safety online policy at the NSPCC, said the move will force tech firms to “take online harms seriously” so there can be “no more excuses for putting children at risk”.
“For the first time, high-risk social networks will have a legal duty to assess their sites for sexual abuse risks and no longer serve up harmful self-harm and suicide content to children,” he said.
“The Government must also press ahead with its Online Harms Bill to go hand in hand with the Code.
“This must be enforced by an independent regulator with the teeth it needs to hold platforms to account for safety failings.”