TikTok to ban teenagers from using beauty filters over mental health concerns
27 November 2024, 10:49
Teenagers will be banned from using beauty filters on popular social media app TikTok amid concerns over their mental health and self-esteem.
In the coming weeks, under-18s on the video-sharing platform will lose access to so-called “beauty filters” which smooth over skin, plump up users’ lips and increase the size of their eyes.
Filters such as “bold glamour” can completely change a child’s features, making them unrecognisable in videos shared online.
Speaking at the company’s safety forum at its European headquarters in Dublin, Chloe Setter, TikTok’s lead on child safety public policy, said: “We’re hoping that this will give us the ability to detect and remove more and more quickly.”
Child safety groups have accused TikTok of contributing to the mental health crisis facing children, with filters adding unnecessary pressure on young people to keep up with unrealistic beauty standards.
This move comes as part of a wider push by TikTok to increase protections for young people on its platform.
Christine Grahn, TikTok's head of European public policy, said that users needing to feel safe in order to use the platform properly was a key consideration for the company.
"If people don't feel safe, they are not going to bring their authentic selves to the platform and express themselves, and that means that we don't have the platform that we're hoping to create," Ms Grahn told the PA news agency.
"In order for us to achieve the best result, which is, at the end of the day, safety for our users, we have everything to gain from working with partners of various sorts.
"We adapt our products based on the research that comes out of that work, and we also work with academic partners to integrate their experience.
"The end result is going to be so much better if we work together as a society to address societal issues rather than trying to do so in silos."
The changes come as the UK introduces the long-awaited Online Safety Act, designed to protect young people online.
Andy Burrows, the chief executive of suicide prevention charity the Molly Rose Foundation, told the Guardian: “It will not escape anyone’s attention that these shifts are being announced largely to comply with EU and UK regulation. This makes the case for more ambitious regulation, not less.”
He added: “TikTok should act quickly to fix the systemic weaknesses in its design that allow a torrent of harmful content to be algorithmically recommended to young people aged 13 or over.”