TikTok told its systems are failing to spot anti-vaccine misinformation
17 December 2020, 17:34
Facebook, YouTube and TikTok appeared before MPs to answer questions on misinformation and fake news.
Social media platforms have been told their systems of moderation are failing, after MPs highlighted high-profile accounts spreading anti-vaccine misinformation without reprimand.
Video app TikTok was accused of allowing anti-vaccine “fanatics” to “spread lies”.
During a hearing of the Digital, Culture, Media and Sport (DCMS) sub-committee on online harms and misinformation, TikTok and fellow tech giants Facebook and YouTube were criticised for failing to remove large accounts with thousands of followers that were sharing anti-vaccine content.
John Nicolson MP told TikTok that its system of content moderation “wasn’t working” and that misinformation was “rampant” on the video-sharing app.
Speaking to the app’s European director of government relations and public policy, Theo Bertram, Mr Nicolson said: “We’re in the middle of a worldwide pandemic – 1.7 million people have died, that’s 66,000 in the UK alone. At last we’ve got a vaccine and yet you allow anti-vaccination fanatics to spread lies on your platform, why?”
Mr Nicolson highlighted an account which had more than 38,000 followers where the owner had posted videos containing misinformation about the content of vaccines, adding that it took him only “minutes” to find “lots and lots” of similar videos despite TikTok’s suggestion that it does not allow such content.
The MP argued that if TikTok had failed to remove an account with such a large following, “what are the chances that you’re going to get rid of the smaller fry?”
Mr Bertram said TikTok has a team of 10,000 content moderators working around the world and that the company has a “clear policy against vaccine misinformation” and would remove any videos and accounts it found to be in breach of those rules.
Later in the committee hearing, chairman Julian Knight noted that since being raised before the committee, the account in question had been removed from TikTok.
He said to Mr Bertram: “It’s a pity it takes a parliamentary select committee to do this, isn’t it?”
The committee had called the technology firms to give evidence in order to hear what was being done to combat anti-vaccine misinformation as Covid-19 jabs began to roll out around the world.
Concerns have been raised that misinformation could be used to undermine confidence in vaccines and put public health at risk.
Earlier this week, TikTok announced new labels for coronavirus vaccine misinformation, saying it would mark videos on the subject of vaccines with links to official and authoritative sources of information.
Facebook and YouTube were also criticised on the issue, with the committee flagging one YouTube channel with more than seven million subscribers which had posted vaccine misinformation that remained live on the platform.
YouTube’s head of public policy for the UK and Ireland, Iain Bundred, said he was not aware of the account but acknowledged the content described should not be allowed on the platform and that it would be investigated.
The hearing took place on the same day that Twitter announced it would begin prioritising the flagging and removal of posts which make false claims about vaccines in general, as well as those linked to the Covid-19 response.