Ofcom: One in three internet users cannot spot false accounts or content
31 March 2022, 08:52
A study by the regulator found that about one in 20 people believe everything they see online.
More than a third of internet users are unaware that online content might be false or biased, an Ofcom study has found.
Some 30% of UK adults who use the internet – 14.5 million – are unsure about or do not even consider the truthfulness of the information they see online, the regulator’s annual survey found.
A further 6%, or about one in every 20 internet users, believe everything they see online.
Both adults and children overestimate their ability to spot misinformation, Ofcom found.
Participants were shown social media posts and profiles to determine whether they could verify their authenticity. Although seven in 10 adults (69%) said they were confident in identifying misinformation, only two in 10 (22%) were able to correctly identify the tell-tale signs of a genuine post without making mistakes.
Ofcom saw a similar pattern among older children aged 12-17, with 74% confident but only 11% able to identify genuine content.
Similarly, about a quarter of adults (24%) and children (27%) who claimed to be confident in spotting misinformation were unable to identify a fake social media profile in practice.
The study also found that 33% of parents of five to seven-year-olds and 60% of parents of eight to 11-year-olds reported their children having a social media profile, despite most sites setting a minimum age of 13.
TikTok, in particular, is growing in popularity, even among the youngest age groups, with 16% of three to four-year-olds and 29% of five to seven-year-olds using the platform.
And Ofcom warned that many children could be “tactically” using other accounts or “finstas” – fake Instagrams – to conceal aspects of their online lives from parents.
The study found 64% of eight to 11-year-olds had multiple accounts or profiles, with 46% of these having an account just for their family to see.
More than a third of children (35%) reported engaging in potentially risky behaviours that could hinder a parent or guardian from keeping proper checks on their online use.
A fifth (21%) surfed in “incognito mode” and 19% deleted their browsing history, while 6% circumvented parental controls put in place to stop them visiting certain apps and sites.
Meanwhile, children are seeing less video content from friends online and more from brands, celebrities and influencers, Ofcom said.
Feeds full of “slick professionalised content” seemed to be encouraging a trend towards scrolling instead of sharing, with both adults (88%) and children (91%) three times as likely to watch videos online as to post their own.
Ofcom warned that the “sheer volume” of information meant having the critical skills and understanding to decipher fact from fiction had “never been more important”.
Every minute, 500 hours of content are uploaded to YouTube, 5,000 videos are viewed on TikTok and 695,000 stories are shared on Instagram.
Four in five adult internet users (81%) want to see tech firms take responsibility for monitoring content on their sites and apps. Two thirds (65%) also want protection against inappropriate or offensive content.
Ofcom chief executive, Melanie Dawes, said: “In a volatile and unpredictable world, it’s essential that everyone has the tools and confidence to separate fact and fiction online – whether it’s about money, health, world events or other people.
“But many adults and children are struggling to spot what might be fake. So we’re calling on tech firms to prioritise rooting out harmful misinformation, before we take on our new role helping to tackle the problem. And we’re offering tips on what to consider when you’re browsing or scrolling.”