Social media ‘failure’ on Trump must cause rethink on disinformation, MP says

8 January 2021, 16:24

Social media stock. Picture: PA

Damian Collins said social media sites must change their approach to the spread of disinformation in the wake of the violence at the US Capitol.

Social media giants must rethink how they define harmful disinformation, with the violence at the US Capitol proving their efforts so far have been a “failure” and the move to ban Donald Trump was “too little, too late”.

That is according to MP Damian Collins, the former chair of the Digital, Culture, Media and Sport (DCMS) select committee which led an inquiry into social media and disinformation.

Mr Collins said platforms such as Facebook and Twitter had allowed the US president to “whip up” supporters and frequently share false claims with them because they had “for too long believed this is simply a matter of free speech and personal choice”.

Trump supporters try to break through a police barrier (John Minchillo/AP)

Supporters of Mr Trump stormed the US Capitol building on Wednesday in protest at his election loss to Joe Biden, leading to violent clashes and the deaths of five people.

The president was subsequently locked out of his social media accounts temporarily after being accused of inciting the violence, although Facebook has since extended the ban until the end of his presidency on January 20.

Conservative MP Mr Collins, who as chair of the DCMS Committee produced a report on social networks and the rise of disinformation in 2018, said at the time his committee had labelled disinformation “a threat to democracy”.

“Over the last few days we’ve seen that play out, we’ve seen what it could do,” he told the PA news agency.

“We do not yet, I think, know the damage that’s been done with the poison that’s been pumped into the system, the impact that will have on politics in America in the coming years. It’s something that we still don’t fully understand.”

Mr Collins said Mr Trump had been allowed to “incite” interference with the US election process since losing to Joe Biden because much of the content shared by the president and his supporters was not classed by the platforms as an imminent physical threat to others, and was therefore not a breach of their rules.

“It’s not just about content moderation, it’s about the amplification of that content and that really is the most important thing,” Mr Collins said.

“It’s the fact that people that can use the social media platforms to reach large numbers of people, and the platforms themselves recommend content to people based on what they’re interested in – the platforms are designed to succeed in an attention economy and they’re largely designed to be blind as to what it is people are interested in.

“Now that may be fine if you’re selling music or a pair of shoes, but if you’re selling a political ideology and the platforms are directing people to that content because they think they’re interested in it – that’s where the platforms have a clear responsibility to try and make sure their systems aren’t used to amplify harmful messages.”

He said sites should now recognise that their systems are being used to spread disinformation, and that represented “both an immediate harm and a long term attack on democracy, on mainstream media” as well as “driving divisions in society”.

To force this change, Mr Collins said the Government’s planned Online Harms regulation, designed to curb social media and tech giants, should take a closer look at its definition of harmful misinformation, making platforms enforce rules which focus on content beyond just that which causes imminent physical danger to an individual.

“I think we need to look wider than that and say OK, are there circumstances where what we’re talking about is not necessarily an immediate physical danger to an individual, but actually something which is potentially more widely damaging to society as a whole, and I would say that someone using social media to incite an insurrection is an example of how that is dangerous,” he said.

However, the Tory MP also warned that the incident highlighted the power social media wielded, and regulation must ensure that decisions such as de-platforming a president or prime minister “aren’t left to people like Mark Zuckerberg”.

“I think there needs to be some sort of legal or regulatory framework around the way those decisions are made,” he said.

“Like other media, there should be some sort of guidelines around which these companies have to operate and they should be held accountable or liable for the decisions that they make”.

By Press Association
