Meta investigated by Oversight Board over Facebook posts about far-right summer riots

3 December 2024, 13:13

Far-right riots in Manchester, August 2024. Picture: Alamy

By Josef Al Shemary

The board, which reviews content moderation decisions on Meta's social platforms, is to look at three Facebook posts shared during the summer riots in the UK.

Meta's decision not to delete three Facebook posts about the Southport stabbings, the attack that triggered UK-wide riots, is being investigated by its Oversight Board.

Violence erupted across the country after a knife attack in Southport which killed three girls and injured eight others.

The riots were fuelled by misinformation that spread rapidly on social media about the attacker's identity, including false claims that he was a Muslim asylum seeker who had arrived in the UK on a small boat.

The suspect was later identified as Axel Rudakubana, who was born in Cardiff to a Christian family.

The three posts under investigation include calls for mosques to be attacked and for buildings housing migrants to be set on fire, AI-generated images of Muslims being chased, and details of when and where to meet for protests.

All three posts were reported by users but remained online after being assessed by Meta's automated tools.

Even after users appealed the automated decisions, the posts stayed online; none of the posts were reviewed by a human.

The same users who had reported the posts appealed to the Oversight Board over the decision.

Meta's Oversight Board is an independent body that reviews the company's content moderation and makes binding decisions on whether content stays up on Facebook and Instagram.

The board has now confirmed it will investigate whether the three posts violated Meta's hate speech or violence and incitement policies.

The first post incited people to attack mosques and set buildings "where scum are living" on fire, referring to migrants as terrorists.

The second post showed what appeared to be an AI-generated image of a giant man in a Union flag T-shirt chasing several Muslim men.

Text overlaid on the picture provided details of when and where to meet for one of the protests.

The third post, another apparently AI-generated image, showed four Muslim men running in front of the Houses of Parliament, chasing a crying blond-haired toddler in a Union flag T-shirt.

The image was captioned "wake up".

Since the investigation began, Facebook has deleted the first post, but the other two remain online.

Meta will be investigated by the Oversight Board. Picture: Getty

The social media giant told the board it still believes its decisions to leave the second and third posts on Facebook were correct.

The board said it had selected these cases to examine Meta's policy preparedness and crisis response to violent riots targeting migrant and Muslim communities.

It also said it would now accept public comments on the cases, including on the role social media played in the UK riots and in the spread of misinformation.

There have since been calls to tighten online safety laws to better respond to misinformation and disinformation, given the real-world impact they can have.