Exclusive
Antisemitic online hate soars in months since October 7 attacks, new analysis reveals
20 December 2024, 08:40
Antisemitic hate online has soared in the months since the October 7 attacks, shock new analysis of social media reveals.
A new report seen by LBC shows a disturbing rise in hateful content aimed at Jews in Britain – with tens of thousands of posts still being detected every day.
The study from the Antisemitism Policy Trust looked at over 409 million Reddit posts and comments, 4.95 million X posts, and hundreds of thousands of posts from smaller platforms including TruthSocial, Telegram, Minds, and Odysee.
It examined social media posts from the months immediately following the October 7 terror attacks in Israel through to June 2024.
It found there are now as many as four antisemitic messages per Jewish person in the UK per day (500,000) on outlets like X, double the figure in the Trust’s previous report.
On X alone, approximately 129,200 antisemitic posts were identified out of the 4.95 million sampled.
Bots and artificial intelligence are helping to create fake trends and graphic antisemitic content, bypassing traditional safety algorithms, it found.
The report’s authors also found evidence that a new class of alternative online influencer is on the rise – with social media-style celebrities pumping out hate speech and being promoted by some platforms.
Much of this content was easily discovered by existing safety tools.
Alternative and smaller media platforms like Rumble, Minds and Odysee are sources of antisemitism that spills over into larger mainstream platforms, the report finds.
Many of these websites saw fewer, but more extreme, antisemitic posts.
And their content is less well moderated than that of some of the larger firms, the report claims.
It is these smaller platforms that the Science and Technology Secretary, Peter Kyle, recently decided to exempt from the strictest sections of the Online Safety Act, which is intended to crack down on hateful content online.
The new law, which ministers hope will be among the toughest in the world, will kick in within three months, with the regulator monitoring platforms’ progress.
From March, tech platforms will have to proactively take down illegal content, including terrorist material, intimate image abuse and child abuse material, and make it easier for the police to report fraud and scams to websites.
But critics say they fear it still won’t go far enough to stamp out racism online.
Regulator Ofcom said earlier this week that social media firms have “a job of work” to do to comply with the new laws – and have yet to introduce all the required safety measures to protect children and adults from harmful content.
If media firms break the new Ofcom guidance, they could face huge fines or even have access to their sites blocked.
Danny Stone, Chief Executive of the Antisemitism Policy Trust, told LBC that “anti-Jewish racism online is out of control”.
He said: “The scale, reach and impact of the hate seeping through sites both small and large should not be underestimated.
“Online hate actors are using the same techniques as big brands to grab attention and lure people down rabbit holes towards further harmful content.
“It isn’t just the large sites that people are familiar with that are doing damage, but small, high-harm, so-called alt media platforms on which rules are few and hate is extreme in nature.
“Sadly, the Government only this week opted not to regulate such sites to the fullest extent possible, and our report underlines how bad a mistake that was.
“From large to small and back again, what we’ve termed antisemitic superhighways are connecting people through extremism.
“Whether it be the war between Russia and Ukraine or Israel and Hamas, individuals are being exposed to hate at unprecedented levels, which is being normalised and having a radicalisation effect.”
He says that social media companies aren’t doing all they can to find, remove and stop amplifying this hate.
And he wants ministers to do more to regulate them – or risk helping to pollute society further.
He added: “Our report is one of a number that make clear that regulation is too light-touch, and that left to their own devices social media companies won’t clean up the internet.
“They are helping to pollute society and are undermining our democratic ways of life. If this were happening in any other industry, we would make them pay.
“That is what Government should be doing and we will be pressing Ministers to take further action."
A government spokesperson said: "Ofcom’s illegal content codes published this week, alongside the children’s codes set to be finalised next spring, mean platforms – regardless of size or reach – will have to protect users from illegal content and material.
"Smaller platforms which spread hate and harmful material will also be tightly regulated. We expect Ofcom to use the full range of its powers – including fining and seeking court approval to blocking access to sites – if the people behind these platforms fail to comply."
X, Telegram, Reddit and TruthSocial were approached for comment.