'Deeply disturbing' rape and incest game exposes the weaknesses in online safety law - it's time to step it up

10 April 2025, 20:01 | Updated: 10 April 2025, 20:03

The computer game "No Mercy" centres around a male protagonist who is encouraged to "become every woman's worst nightmare", and "never take no for an answer".
The computer game "No Mercy" centres around a male protagonist who is encouraged to "become every woman's worst nightmare", and "never take no for an answer". Picture: No Mercy on Steam

By Emma Soteriou

The controversial computer game "No Mercy" was launched on Steam, a major online gaming platform, last month, and contains simulations of violence, incest and "unavoidable non-consensual sex" (i.e. rape).

Where available, it is easy to download for a small fee, with minimal age checks. After an LBC investigation, the video game has now been removed from Steam in the UK.

The issue does not end there, however. The existence of "No Mercy" is deeply disturbing in itself, but it also raises the question of how games of this kind, capable of perpetuating misogynistic attitudes and real-world violence against women, are permitted to appear and remain available on online platforms, seemingly without any scrutiny.

The availability of such a game must be at odds with the intention behind the recent Online Safety Act 2023 (the Act), which was designed to protect children and adults online.

Since 17 March 2025, the Act has required online service providers within its scope to take measures to protect their users from 'illegal content'. Content (including words, images, speech or sounds) is 'illegal content' if the use, possession, viewing, accessing, publication or dissemination of the content constitutes a 'relevant offence', including extreme sexual violence, extreme pornography, and inciting violence.

Providers must not only remove 'illegal content' when it is brought to their attention; they are also under duties to ensure such content does not appear in the first place. The Act will be strengthened over the next few months, when platforms will be required to prevent children accessing harmful and age-inappropriate content, amongst other important safeguards.

There remains some uncertainty about which types of online services are caught under the Act. Online providers regulated by the Act are set out as: (1) user-to-user service providers, (2) search engine service providers, and (3) internet services which publish/display regulated provider pornographic content. Ofcom has created a tool to help online platforms understand if the Act applies to their business.

However, to ensure a wide variety of online platforms are captured, the scope of the Act is intentionally broad. It is also important to note that the Act has extra-territorial effect, meaning it applies to services outside the jurisdiction if they have links to the UK.

When issues of online safety do arise, the independent regulator is Ofcom, whose role is to "make sure that online services protect their users". Ofcom has wide-ranging powers to enforce providers' compliance with the Act. The risk of failing to comply is significant: Ofcom can impose penalties including fines of up to £18 million or 10% of the company's qualifying worldwide revenue, whichever is greater.

It is therefore surprising that, in recent comments to LBC concerning "No Mercy", Ofcom explained that it cannot regulate this kind of content and that it does not "investigate individual complaints". That response pushes responsibility for dealing with damaging content back onto the online provider, and it leaves unanswered questions about what recourse is available when issues of online safety arise.

The online world is ever evolving, and it poses significant dangers to the children and adults who use it, so the legislation regulating it should be applied with vigour. Given the sensitivity of the subject matter, the current relevance of harmful incel culture and the recent introduction of the Act, it might be expected that regulators such as Ofcom would be more proactive in progressing enforcement action, to ensure that online content of all kinds that is capable of causing harm to users cannot fall between the cracks.

___________________________

Helen Scambler is a lawyer at Mishcon de Reya LLP.

LBC Opinion provides a platform for diverse opinions on current affairs and matters of public interest.

The views expressed are those of the authors and do not necessarily reflect the official LBC position.

To contact us email views@lbc.co.uk