Psychiatrist ‘did not sleep well’ after viewing content seen by Molly Russell

27 September 2022, 16:54

Molly Russell inquest. Picture: PA

Dr Navin Venugopal spoke about the ‘very disturbing’ content in the witness box on Tuesday.

A child psychiatrist has told an inquest the self-harm material viewed on social media by Molly Russell before she died left him “not able to sleep well for a few weeks”.

Dr Navin Venugopal said the “very disturbing, distressing” content Molly had engaged with would “certainly affect her and made her feel more hopeless” as he gave evidence at North London Coroner’s Court.

On Tuesday, proceedings were paused for a few moments as the family’s lawyer Oliver Sanders KC told the court a “rather unpleasant” Instagram account had been set up using an image of Molly as its profile picture.

In a short statement, a spokesman for Meta said: “This account has been removed from Instagram for violating our policies.”

Molly, from Harrow in north-west London, ended her life in November 2017, prompting her family to campaign for better internet safety.

The inquest heard the 14-year-old had written a note before she died, which Dr Venugopal described as “very sad to look at”.

Under questioning from Coroner Andrew Walker, the witness agreed it was important to recognise “children are not adults”, and that adult matters should not be accessible to children.

Dr Venugopal told the inquest he saw no “positive benefit” to the material viewed by the teenager before she died.

Asking the witness about what effect the material would have had on Molly, the coroner said: “This material seems to romanticise, glamorise, and take the subject of self-harm – take it away from reality and make it seem almost unreal, take away from these terrible acts any kind of consequence.

“You have looked at the material, do you think that the material that Molly viewed had any impact on her state of mind?”

Dr Venugopal replied: “I suppose I will start off, I will talk about the effect the material had on my state of mind.

“I had to see it over a short period of time and it was very disturbing, distressing.

“There were periods where I was not able to sleep well for a few weeks so bearing in mind that the child saw this over a period of months I can only say that she was (affected) – especially bearing in mind that she was a depressed 14-year-old.

“It would certainly affect her and made her feel more hopeless.”

The coroner continued: “Can you see any positive benefit for that material being looked at?”

“No, I do not,” Dr Venugopal replied.

Mr Sanders then took the witness through a number of videos viewed by Molly on Instagram, followed by a note written by the teenager on her phone two days after watching one clip which used “identical language”.

Dr Venugopal told the court: “If they are of that mindset and are seeing these sorts of things, it could have an impact.”

The witness was taken through his reports in which he concluded the content Molly viewed had “exacerbated her sense of helplessness”.

In his statement, Dr Venugopal said: “I think that the harm suffered was certainly more significant than minimal or trivial, although it is difficult to be more specific.

“This content led Molly to conclude from the material she viewed that she was unlikely to recover from her depression and that her future was bleak and hopeless.

“This is likely to have exacerbated her sense of helplessness and made her less likely to seek help and support from family or friends.”

Dr Venugopal added: “I am of the opinion that it is likely that Miss Russell was placed at risk through accessing self-harm material on social media websites and using the internet.

“There was a risk to Miss Russell’s health and mental state by looking at self-harm related content.”

Concluding his examination of the expert, Mr Sanders asked: “The material she was looking at wasn’t safe, was it?”

“No it was not,” Dr Venugopal replied.

The head of health and wellbeing at Instagram’s parent company Meta and the head of community operations at Pinterest have both apologised at the inquest for content Molly viewed.

Meta executive Elizabeth Lagone said she believed posts which the Russell family argued “encouraged” suicide were safe when the teenager viewed them.

Pinterest’s Judson Hoffman told the inquest the site was “not safe” when Molly used it.

The final witnesses to give evidence at the inquest will be the headteacher and deputy headteacher of the teenager’s school on Wednesday.

The coroner said he would accept submissions from all parties on considerations for his conclusions and a prevention of future deaths report on Thursday.

By Press Association
