Self-harm is revered on social media, says an MP’s daughter whose experiences have inspired his fight to crack down on tech giants.
In an article for The Telegraph, Claudia Collins, 14, said self-harm content was still “as dominant as ever” on social media four years after Molly Russell, also 14, took her own life having been targeted with self-harm and suicide content on Instagram.
Ms Collins, daughter of Damian Collins, the chairman of the committee helping draft new duty of care laws, said: “Self-harm is not just prominent on social media, it is revered. Molly’s death reflects an online world fuelling an epidemic.”
She was encouraged to write about it by her father, who told her that her words would have far more impact because “no researcher or data could recreate her real experience”.
Mr Collins said that as a father he was “hugely concerned” at the amount of graphic self-harm material being directed at his 14-year-old daughter and her friends on TikTok and Instagram.
He revealed the committee will consider additional criminal sanctions against directors, civil court orders allowing families to get redress, and powers for the watchdog Ofcom to raid companies’ premises if there are serious breaches or problems with their algorithms.
Ms Collins described videos recommended to her and her friends of girls with “drooling mascara beneath their eyes,” scarring their arms with pins or sharing images of cuts with a hashtag “barcode wrist,” joking they were scanned by supermarket checkouts.
“The algorithm recognises the growth in promotional self-harm posts and instead of intercepting and shutting them down, they continue to recommend these posts to young girls, many under the app’s age limit.
“I’ve seen videos of my friends crying posted online, adding to a stream of trending videos. Although these accounts may seem alien, these extremes are not unusual.
“With poor mental health at their highest ever rates, social media will fuel the genocide of my generation.”
Mr Collins said his daughter had raised her “increasing concern” about self-harm being “glorified and glamorised” online when they were discussing social media.
He said it mirrored evidence to the committee by Molly’s father Ian who spoke of the “shocking” experience of parents whose children had been exposed to self-harm and suicide content.
“As a father, it is hugely concerning,” said Mr Collins. “Ultimately whatever you try to do to prepare children for growing up and living in the world, you are not with them when they are on social media.
“You can’t stand between them and the black box and what they are exposed to on it. What is concerning is that they are being exposed to harmful content based on how they are being profiled.”
He said his daughter was on his mind when he challenged TikTok on what they were doing to prevent self-harm content being directed at teenagers. He said he was “not at all convinced” they were researching it or screening to prevent it.
He cited evidence from Facebook whistleblower Frances Haugen, who also gave evidence to the committee last week, that vulnerable people were highly likely to be targeted by the platforms’ algorithms.
He said that, given Molly died four years ago, it was clear that the algorithms which sites like Instagram were using to try to screen out self-harm content were not working. As little as three per cent of such content was being picked up.
He said he had nothing to contradict Ms Haugen’s criticisms of Facebook. “It is focused on profit, it is focused on engagement, it treats users’ data as its own and AI systems it has created to solve the problem don’t work. And the company does not have a plan B,” he said.
He indicated there were three potential answers. One was greater investigative powers for the regulator Ofcom to go into companies to carry out proper inquiries into why, for example, their AI systems for screening out self-harm material are not working.
He is also considering criminal liability for social media directors to be introduced immediately rather than held in reserve, on top of the multi-billion pound fines the tech giants will face, and legal redress for users through the civil courts if their data is misused.
A TikTok spokesman said: “Protecting the well-being of our community is extremely important to us. We do not allow content depicting, promoting, normalising, or glorifying activities that could lead to suicide or self-harm.
“Our policies aim to support people who may be struggling and provide access to expert emotional help, and we redirect searches for words and phrases related to self-harm, including ‘barcode wrists’, to in-app support resources and external partners like Samaritans.”