Molly Russell was 14 when she took her own life in 2017, and her parents believed the content she had viewed on social networks played a role. British justice ruled in their favor.
The suicide of this British teenager has revived the debate in the United Kingdom over the influence of social networks on young people. While trying to understand her act, her parents discovered that the girl had been massively exposed, mainly on Instagram and Pinterest, to content evoking, and even normalizing, depression, suicide and self-harm.
For almost five years, they have been fighting for recognition of the role of social networks in the death of their daughter and to ensure that children are better protected on the internet.
The conclusions of a legal procedure known as an “inquest”, intended to determine the cause of the teenager's death, are damning. The content Molly saw “was not safe” and “should never have been accessible to a child”, declared Andrew Walker, the coroner in charge of the proceedings.
Rather than classify her death as a suicide, he concluded that the teenager “died from an act of self-harm while suffering from depression and the negative effects of online content”.
Some of this content was “particularly explicit, tending to portray self-harm and suicide as an inevitable consequence of an illness from which there was no recovery”, and offered “without any counterpoint”, he insisted.
In particular, he questioned the algorithms developed by social networks, which tend to offer users content similar to what they have seen previously.
Calls for help
Reacting to the ruling, Molly's father, Ian Russell, said he hoped the findings would “mark an important step in making the necessary changes” to how social media works.
How social media algorithms and content moderation operate is a major challenge for the platforms, as more and more people call for stronger regulation to prevent abuses such as cyberbullying and misinformation.
An “online safety” law, intended to strike a balance between freedom of expression and the protection of users, in particular minors, is also under consideration in the British Parliament.
During the proceedings, the court showed some of the more than 2,000 pieces of content — texts, photos, videos — evoking suicide or self-harm that young Molly had seen on the networks in the six months preceding her death.
The young girl had even received email recommendations from Pinterest suggesting that she go and read “10 'posts' on depression that you might like”.
The procedure also highlighted the difficulty for families and school staff to control what children see and do on the internet.
Confiding in these networks, the girl had appealed for help to celebrities, such as the novelist J.K. Rowling, to whom she wrote on Twitter: “My mind is filled with suicidal thoughts.” So many messages in bottles lost in the digital ocean.
Executives of Meta (owner of Facebook and Instagram) and Pinterest, heard during the proceedings, presented their “apologies” for Molly's death.
Pinterest's Judson Hoffman acknowledged that the girl had been exposed to “unsafe” content, and that “there is content that should have been removed and was not”.
Elizabeth Lagone, representing Meta, explained that the assessment of the danger of content was “constantly changing” and that some of the content seen by the girl was “safe”, reflecting “nuanced and complicated” messages. She also considered it “important” to allow people with suicidal thoughts to express them on the internet.
On Friday, the group founded by Mark Zuckerberg said it was “committed to ensuring a positive experience for everyone on Instagram, especially teenagers”, and that it would carefully study the coroner's full report.
If you need help
QUEBEC SUICIDE PREVENTION LINE
– 1 866 APPELLE (277-3553)
KIDS HELP PHONE
– 1 800 668-6868
– 1 800 263-2266
Quebec Suicide Prevention Association