Not a really surprising outcome, but I do think this is now going to be used to further crack down on social media (which, because of how the laws are written, will include reddit).
> Molly viewed more than 16,000 pieces of content on Instagram in the final six months of her life, of which 2,100 were related to suicide, self-harm and depression. The inquest also heard how she had compiled a digital pinboard on Pinterest with 469 images related to similar subjects.
The article doesn’t mention what *kind* of content this was. Something encouraging self harm / self destruction? Images showing it?
Some of my contacts on Instagram regularly post depression-related content, but it’s more memes about how the day is a struggle, how it’s OK not to be OK, and other ‘affirmative’ content. It doesn’t encourage self harm, but it is unashamedly saying “I have depression and it sucks”.
The first kind (something encouraging it) really shouldn’t be shown; the second is much murkier – some folks find it, if not helpful, then at least soothing to be able to voice how they’re feeling.
But then if Instagram got shit-hot on policing that kind of content, someone dedicated to finding it could easily do so on any number of alternatives / forums / Google etc.
What everybody seems to be missing: her parents should have been controlling her internet usage at 14. I did with my children – checked the phones every week, saw what they were doing. Quite simple really. She probably would have committed suicide anyway, even without looking at the internet.
Tbh I don’t like it. If it’s a suicide then the cause of death should be “suicide”. How can a coroner decide that such a word is “inappropriate” or “not safe”? It’s literally the cause of death.
Anyways, looking forward to how the government use this to crack down on our privacy. “Protecting the children” is always the go-to excuse.
Oh, trust me on this one: the internet is easily a scary place if you’re new and young to it. I got to grow up with it and saw the early days and the content that was getting posted online – the “memes” about how to make your very own crystals at home with a straw and bleach (FOR FUCK’S SAKE DON’T DO THIS), or to delete the system32 folder.
The internet will never be tamed by anyone, so you best learn how to be safe while you use it, because no matter what happens you’ll always have to deal with someone being a cunt.
Feel bad, but were the parents just allowing unlimited access and not monitoring anything?
Also, “suffering from negative effects of online content”? Seriously? Someone point me to where that is in the DSM.
That coroner is an idiot. It’s suicide whether you like it or not, and she was 14 – she wasn’t a kid. Instagram algorithms wouldn’t have made her depressed, made her commit suicide, or even made her worse; it’s just blame games from the dad and a sucked-in coroner. She would have been suffering these feelings for a long time, but instead of finding out why, we’ve just politicised her death and shown that suicide still carries a stigma that people don’t want to address.
She’s 14 – she ain’t gonna be influenced into feeling or thinking something she wasn’t already thinking when it comes to suicide and depression. That I know for sure, though it won’t be easy to understand for people who’ve never suffered from those thoughts.
If she hadn’t found those images on Instagram she’d have found them somewhere else, and likely more graphic ones at that. I used to self harm; the websites that encourage posting pictures and almost make it a competition are VERY easy to find – depression can be seen as a glamorous thing in some online circles. I feel for her and her family, and I don’t really know what the solution is here, because people should have the right to express themselves online.