Is TikTok luring children to death? • FRANCE 24 English

FRANCE 24: Now it’s time for today’s Perspective, and a warning of a cycle of depression, self-harm and potentially suicidal content awaiting our children and grandchildren. The warning comes from Amnesty International France, which says the social media app TikTok is still steering vulnerable children and young people towards such content. New research by the organisation, entitled “Dragged Into the Rabbit Hole”, highlights what it calls TikTok’s ongoing failure to address the systemic design risks affecting children and young people. With us to discuss the findings is Lauren Armistead, deputy director at Amnesty Tech. Thanks very much for being with us on the programme today. Tell us first, and simply, what you found.

Lauren Armistead: We found, two years after our original research into TikTok, that children and young people who experience mental health problems, or who express an interest in mental health content, rapidly fall into a rabbit hole of depressive content, within less than an hour of scrolling. And within three to four hours on the app, they are being shown content romanticising suicide, users expressing an intention to kill themselves, or even videos demonstrating suicide methods. Our original research had already shown accounts in other countries, such as the US, Kenya and the Philippines, being dragged into these rabbit holes of depressive content. So two years later, not enough has been done. And within the EU there is robust legislation that should oblige very large online platforms to mitigate and address their systemic risks for minors, that is, children under the age of 18, and that simply isn’t happening.

FRANCE 24: And this happens to anybody who goes on TikTok, even if they’re looking for, I don’t know, dress designs or something like that?

Lauren Armistead: Our researcher, who carried out the manual experiments on the app herself, was shown the depressive content within minutes of being on it. The app serves you a highly personalised feed of videos: it picks up on the things you watch, and the things you watch aren’t necessarily the things you think you’re interested in. TikTok describes this as responding to your interests, the interests you express through the videos you watch. However, we argue it can also pick up on your mood, or on protected characteristics, and in that way it can trap you in these rabbit holes of depressive content if you go on the app feeling depressed. Children and young people who already face mental health challenges, often children who seek comfort in social media, are the ones who are particularly vulnerable, and they may find those challenges exacerbated by this highly personalised feed.

FRANCE 24: And you think the underlying business model is the whole reason this is happening in the first place?

Lauren Armistead: Exactly. The business model of TikTok, and of many other big tech companies, is the attention economy. They want to keep your eyes glued to the screen for as long as possible, because that keeps you on the app and lets them collect personal information about you. That information can be sold to advertisers, who put their adverts in front of you, or passed on to others to use elsewhere. People may go on thinking they’re getting a product, but actually they are the product. One of the families we spoke to described how their child had become a product: their information is commodified. You wouldn’t invite somebody into your house and let them rifle through all your drawers and cupboards, but in some ways that’s what happens when you go on these apps. They collect all this personal information about you in order to sell it. And the way they keep you engaged is by showing you things they think will interest you. Sometimes that might be benign, cats for instance. But when the feed is as highly personalised as TikTok’s, responding to what you actually watch rather than to positive signals you deliberately give it (you never tell it you want to see cats), it can respond to your mental state, and in that way it can really shape and manipulate how you feel. Many of the people we spoke to said that’s exactly how it feels.

FRANCE 24: What would you advise people to do? A lot of parents, and perhaps grandparents, watching this may be well aware of the problem and know that their children use TikTok all the time.

Lauren Armistead: TikTok has put some mitigation measures in place, but our position is that it really shouldn’t fall to the user to resist what is an extremely addictive design. TikTok is designed to get people hooked. It isn’t safe by design, and it isn’t designed in the best interests of children.

FRANCE 24: It’s very difficult, though, isn’t it, to keep children off it when all their friends are on it?

Lauren Armistead: Absolutely, and that’s why, in some ways, it shouldn’t fall to them. It should fall to regulators to regulate the apps properly, and to the apps themselves to change their business model. We’re calling on apps to make the personalised feed opt-in rather than the default and, when people do opt in, to have them express specific interests. They would have to go in and say, “I’m interested in dresses, dances and cats.” It’s unlikely that anyone would go in and say, “I’m interested in self-harm and suicide.” People get trapped there because they’re shown whatever grabs their attention, and that is often the most extreme content; we already know that people end up in rabbit holes of extreme content because that’s what they are subconsciously drawn to. So there are things people can do, and simply being aware of how addictive the app is can help, but really it shouldn’t fall to users. It’s about regulation, and about the apps changing.

FRANCE 24: You mentioned the European Union’s Digital Services Act. It came into force in 2023 and requires platforms to identify and mitigate risks to children’s rights. It’s plainly not working so far, is it?

Lauren Armistead: It has only been in force for two years, but there is an ongoing investigation into TikTok, and we really hope this new research will feed into that investigation and that we will see strong enforcement action. We believe we have shown that, two years after we first revealed the dangers TikTok poses to children with mental health challenges, not enough has been done. They clearly aren’t mitigating the risks, and their recent report under the DSA didn’t contain enough information about the mitigation strategies they are taking to address the evident risks their platform creates for children and young people.

FRANCE 24: And you put all your findings to TikTok. No response?

Lauren Armistead: They haven’t responded to us this time. We have been in touch, and I believe they have responded to other media outlets. They wouldn’t necessarily agree with our findings, I’m sure, but we’re quite used to that, and we think we’ve proved that there are evident risks. Obviously there are limitations to this research, which is why it’s so important that apps grant independent researchers access to investigate their algorithms. It’s an imperfect situation: we do the research, and they can argue it may not replicate a real person. But equally, we weren’t inputting all the signals that a real child or young person would, so the rabbit-hole effect could have been worse. It does highlight why apps should open their algorithms to independent research, so that the apps themselves can take action where risks and abuses are identified.

FRANCE 24: Lauren Armistead, deputy director at Amnesty Tech, thanks very much for joining us on the programme today.
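A minimal sketch of the feedback loop Armistead describes, as a toy simulation: a hypothetical feed scores a handful of content categories by estimated watch time and mostly serves whichever scores highest. Every name and number here is invented for illustration; this is not TikTok’s actual algorithm.

```python
import random

# Toy illustration of a feed that optimises for watch time inferred
# from behaviour, not from interests the viewer explicitly declares.
# Categories, numbers and the update rule are all hypothetical.

CATEGORIES = ["cats", "dance", "fashion", "sad"]

# Hypothetical viewer: lingers longest on "sad" videos without ever
# searching for them. Values are average seconds watched per video.
WATCH_TIME = {"cats": 4.0, "dance": 3.0, "fashion": 2.0, "sad": 9.0}

def recommend(scores):
    """Serve the category with the highest estimated watch time,
    with a small chance of exploring something else."""
    if random.random() < 0.1:
        return random.choice(CATEGORIES)
    return max(scores, key=scores.get)

def simulate(steps=200):
    scores = {c: 5.0 for c in CATEGORIES}   # start with uniform estimates
    served = {c: 0 for c in CATEGORIES}
    for _ in range(steps):
        c = recommend(scores)
        served[c] += 1
        # Move the estimate toward the observed watch time
        # (an exponential moving average of the implicit signal).
        scores[c] += 0.3 * (WATCH_TIME[c] - scores[c])
    return served

if __name__ == "__main__":
    random.seed(0)
    print(simulate())  # "sad" ends up dominating the feed
```

The simulated viewer never declares an interest in sad content, yet the feed converges on it because it is what holds their attention longest. That is the rabbit-hole dynamic the research describes, and it is why Amnesty’s proposed remedy is explicit, opt-in interest signals rather than interests inferred from watching behaviour.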

New research shows that children who go onto the social media app TikTok and make enquiries about mental health quickly find depressive content, and that within a few hours they are shown content from users expressing the intention to kill themselves. The research, from Amnesty Tech, is entitled “Dragged Into the Rabbit Hole”. The organisation says the findings highlight TikTok’s ongoing failure to address its systemic design risks affecting children and young people, and also illustrate the failings of the European Union’s Digital Services Act. In force since 2023, the act requires platforms to identify and mitigate systemic risks to children’s rights. In Perspective, we spoke to Lauren Armistead, deputy director at Amnesty Tech.

Read more about this story in our article: https://f24.my/BVxc.y

🔔 Subscribe to France 24 now: https://f24.my/YTen
🔴 LIVE – Watch FRANCE 24 English 24/7 here: https://f24.my/YTliveEN

🌍 Read the latest International News and Top Stories: https://www.france24.com/en/

Like us on Facebook: https://f24.my/FBen
Follow us on X: https://f24.my/Xen
Bluesky: https://f24.my/BSen and Threads: https://f24.my/THen
Browse the news in pictures on Instagram: https://f24.my/IGen
Discover our TikTok videos: https://f24.my/TKen
Get the latest top stories on Telegram: https://f24.my/TGen
