Article

16 Apr 2024

Author:
Kate McDonald, RTÉ News

TikTok allegedly suggests self-harm & suicide content to children

"13 on TikTok: Self-harm and suicide content shown shocks experts", 16 April 2024

On foot of concerns published by researchers and advocacy groups like Amnesty International about young teens' mental health being negatively influenced by content on TikTok, Prime Time conducted an experiment.

Three new TikTok accounts were created on phones with newly installed operating systems. Each time, when asked to provide the user’s age, Prime Time gave a date of birth in 2011. As a result, TikTok understood the user to be 13 years old.

Prime Time did not search for topics, ‘like’ or comment on videos, or engage with content in any similar way. We watched the videos shown by TikTok on the ‘For You’ feed, and when shown videos related to topics like parental relationships, loneliness, or feelings of isolation, we watched them twice.

Within minutes, the accounts, which TikTok understood to be controlled by 13-year-old users, were drawn into a mental health content rabbit hole.

Fourteen minutes into scrolling through videos on my new TikTok account, a video appears of a teenager crying in what looks like their bedroom.

The text overlaid on the video reads ‘holding it together so my mom doesn’t have to sit at my funeral wondering what she did wrong.’

The sound with it is a song: ‘How to Save a Life’ by The Fray. There are more than 550,000 likes on the video.

It was just the latest piece of content shown in what was already becoming a stream related to depression, self-harm and suicide over the previous quarter-hour.

In 2023, Amnesty International, with the assistance of a Portugal-based research organisation called AI Forensics, carried out a similar exercise as part of a report examining the impact of TikTok on teens in the US, Kenya and the Philippines.

Prime Time set the user locations to Ireland and recorded the videos shown for one hour on each account.

Content was then shown to the Chair of the Faculty of Child & Adolescent Psychiatry in the College of Psychiatrists of Ireland, Dr Patricia Byrne, and Dr Richard Hogan, a psychotherapist who specialises in working with families.

"I work in this area every day... but even for me, seeing that was very emotional and provokes a very strong, very strong emotional reaction," Dr Byrne said, visibly upset.

"I didn't expect to feel as emotional as I did watching it, but they're very powerful imagery and very intense."

Dr Richard Hogan told Prime Time: "I'm deflated, I'm emotional. I'm angry."

"It is absolutely suggesting suicide as a course to deal with your psychological upset there. I mean, that is, that is heinous," he added.

Dr Byrne says adolescents seeing such content - which in some instances glamorises self-harm and references suicide - poses many risks.

TikTok’s success is based on the power of its recommendation system - an algorithm which analyses users’ engagement with content for indicators of interest, and uses those indicators to decide which video to display next.
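The broad logic of such an engagement-driven loop can be sketched in a few lines of Python. Everything below - the topic names, the scoring weights, the sampling rule - is an assumption for illustration only; TikTok’s actual system is proprietary and vastly more complex.

```python
import random
from collections import defaultdict

# Hypothetical catalogue of videos grouped by topic (names assumed).
CATALOGUE = {
    "comedy": ["c1", "c2", "c3"],
    "sport": ["s1", "s2", "s3"],
    "sadness": ["m1", "m2", "m3"],
}

class Recommender:
    def __init__(self):
        # One interest score per topic, updated from engagement signals.
        self.interest = defaultdict(lambda: 1.0)

    def record_watch(self, topic, rewatched=False):
        # Watching a video is a signal; re-watching is a stronger one.
        self.interest[topic] += 2.0 if rewatched else 0.5

    def next_video(self):
        # Sample a topic in proportion to accumulated interest, so a
        # heavily engaged topic quickly comes to dominate the feed.
        topics = list(CATALOGUE)
        weights = [self.interest[t] for t in topics]
        topic = random.choices(topics, weights=weights)[0]
        return topic, random.choice(CATALOGUE[topic])

rec = Recommender()
for _ in range(5):
    rec.record_watch("sadness", rewatched=True)  # mimic watching twice
print([rec.next_video()[0] for _ in range(10)])
```

In a loop like this, no search or ‘like’ is needed: merely watching certain videos twice, as Prime Time did, is enough to skew every subsequent recommendation.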

Within 20 minutes, Prime Time was being shown videos directly referencing self-harm and suicidal thoughts.

The automated moderation process means TikTok will not return results when users directly search for terms like ‘self-harm’ or ‘suicide’. Instead, the results page will advise the user of organisations which provide mental health support services.
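A minimal sketch of that kind of search intervention, with an assumed term list and a placeholder support message, might look like this:

```python
# Illustrative only: the blocked-term list and message are assumptions.
BLOCKED_TERMS = {"self-harm", "self harm", "suicide"}

SUPPORT_MESSAGE = (
    "You are not alone. If you are struggling, help is available - "
    "contact a mental health support service or helpline."
)

def search(query, index):
    normalised = query.strip().lower()
    # Intercept flagged queries before they ever reach the index.
    if any(term in normalised for term in BLOCKED_TERMS):
        return [SUPPORT_MESSAGE]
    return index.get(normalised, [])

index = {"cats": ["video_1", "video_2"]}
print(search("cats", index))       # -> ['video_1', 'video_2']
print(search("self-harm", index))  # -> [support message]
```

Note that this guards only the search path; it does nothing to stop the same themes arriving unrequested through the recommendation feed.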

In a response to queries from Prime Time, TikTok said: "We approach topics of suicide and self-harm with immense care and are guided by mental health experts, including those we have on staff. As RTÉ found, we make it easy for people who may be struggling to access professional support, and our Refresh feature lets people change what is recommended to them."

The ‘Refresh’ feature allows users to clear what the app understands about their profile, and begin again. It is not a moderation system.
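In terms of the hypothetical recommender sketched earlier, a refresh of this kind amounts to nothing more than resetting the stored interest profile - a minimal illustration, with all names assumed:

```python
from collections import defaultdict

profile = defaultdict(lambda: 1.0)   # learned per-topic interest
profile["sadness"] += 10.0           # accumulated engagement signal

def refresh(interest_profile):
    # Wipe the learned interests only; the content itself is never
    # inspected or filtered, which is why this is not moderation.
    interest_profile.clear()

refresh(profile)
print(dict(profile))                 # -> {}: recommendations restart
```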

Videos shown to Prime Time also referenced suicide and suicidal thoughts.

"The problem... is how the algorithm basically drives users from less problematic content such as sadness or romantic breakups, to more concerning content like depression," Head of Research with AI Forensics, Salvatore Romano, told Prime Time.

After conducting the experiment, Prime Time gave TikTok the usernames of the accounts that had been set up, allowing TikTok to access all of the videos that had been shown to those accounts.

We also provided TikTok with screenshots from ten example videos.

TikTok said "RTÉ's test in no way accurately represents the behaviour or experiences of real teens who use our app... Out of hundreds of videos that would've been seen during RTÉ's testing, we reviewed ten that were sent to us and made changes so that seven can no longer be viewed by teenagers."

Dr Byrne says the brain is in a period of massive change and cognitive development during adolescence, and that as a result certain types of content can capture adolescents’ attention.

"Adolescents are designed to seek out novel experiences. Their brain is hardwired to seek out more and more experiences," she told Prime Time.

She says a major risk is what is known as ‘the contagion effect’ - where hearing about self-harm or suicide by others can increase the risk of an adolescent engaging in similar behaviour.

The same risk exists for seeing harmful content, she said.

...

Statistics about rates of self-harm are compiled by the National Suicide Research Foundation; however, they only cover instances which result in people attending emergency departments.

Since 2002, when the foundation began collating the data, the number of such presentations to emergency departments has increased by around 18%.

There is no singular cause for that increase. While our research focused on TikTok, those working in child and adolescent psychiatry say the pressure on young people - in part, driven by social media in general - is a factor.

Mr Romano says the design and functionality of TikTok - and especially the way in which young people are presented with content from accounts they do not follow - is particularly concerning.

He says it is devised to "achieve the maximum engagement."