WARNING: This article discusses suicide and may be upsetting to some readers.
It took 22 minutes for harrowing suicide-related content to appear in the “for you” feed of a new TikTok account set up for a 13-year-old girl.
The account’s creator - a Christchurch documentary maker researching social media for a project - described what she saw as “disturbing, discombobulating and gross”.
Nadia Maxwell is now calling on social media companies to be much stricter on what young users are exposed to on their apps.
“It was quite shocking… I felt so gross,” she said.
Christchurch documentary maker Nadia Maxwell. Photo / George Heard
Maxwell set up pages on TikTok, Instagram and Snapchat posing as a 13-year-old girl as an “experiment” to better understand how the apps worked.
“I feel like we hear a lot about the disturbing content out there on social media but that most parents probably have a limited understanding of what is actually on there - myself included,” she explained.
“Where prompted, I gave my interests as things that my own teenage girls like: animals, health and fitness and Taylor Swift.
“I actively searched using words like kittens and netball to reinforce to the algorithm the harmless content I was seeking.
“It took 22 minutes and 15 seconds for TikTok to show me the first suicide-related video.”
In the days that followed, more graphic posts appeared in the feed - including content about murder, child abuse and violent crime.
TikTok says its For You Page (FYP) is a “personalised feed of content based on your interests and engagement” and “will reflect your interests and show you creators and content you’re likely to enjoy”.
“It was a very, very harrowing experiment,” Maxwell said.
“I was trying to redirect the algorithm. When the content got too heavy, I would type in something like ‘tiny farm animals’ or ‘netball’.
“But by day three, none of that was in the FYP.
“It was also quite emotionally confusing. You watch a suicide-related video, but then it’s followed by a lip sync or a funny video, so it’s quite discombobulating. You don’t actually have any time to mentally process what you’ve just seen.
“And that’s me coming at it as an adult with a whole lifetime of experience to draw on. So what must it be doing to the brains of 11-year-old kids?”
TikTok has at least 2.05 billion users and has grown faster than most other social media apps. Photo / Alex Cairns, Bay of Plenty Times
Maxwell’s project is ongoing but she told the Herald she feels “compelled” to share her TikTok experience.
“These kids have got a portal in their pocket to the digital world and it’s an unrealistic expectation on parents that they should be able to monitor every minute,” she said.
“An 11-year-old kid can just stumble upon this stuff on TikTok any minute of the day.”
Christchurch documentary-maker Nadia Maxwell carried out experiments on social media apps to see what content is being recommended to young users. Photo / George Heard
‘Distorted version of reality’
According to TikTok’s user guidelines, the more a person uses the app, the more their FYP will reflect their interests.
Interest is gauged, in part, by how - and for how long - a person engages with a particular piece of content.
Users can apply filters to remove content containing specific words and their variations from their feeds.
And there are other ways to restrict content that is not “comfortable”.
A TikTok spokesperson told the Herald in a statement that the app had clear content guidelines, and that automated technology removed 80% of videos violating them - 98.2% of those “proactively before a user report”.
He said some videos reported may not violate the guidelines.
Measures were also in place to recognise younger users, he said.
TikTok policy requires users to be 13 or older to have an account.
“Our in-built safety features recognise that people develop at different stages. Teens aged 13-15 will experience a more strictly controlled version of TikTok than those aged 16-17, and young people aged 16-17 will have a different experience to an adult,” the spokesperson said.
“Our Family Pairing tools put parents and guardians in control of their teens' accounts, including how much time they can spend online, and the type of content they can see.”
All users could put their account into restricted mode, which prevents any content that “may not be comfortable... such as content that contains mature or complex themes” from appearing on their feed.
TikTok says users of different ages have a different experience on the app. Photo / TikTok
If people search for content related to suicide or self-harm on the app, they are shown intervention and support information from organisations like Lifeline Aotearoa or Youthline Helpline.
Cybersafety specialist John Parsons said exposure to social media at a young age led to “lost childhoods”.
“A young person’s brain is designed to absorb vast amounts of information and experiences during childhood. These early experiences significantly shape the decisions young people make and the interests they develop as they grow.
“Children’s developing brains should not be exposed to social media platforms like TikTok,” he told the Herald.
Parsons said social media “often represents a distorted version of reality” which was harmful to young people.
“Endless scrolling fosters dopamine-chasing behaviours and increases stress. This can lead to self-doubt and in my experience, particularly around ages 12 and 13, leave children struggling with anxiety.”
Parsons said New Zealand needed to “get serious about the issue”.
“The Government needs to step in and sanction platforms that store harmful content within their databases - content that can be retrieved by algorithms designed to exploit user preferences,” he said.
“These algorithms not only cater to a user’s existing interests but, in my opinion, expand them in potentially harmful ways.
“One potential solution is to remove algorithm-driven content recommendations for users under 18, allowing young people the freedom to explore content organically without being influenced by stored data and targeted suggestions.”

Australia’s government introduced a law to parliament this week proposing to ban children under 16 from social media and threatening multimillion-dollar fines for companies which don’t comply. Prime Minister Anthony Albanese has said the pervasive influence of platforms like Facebook and TikTok is “doing real harm to our kids”.
Anna Leask is a Christchurch-based reporter who covers national crime and justice. She joined the Herald in 2008 and has worked as a journalist for 18 years with a particular focus on family and gender-based violence, child abuse, sexual violence, homicides, mental health and youth crime. She writes, hosts and produces the award-winning podcast A Moment In Crime, released monthly on nzherald.co.nz
SUICIDE AND DEPRESSION
Where to get help:
• Lifeline: Call 0800 543 354 or text 4357 (HELP) (available 24/7)
• Suicide Crisis Helpline: Call 0508 828 865 (0508 TAUTOKO) (available 24/7)
• Youth services: (06) 3555 906
• Youthline: Call 0800 376 633 or text 234
• What's Up: Call 0800 942 8787 (11am to 11pm) or webchat (11am to 10.30pm)
• Depression helpline: Call 0800 111 757 or text 4202 (available 24/7)
• Helpline: Need to talk? Call or text 1737
• Aoake te Rā (Bereaved by Suicide Service): Call or text 1737
If it is an emergency and you feel like you or someone else is at risk, call 111
SEXUAL HARM
Where to get help:
If it's an emergency and you feel that you or someone else is at risk, call 111.
If you've ever experienced sexual assault or abuse and need to talk to someone, contact Safe to Talk confidentially, any time 24/7:
• Call 0800 044 334
• Text 4334
• Email [email protected]
• For more info or to web chat visit safetotalk.nz
Alternatively, contact your local police station.
If you have been sexually assaulted, remember it's not your fault.