Some might find information in this Press Release triggering.
- Technical research in partnership with the Algorithmic Transparency Institute and AI Forensics using automated accounts showed that after 5-6 hours on the platform, almost 1 in 2 videos were mental health-related and potentially harmful, roughly 10 times the volume served to accounts with no interest in mental health.
- There was an even faster “rabbit hole” effect when researchers manually rewatched mental health-related videos suggested to “sock puppet” accounts mimicking 13-year-old users in Kenya, the Philippines and the USA.
- Between 3 and 20 minutes into our manual research, more than half of the videos in the ‘For You’ feed were related to mental health struggles with multiple recommended videos in a single hour romanticizing, normalizing or encouraging suicide.
- TikTok’s very business model is inherently abusive and privileges engagement to keep users hooked on the platform, in order to collect ever more data about them. The company applies protections for users unequally around the world.
TikTok’s content recommender system and its invasive data collection practices pose a danger to young users of the platform by amplifying depressive and suicidal content that risks worsening existing mental health challenges, two companion reports released today by Amnesty International show.
The two reports – Driven into the Darkness: How TikTok Encourages Self-harm and Suicidal Ideation and “I Feel Exposed”: Caught in TikTok’s Surveillance Web – highlight the abuses experienced by children and young people using TikTok, and the ways in which these abuses are caused by TikTok’s recommender system and the underlying business model.
The findings of a joint technical investigation, with our partners – the Algorithmic Transparency Institute (ATI) at the National Conference on Citizenship and AI Forensics – show how children and young people who watch mental health-related content on TikTok’s ‘For You’ page are quickly being drawn into “rabbit holes” of potentially harmful content, including videos that romanticize and encourage depressive thinking, self-harm and suicide.
“The findings expose TikTok’s manipulative and addictive design practices, which are designed to keep users engaged for as long as possible. They also show that the platform’s algorithmic content recommender system, credited with enabling the rapid global rise of the platform, exposes children and young adults with pre-existing mental health challenges to serious risks of harm,” said Lisa Dittmer, Amnesty International Researcher.
Driven into the Darkness: How TikTok Encourages Self-harm and Suicidal Ideation details how TikTok’s relentless pursuit of young users’ attention risks exacerbating mental health concerns such as depression, anxiety and self-harm.
TikTok’s ‘For You’ feed is a highly personalized and infinitely scrollable page of algorithmically recommended content, picked out to reflect what the system has inferred to be a user’s interests.
Technical research was conducted using more than 30 automated accounts set up to represent 13-year-olds in Kenya and the USA to measure the effects of TikTok’s recommender system on young users. An additional manually run simulation involved an account each in Kenya, the Philippines and the USA.
The technical research revealed that after 5-6 hours on the TikTok platform, almost 1 in 2 videos shown were mental health-related and potentially harmful, roughly 10 times the volume served to accounts with no interest in mental health.
There was an even faster “rabbit hole” effect when researchers manually rewatched mental health-related videos suggested to research accounts mimicking 13-year-old users in Kenya, the Philippines and the USA.
Between 3 and 20 minutes into our manual research, more than half of the videos in the ‘For You’ feed were related to mental health struggles with multiple recommended videos in a single hour romanticizing, normalizing or encouraging suicide.
TikTok can lead you to very dark places
TikTok’s ‘For You’ feed risks easily leading younger users down rabbit holes of depressive and harmful content. We found that within 20 minutes or less, teen accounts that signalled their interest in mental health content were mostly shown videos related to depression and self-harm.
Addictive by Design
Focus group discussions, interviews and simulations of children’s TikTok accounts in Kenya, the Philippines and the USA, as well as existing evidence from the fields of social media harms research and public health, reveal how TikTok’s platform design encourages the unhealthy use of the app.
*Luis, a 21-year-old undergraduate student in Manila who has been diagnosed with bipolar disorder, told Amnesty International of his experience with TikTok’s ‘For You’ feed.
“It’s a rabbit hole because it starts with just one video. If one video is able to catch your attention, even if you don’t like it, it gets bumped to you the next time you open TikTok and because it seems familiar to you, you watch it again and then the frequency of it in your feed rises exponentially,” said Luis.
*Francis, an 18-year-old student in Batangas Province, Philippines, observed: “When I watch a sad video that I could relate to, suddenly my whole ‘For You’ Page is sad and I’m in ‘Sadtok’. It affects how I’m feeling.”
Another focus group participant explained, “The content I see makes me overthink [even] more, like videos in which someone is sick or self-diagnosing. It affects my mentality and makes me feel like I have the same symptoms and worsens my anxiety. And I don’t even look them (videos) up, they just appear in my feed.”
*Joyce, a 21-year-old woman in the Philippines said, “I deleted it [TikTok] for a while because I was very addicted to it… I would spend so many hours on TikTok just scrolling through videos because you can’t help but wonder what goes up next when you scroll down.”
Children and young people interviewed in Kenya said that they felt their TikTok use affected their schoolwork, social time with friends and led them to scroll through their feeds late at night instead of catching enough sleep.
These testimonies were corroborated by various adolescent psychologists consulted by Amnesty International as part of the research.
While young people’s individual responses and contextual factors affecting their social media use may vary, like other social media platforms, TikTok has made design choices intended to maximize users’ time spent on the platform.
“Our research shows that TikTok may expose children and young people to serious health risks by persisting with its current business model geared more at keeping eyes glued on the platform over respecting the right to health of children and young people”, said Lisa Dittmer, Amnesty International Researcher.
TikTok should be safe by design, not addictive by design.
TikTok’s addictive feature, the ‘For You’ feed – a highly personalized and endlessly scrollable page of algorithmically recommended content – taps into what psychologists describe as the “reward pattern of winning or losing on a slot machine”. TikTok is designed to tap into users’ desires to be rewarded, which can lead them to develop habits that encourage addictive use.
The Surveillance Web
“I Feel Exposed”: Caught in TikTok’s Surveillance Web shows how TikTok’s rights-abusing data collection practices both underpin and are sustained by these harmful user engagement practices.
Amnesty International’s research shows that TikTok’s very business model is inherently abusive and privileges engagement to keep users hooked on the platform, in order to collect ever more data about them. TikTok then uses this data to create profiles of users and draw inferences about them, which allows it to cluster users in groups to target them with highly personalized content to keep them engaged. These groups and categories are also made available to advertisers so that they can target users with personalised ads.
To the extent that TikTok has put in place policies and practices to ensure greater respect for children’s rights, these differ from region to region, leaving children and young people in some parts of the world exposed to exploitative data collection from which others are protected.
“TikTok targets users, including children, with more invasive data harvesting practices in parts of the world where people have fewer protections for their data under local laws and regulations – meaning children living in countries with weak regulation, including many countries of the Global Majority, are subject to the worst abuses of their right to privacy,” said Lauren Armistead, Amnesty Tech Deputy Programme Director.
“TikTok must respect the rights of all its younger users, not just those in Europe, by banning all targeted advertising aimed at those younger than 18 globally.”
TikTok must also stop hyper-personalizing the ‘For You’ feed by default, and instead let users actively choose, on the basis of their informed consent, whether they want a personalized feed and which interests shape their content recommendations.
While Amnesty International calls on TikTok to take these urgent steps towards a rights-respecting business model, binding regulation is also needed to protect and fulfil children and young people’s rights.
The best way to protect children from abuse of their personal data online is for governments to ban by law all targeted advertising based on the invasive collection of personal data.
Responding to our findings, TikTok pointed us to its Community Guidelines, which set out which types of content are banned and thus, if reported or otherwise identified, removed from the platform. These include a ban on content “showing, promoting, or providing instructions on suicide and self-harm, and related challenges, dares, games, and pacts”, “showing or promoting suicide and self-harm hoaxes” and “sharing plans for suicide and self-harm.”
TikTok stated that it is in the process of developing a “company-wide human rights due diligence process which will include conducting periodic human rights impact assessments.” The company did not provide details on which specific risks to children and young users’ human rights it has identified. That TikTok currently does not have a company-wide human rights due diligence process in place is a clear failure of the company’s responsibility to respect human rights.
TikTok makes money by collecting data about you – such as who you are and what you like.
Children are no exception. This corporate surveillance for profit undermines children’s right to have control over their personal information.
It’s an abuse of the right to privacy and freedom of thought.
Both reports add to evidence explored in Amnesty International’s previous research reports. Surveillance Giants exposed how the business model of Facebook and Google is inherently incompatible with the right to privacy and poses a threat to a range of other rights including freedom of opinion and expression, freedom of thought, and the right to equality and non-discrimination.
Amnesty International’s reports on Myanmar: The social atrocity: Meta and the right to remedy for the Rohingya and Ethiopia: Meta’s failures contributed to abuses against Tigrayan community during conflict in northern Ethiopia both reveal how the Facebook platform’s engagement-based business model can result in devastating impacts through the amplification of extreme content that incites violence, hatred and discrimination, ultimately contributing to serious human rights violations and abuses.
Together, these reports contribute to the growing evidence base of Amnesty International’s global campaign for corporate accountability and redress for human rights abuses associated with the surveillance-based business model of Meta, Google, TikTok and other “Big Tech” platforms.
Written by Luis (pseudonym), 21-year-old student from Manila.
What is TikTok?
I use TikTok primarily for entertainment. TikTok is a platform that allows you to watch a variety of short videos. While other social media such as Facebook, Twitter, or Instagram would focus on texts or images, TikTok really capitalizes on videos.
On other video-centric social media such as YouTube, content is recommended too, but you have more control over what you watch. You still have to search and click.
On TikTok, the content comes to you. TikTok feeds you content, rather than offering you content. You end up scrolling through a long, perhaps endless, list. Since the videos are short, you wouldn’t notice the time pass, and suddenly you’re there for hours. It’s addictive because it’s fast-paced and spectacle-based. It makes the decisions for you.
Falling into the rabbit hole
In your ‘For You’ feed, you would notice that it is indeed based on your past viewing. Even if you stay just for a short while on a particular video, it will then show the same type of content as you scroll through. I view it as a rabbit hole. With just taking a peek, you risk falling down, and then that type of content starts to bombard you.
It rises not just in frequency, but also in intensity. It all starts with just one video – that curious rabbit leading you down into a ‘wonderland’, that red-haired clown down the sewers.
As someone with Bipolar II disorder, I used TikTok for both my hyperactive and depressive periods. When I’m hyperactive, TikTok is appealing, because the fast-paced cacophony is able to stimulate my mind and give me that ‘rush’.
The videos would be all bright and energetic, inducing a mental and bodily response that, if sustained, would ultimately be dangerous. It’s essentially a prolonged ‘high’.
When I’m down, I would ‘escape’ my mind by mindlessly wandering through the feed. I would then encounter videos that affirm my emotions, and I would get trapped in that type of content for a long time.
When I’m depressed for instance, I would ‘hyperfixate’ or get stuck with just one video of sad literature and photos. The next thing I see would be a barrage of videos on self-harm and even death, mixed with videos using psychological language – psychospeak – that would claim to ‘unpack’ my feelings.
Escaping the pit
In those moments it felt like TikTok sort of helped, because the videos validate your feelings and even allow you to reflect on them, genuinely or not, accurately or not. However, in hindsight, it just reinforced those states of mind.
It delayed my recovery by pulling me into that rabbit hole, and it makes it so that escaping the pit is even harder.
The content would even make me feel that I’m more disordered than I already am by ‘pathologizing’ or treating as abnormal even the most mundane of habits. It’s even more scary because it goes beyond TikTok. Suddenly, my other social media would then exhibit the same content.
When I was already in a different state of mind, the feed would still show those types of content. So, what I would do is to manually try to ‘hijack’ the feed by watching, liking, and commenting on other types of content until these new genres come up.
It feels very heavy because you’re going against the wave. You need to make that conscious effort to escape the rabbit hole, but what if you’re still in a difficult place and you don’t have that energy and clarity of mind?
How to make TikTok safe for young people
I strongly believe that to make TikTok safe, firstly, the ‘For You’ page should not overly rely on viewing as an indicator of interest. The recommendations should be based more on what the user likes and follows, and other means through which users could intentionally express interests. With this, we are given more control over what we watch.
Having a separate ‘Following’ feed is not enough, because the platform must be able to balance in one feed the content the user has subscribed to, but also provide an opportunity to explore beyond. It feels like the ‘Following’ feed is deliberately made less enticing than the ‘For You’ feed, because the latter seems freer, even if it is not.
Second, there should be more visible warnings or disclaimers on content, especially if it involves mental health. What’s dangerous is that these videos use psychological language, or psychospeak, to seem authoritative, but we do not know whether the creators are indeed knowledgeable about what they say. Qualified professionals must therefore be distinguished from those who are not.
Third, TikTok should also show prompts telling the user the length of their stay on the platform. “You have been using TikTok for 30 minutes now. Do you want to continue?” This is to aid users who unconsciously stay in the rabbit hole for hours, thereby compromising their health and even daily functions.
Fourth, TikTok should have a more visible and accessible way to allow users to express that they do not like particular content and prefer not seeing them. This would avoid the burden of having to ‘trick’ the algorithm manually.
Fifth, I think the default setting of social media is to collect data on its users across platforms and websites. Social media may give us options to limit or stop some of these settings, but this shouldn’t be the default in the first place. This is so the reinforcing spiral of content does not spill over to the entirety of the person’s online presence.
Overall, the problem isn’t just limited to TikTok. It lies in the fundamental logic of social media: to feed you what it thinks you want. It removes your agency by assuming your agency. To be safe and rights-based, the counter-logic is simple: allow users to personalize their feeds themselves, through means that are visible and accessible, especially to young people. Allow users to ‘climb’, not to ‘fall’.
These guest blogs are written by independent youth activists and are intended to give a platform to the voices of young people. They originate from Pakistan and Argentina.
Duaa e Zahra, Pakistan
Duaa recently graduated with a Bachelor’s in Economics. She has worked with organizations including Amnesty International, UNCTAD, Talloires Network, Girls Human Rights Hub, Pakistan’s Human Rights Ministry, and Gallup, primarily on human rights and education, and she hopes to continue contributing to these areas through on-ground community engagement, policy & legal reform, and journalism.
TikTok’s Toll on Me
My hands seem to have a mind of their own as they continue scrolling through video after video. Seconds spill into minutes spill into hours, but when sunlight slips into my room, I’m surprised: it all felt like seconds to me. My eyes are red from the screen, and I think any and all repercussions end there: a sleepless night and consequent fatigue for the day. But if I zoom out from my life until everything is a bird’s eye view, I’d get a fuller picture: how self-criticism has become my native language, how others’ validation (or lack thereof) shapes almost everything I think and feel, how almost nothing is stimulating or surprising anymore, how boredom and isolation are overwhelmingly familiar, how I feel numb, excluded, and disconnected more often than not. Meanwhile, TikTok’s ‘For You’ page continues to suck me in, optimized to maximize my engagement even as I feel like my well-being is subtly disintegrating.
TikTok, one of the many sensations of our generation. Enter app, and let the app do the rest as it pulls you into a world made just “for you” for hours on end. What could go wrong?
A platform designed to be hyper-personalized and addictive risks being a breeding ground for toxicity: children might view triggering content that can impact their mental health, and TikTok’s algorithm can potentially transform that viewing into an endless rabbit hole of similar content. So, what if our rights to privacy, health, and free-thinking hang by a thread, as long as it generates profit for our favorite multi-billion dollar app?
You might argue that TikTok responds to what we seem to love by drowning us in everything that bears resemblance. I would counter that love (especially one only skimming reality’s surface) can be misleading. How else would you describe the bubble that overly customized content could spin around you until you can’t think or feel beyond what you’ve always thought and felt? Given the fragility of human emotion, seeing even one distressing video can prove to be triggering. But in TikTok world, viewing that one video is a slippery slope to viewing one after the other. The algorithm takes views of mental health content to signify an interest in the topic that it can respond to by flooding a user’s feed with similar videos that could, according to Amnesty International’s research, romanticize self-harm and depressive thinking, aggravating psychological vulnerabilities.
Fragility becomes more fragile when there’s nothing to hold it up. And that’s exactly what happens when TikTok, with its built-in addictiveness, starts consuming our entire lives. Sleep, school, work, relationships, and hobbies dim against the alluring light of the screen, and suddenly, sacrificing all of them to watch “just one more video” becomes almost laughably effortless. After all, what’s more exhausting than life distracting us from TikTok?
Consent is only truly consent when anchored in free and informed choice. But the problem here (among many other problems) is that we never meaningfully made this choice. Although TikTok, like other apps, has terms and conditions that you technically “agree” to, they can be challenging to understand and are often not child-friendly. This leaves young people in the dark about how their data will be used and the business model that is built on the massive collection of users’ data. Alarmingly, the impact is disproportionate along geographic boundaries. While TikTok is being forced to rein in its extractive surveillance-based business model for children in Europe, where there is stronger regulation, it takes advantage of weaker laws or enforcement elsewhere, including in the Global South, by not extending the same protections to young people there at the cost of their rights.
In response to extensive research and young people’s concerns that have uncovered the unforgiving reality of TikTok, Amnesty International is urging the company to rectify its operations. TikTok needs to take responsibility and eliminate the harm entirely by adhering to Amnesty’s calls for a global ban on targeted advertising aimed at children; personalization that’s opted-in and based on child-friendly language, informed consent, and active two-way communication; and an introduction of daily limits.
Deleuze said, “The painter does not paint on an empty canvas, and neither does the writer write on a blank page; but the page or canvas is already so covered with preexisting, preestablished cliches that it is first necessary to erase, to clean, to flatten, even to shred, so as to let in a breath of air from the chaos that brings us the vision.” As we erase, clean, flatten, and shred layers of digital pain, we will soon be left with the space to breathe and the time to dream. So, if I close my eyes now, zoom out from the lives we could have until everything is a bird’s eye view, I’d get a fuller picture: how empathy has become our native language, how our diverse experiences and visions shape the world, how almost everything is informative and inspiring, how creativity and community are overwhelmingly familiar, and how we feel impassioned, included, and connected more often than not.
Abril is a human rights activist, currently studying for a bachelor’s degree in political science. She is part of the youth activist group at Amnesty International Argentina and of the RIGHTSclick programme, and she also works as a young foresight fellow at UNICEF Innocenti.
The B Side of Social Networks
In today’s digital age, social media platforms like TikTok have become an integral part of our lives. These platforms offer us entertainment, knowledge and the possibility of connecting with people from all over the world. However, behind the facade of fun and connectivity, there is a B side that threatens our rights and exposes great social inequalities.
In the European Union and European Economic Area (EEA) countries, the protection of privacy rights and the regulation of social networks for children and young people have become priority issues in recent years. Data protection laws impose strong restrictions on the collection and use of personal information by platforms. These regulations have the main objective of protecting the privacy of users and ensuring that companies operate in an ethical and responsible manner.
TikTok is taking concrete steps to protect children under the age of 18 in the European Economic Area, the UK and Switzerland from certain abuses, such as banning targeted personalised advertising aimed at this demographic. This clearly demonstrates that the platform can function without relying on these abusive elements in its business.
However, this approach leads us to ask important questions about the situation in countries outside the EU and the EEA, which is completely different.
In Argentina and other countries in the global south, inequalities in the regulation of social networks are evident. The privacy policies used by TikTok are more data extractive for children under the age of 18, because regulation or enforcement of existing regulation is weaker. This creates a significant gap in privacy protection that becomes more evident every day and marks an important alert to demand action.
The disparity in regulation raises important questions about global inequality in the digital age. Why can the same platforms operate with different standards in different parts of the world? Why does TikTok respect the rights of children in Europe more than in other parts of the world?
The social inequalities and disparity relationships that have been evident globally for many years are now being recreated in digital environments. And in an era where the lives of most children and young people are intertwined with technology, it is essential to demand adequate and equal protection for all.
Protecting children and young people online must be a global priority, not a privilege limited to certain regions. TikTok’s measures show us that another path is possible. Equal rights and privacy protection must extend beyond the borders of Europe and reach all children and young people, no matter where they are.
It is both a political and corporate responsibility that we must demand and defend, to ensure that no child or young person is exposed to such relationships of disparity in the digital environment.