By Ryanna Reid
TikTok is a popular platform that allows users to watch and discover millions of personalized short videos on its algorithm-driven “For You” feed. It also allows users to create and share videos of their own from their mobile devices. The app has over 1.5 billion monthly active users worldwide, making it one of the fastest-growing social media platforms. About six in ten teens ages 13 to 17 (63%) say they use TikTok, including 57% who use it daily and 16% who say they are on it “almost constantly” (Eddy). Many young users enjoy TikTok as an outlet for self-expression, often engaging in challenges and trends promoted by other users. It provides them with entertainment while offering chances to go viral, making fame seem more accessible. While the platform fosters creativity and a sense of connection among users, it also has flaws that are simply too significant to be ignored. With millions of young users scrolling daily, its failure to effectively regulate harmful content makes it a dangerous platform for youth. It may actively shape their behavior through harmful content, disguising a threat to safety as “entertainment.” I will argue that young users are often exposed to inappropriate material due to ineffective safety measures, that TikTok trends may promote dangerous challenges, and that the algorithm can foster self-destructive behaviors by presenting users with self-harm and suicide content.
While TikTok exposes youth to dangerous material that may lead to harmful behaviors, the platform also has its advantages. In the article “What’s TikTok Doing to Our Kids? Concerns from a Clinical Psychologist,” Alan Blotcky states that TikTok has positive aspects, such as providing entertainment, fostering creativity, and offering a space for shy children to build social networks. Like other social media platforms, it allows users to engage with content from individuals who share similar perspectives or experiences. It serves as a way for people to discover new interests, learn new things, and interact with others. TikTok promotes dance trends that encourage users to learn and challenge each other through online choreography, making it a source of entertainment. It also allows users to showcase their personality, skills, and talents, which appeals to young audiences. The app gives them the experience of multiple social media platforms in one. Like YouTube, TikTok is a video-based platform. Like Facebook and Twitter, it is consumed primarily by scrolling a feed of short, easily digestible posts. Like Netflix, it relies on a recommendation algorithm rather than follower-based networks. And like Snapchat and Instagram, TikTok videos are primarily created on mobile phones, appealing to younger users, who are more likely to own smartphones than computers (Berman).
However, these advantages are minor in comparison to the significant risks young users will face. While Blotcky acknowledges some benefits, he ultimately asserts that the company needs to take more responsibility in regulating harmful content and that parents should play a more active role in monitoring their children’s online activity. His article reveals that more than one-third of TikTok’s daily users are under 14, making them particularly vulnerable to its influence. Blotcky’s investigation highlights that within just a week of creating an account, suicide and self-harm content was easily accessible to minors. Additionally, TikTok exposes children to sexually explicit and drug-related videos, as well as dangerous viral challenges like the “devious lick” trend, which encourages theft and vandalism in schools. The article also raises concerns about child safety, highlighting how predators can easily target minors on the platform.
The platform’s safety features for youth and sensitive audiences are ineffective, and the consequences of harmful content are too significant to be exchanged for simple entertainment and online networking. While TikTok claims to offer safeguards such as content moderation, Restricted Mode, and parental controls, these measures have proven ineffective at fully protecting young audiences. The responsibility for safeguarding young users becomes even more critical as these safety measures continue to fail. TikTok needs to improve its content moderation by implementing stricter age verification to prevent young users from accessing harmful content. Additionally, the platform should establish an automated system that monitors and verifies content appropriateness so that harmful material can be flagged and removed before it is posted publicly.
The trends that stem from TikTok may pose a significant threat to youth. Many children are easily influenced by what they see others taking part in. As such, they might feel encouraged to mimic these actions, believing them to be exciting, without fully understanding the risks involved. When users on the platform start viral trends, others may copy their actions. Given the app’s large child audience, it is inevitable that some children will be exposed to these trends. The downside is that some trending TikTok challenges promote harmful behaviors that negatively impact youth, sometimes with fatal consequences.
A tragic example of a harmful TikTok trend is the “Blackout Challenge,” which has been linked to multiple child deaths. In “The TikTok ‘Blackout Challenge’ Has Now Allegedly Killed Seven Kids,” Mitchell Clark reports on the deadly impact of this trend. The article cites lawsuits filed by parents whose children were exposed to the challenge through TikTok’s algorithm-driven “For You” page rather than by actively searching for it. Clark notes that at least seven children under the age of 15 died in 2021 while attempting the challenge, which involves choking oneself with belts, purse strings, or similar objects until losing consciousness. Despite TikTok’s efforts to block searches for the challenge, harmful content continues to resurface.
The platform’s content policies consistently fall short in protecting minors from life-threatening trends. Its response to harmful content is reactive rather than proactive, as the app only takes action after incidents occur. Additionally, a teen was arrested for assaulting a disabled teacher as part of the “Slap a Teacher” TikTok challenge (Byrne), illustrating the legal and ethical consequences of harmful viral trends. The recurring harm these challenges cause to children underscores TikTok’s inability to adequately safeguard vulnerable youth. This raises concerns about its effectiveness in ensuring user safety, especially given its large underage audience.
These trends not only promote physically harmful behaviors but also encourage destructive psychological patterns, such as poor decision-making, desensitization to risky actions, and a disregard for long-term consequences. Challenges like the “Skull Breaker Challenge,” in which children kick another person’s legs out from under them mid-jump, have resulted in severe injuries such as spinal cord damage and even paralysis. Similarly, the “Dragon Breath Challenge,” which involves dipping food in liquid nitrogen, encourages kids to inhale vapors that can cause internal organ damage and breathing difficulties (Werteen). These challenges are just a few examples of the dangers that viral TikTok trends have posed to youth. Such trends are often normalized on the platform, making them seem less risky than they truly are and drawing young users into destructive behavior. This normalization may erode their decision-making skills, even when an action’s negative outcome is obvious. It is clear that kicking someone’s feet out from under them mid-jump can cause severe injury, yet the excitement of participating in a viral trend seems to override rational thinking. Even the name “Skull Breaker” should signal the challenge’s danger. However, repeated exposure to such harmful content may dull a user’s emotional and behavioral responses, causing them to overlook obvious risks. In “Analysis of the Psychological Impact of TikTok on Contemporary Teenagers,” Zheng Lin states, “Nowadays, the phenomenon of following the trend has gradually become pathological, and they do not care what the consequences will be.” This reflects how some of TikTok’s viral challenges can warp young users’ perception of risk and consequence, making dangerous behavior seem not only acceptable but appealing. The pull of instant validation through likes on TikTok challenges can become all-consuming, impacting children’s mental and emotional well-being.
It is easy for kids to lose track of time and priorities as they chase the next viral trend or seek acknowledgement from their online peers (Das). The desire to participate and potentially gain social validation from these challenges may impair their ability to see the consequences of their actions, promoting reckless behavior and desensitization over time.
In addition to promoting and normalizing dangerous challenges, TikTok’s algorithm may contribute to self-destructive behaviors by repeatedly exposing users to similar harmful content. The app’s algorithm is designed to recommend content that aligns with each user’s preferences, displaying videos on the “For You” page based on their interests. As on other social media platforms, the algorithm gradually evolves as a user’s viewing habits and preferences change. At times, however, it may form associations between certain types of content that surface harmful material. For instance, in “#Paintok: The Bleak Universe of Suicide and Self-Harm Videos TikTok Serves Young Teens,” John Byrne describes how Raw Story created a fake TikTok account posing as a 13-year-old user to investigate what kind of content the platform would recommend. On the first day with the teen account, Raw Story paused on police and military videos. In response, TikTok’s algorithm recommended videos about guns, which eventually transitioned to content about depression. Within just two days, the account was shown disturbing videos related to self-harm, suicide, and eating disorders, despite TikTok’s supposed bans on such content. Even Raw Story staff members found the content distressing, emphasizing its potential impact on impressionable children. Additionally, Byrne criticizes advertisers for placing ads next to self-harm content, suggesting that companies profit while children’s mental health suffers. This example reinforces concerns that TikTok’s algorithm not only exposes children to harmful content but also fosters an environment where self-destructive behaviors may be normalized.
While TikTok offers a space for creativity and entertainment, it also exposes users to content that can have a detrimental effect, especially on young, impressionable minds. Consider the tragic case of the “Blackout Challenge,” which has resulted in the loss of several young lives. Real-life tragedies have shown that the line between entertainment and endangerment has become blurred on this platform. Without urgent improvements in safety measures and stricter content regulation, TikTok will continue to be a platform where harmful behaviors are normalized and youth are exposed to danger. The responsibility falls on both the platform and society to protect the vulnerable and ensure that what is intended as “entertainment” does not come at the expense of youth safety and well-being. The fact that young users on the platform are highly susceptible to harmful content should not be taken lightly. Imagine your child. Is a moment of destructive digital “entertainment” worth the risk to their safety?
Works Cited
Berman, Marc. “Why Do Children and Young People Love TikTok so Much?” Programming Insider, 23 Mar. 2021, https://programminginsider.com/why-do-children-and-young-people-love-tiktok-so-much/.
Blotcky, Alan. “What’s TikTok Doing to Our Kids? Concerns from a Clinical Psychologist.” New York Daily News, 18 Nov. 2021, https://www.nydailynews.com/2021/11/18/whats-tiktok-doing-to-our-kids-concerns-from-a-clinical-psychologist/.
Byrne, John. “#Paintok: The Bleak Universe of Suicide and Self-Harm Videos TikTok Serves Young Teens.” Raw Story, 27 Oct. 2021, https://www.rawstory.com/paintok-tiktok/.
Clark, Mitchell. “The TikTok ‘Blackout Challenge’ Has Now Allegedly Killed Seven Kids.” The Verge, 8 July 2022, https://www.theverge.com/2022/7/7/23199058/tiktok-lawsuits-blackout-challenge-children-death.
Das, Reshmita. “How to Address TikTok Challenges with Children: A Guide for Parents.” Mobicip, 25 Dec. 2023, https://www.mobicip.com/blog/how-address-tiktok-challenges-children-guide-parents.
Eddy, Kirsten. “8 Facts about Americans and TikTok.” Pew Research Center, 20 Dec. 2024, https://www.pewresearch.org/short-reads/2024/12/20/8-facts-about-americans-and-tiktok/.
Lin, Zheng. “Analysis of the Psychological Impact of Tiktok on Contemporary Teenagers.” ResearchGate, Feb. 2023, https://www.researchgate.net/publication/368472470_Analysis_of_the_Psychological_Impact_of_Tiktok_on_Contemporary_Teenagers.
Werteen, Nancy. “TikTok’s Toxic Trends: What Parents Need to Watch Out For.” WFMZ.Com, 3 Dec. 2024, https://www.wfmz.com/features/life-lessons/tiktoks-toxic-trends-what-parents-need-to-watch-out-for/article_046b6288-a744-11ef-a530-2ff3a99d94a3.html.
Course: ENGL1010 Spring 2025
Assignment: Research Argument
Instructor: Kevin Lamkins
Instructor Comments: Ryanna created a well-researched and convincing argument. She does not shy away from the counterarguments, and ultimately makes her case with compelling evidence.
Photo Credit: “TikTok-im Chat” by Christoph Shultz (Creative Commons License)