Disinformation’s cozy hub: TikTok


Photo via Business Insider/Rafael Enrique/Getty Images

Delnaz Kazemi, Reporter

Previously published at The Progressive Teen

Editors’ note: This article was written before Congress conducted its March 2023 hearing on TikTok and its national security implications.


Anyone who studies political and social issues can see a significant lack of traditional civic engagement and understanding among young people in the United States: only “half of Americans 18 to 29 voted in the 2016 presidential election,” according to the New York Times. Although the new generation could be described as bright, engaged, and politically active, that engagement is hampered by a number of dangerous external factors.

One might think that in this era of widespread social media use, with young people being the largest consumers of digital information, those users would be well informed about recent news. But another issue follows close behind: disinformation.


Disinformation among young people has been a concern for many years, especially as social media has become a popular source of news for youth. Specific instances clearly show that social media disinformation works and causes extreme damage, such as the popularity of conspiracy theories about vote counts during the 2020 presidential election. TikTok is one of the most popular, and most dangerous, platforms for harmful political messaging among young people. The video-sharing app is not only a national security concern because of its data practices; its capacity to spread false information quickly to the younger generation is itself a national security threat, one that will shape the way people think, politically and otherwise, over the long term.

Like other social media platforms, TikTok uses an algorithm to serve users the content it predicts they want to see. What sets TikTok apart is that, rather than relying mainly on whom a user follows, it weighs how long the user watches each video, how they interact with it, and similar engagement signals.
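As a rough illustration of that idea only, and not a description of TikTok’s actual system, which is proprietary, a minimal engagement-based scoring sketch might look like the following. Every name and weight here is hypothetical.

```python
# Hypothetical sketch of engagement-based ranking. The signals and weights
# below are invented for illustration; they do not describe TikTok's system.
from dataclasses import dataclass


@dataclass
class Engagement:
    watch_fraction: float  # share of the video the user watched (0.0-1.0)
    liked: bool
    shared: bool
    rewatched: bool


def engagement_score(e: Engagement) -> float:
    """Combine watch time and interactions into a single relevance score."""
    score = 2.0 * e.watch_fraction  # watch time is weighted most heavily
    if e.liked:
        score += 1.0
    if e.shared:
        score += 1.5
    if e.rewatched:
        score += 1.0
    return score


# Videos similar to high-scoring ones get shown more often, which is how a
# feed can drift toward whatever content holds a viewer's attention.
print(engagement_score(Engagement(watch_fraction=0.9, liked=True,
                                  shared=False, rewatched=True)))
```

The point of the sketch is simply that a feed optimized for attention signals, rather than for accuracy or for whom a user chooses to follow, will keep amplifying whatever keeps people watching.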

According to a New York Times article, SumOfUs, a corporate accountability advocacy group, found that after creating a TikTok account and watching 20 popular videos, “the algorithm had switched from serving neutral content to pushing more election disinformation, polarizing content, far-right extremism, QAnon conspiracy theories and false Covid-19 narratives.”

This is incredibly dangerous. TikTok’s user base is expanding across age groups, and exposing children and teenagers to current news on an unregulated platform does not allow them to learn about the world accurately. An investigation by NewsGuard found that “within 35 minutes [of scrolling/engaging on TikTok], 8 out of 9 children were exposed to misinformation about Covid-19 while 6 were exposed to misinformation directly about Covid-19 vaccines.” This is just one example of the kind of misinformation children can quickly encounter on the app.

The only way to mitigate the situation is to educate the public, especially young people, about disinformation: how to spot it and how to find reliable sources. This could be done through classroom activities that test students’ ability to identify disinformation in articles, news reports, and other media once they have been taught what to look for. With this knowledge, students will be prepared to confront the flood of disinformation looming on TikTok and other social media platforms.


The views articulated in this piece are the author’s own and do not necessarily reflect the official stances of The Progressive Teen or High School Democrats of America.