
WSJ study reveals exposure of youth to violent content on TikTok

In a revealing experiment conducted by The Wall Street Journal, automated accounts posing as 13-year-old TikTok users were bombarded with extremist and often polarizing content related to the Israel-Gaza war.


The study highlights the powerful influence of TikTok's algorithm, which builds a highly customized feed based on each user's interactions.

To examine how TikTok curates content, The Wall Street Journal created several bot accounts registered as 13-year-olds. The bots, which did nothing more than pause on TikTok videos about the conflict between Israel and Gaza, were soon served a flurry of similar content. The algorithm surfaced videos that were often sharply partisan, favoring either Israeli or Palestinian viewpoints, and many stoked anxiety or showed graphic scenes.


Within a matter of hours, the bots were served highly polarized videos, many of which endorsed extreme views. They were presented with a stream of alarmist content, with some videos forecasting apocalyptic scenarios. A majority favored the Palestinian viewpoint, with a number depicting suffering children, protests, and descriptions of death.

TikTok's response and the company's policies

TikTok said the test does not reflect the actual experiences of teens, because real users interact with the app in varied ways, such as sharing, liking, and searching for videos. The platform also pointed to its efforts to remove millions of videos containing dangerous content.

The research raises serious questions about the effect of TikTok's algorithm on young users, specifically the way it can guide them down content rabbit holes. Exposure to such intense and polarized media at a young age may affect both their understanding of global issues and their mental health.

TikTok provides family-safety features that let parents filter content, but the test suggests they are not enough. The results could also draw the attention of regulators, given growing concern about the effects of social media on children's minds.
