WSJ study shows youth are exposed to conflict-related content on TikTok

In a striking experiment carried out by The Wall Street Journal, automated accounts posing as 13-year-olds on TikTok were bombarded with controversial and often extremist content related to the Israel-Gaza war.


The study reveals the powerful influence of TikTok's algorithm, which builds a highly personalised feed based on user interaction.

The Wall Street Journal created several bot accounts registered as 13-year-olds in order to test TikTok's content-curation capabilities. These bots, which lingered only on videos about the conflict between Israel and Gaza, were swiftly flooded with related content. The algorithm served videos that were frequently polarized toward either pro-Israel or pro-Palestinian views; many incited anxiety and showed graphic scenes.

Within a few hours, the bots were presented with highly divisive videos, many of which advocated extreme views. They were also shown a variety of alarmist videos, some predicting apocalyptic events. A majority of these videos favored the Palestinian viewpoint, with a number depicting children in pain, protests, and scenes of death.

TikTok's response and the company's policies

TikTok said the experiment does not reflect the actual experience of teenagers, because real users interact with the app in varied ways, such as sharing, liking, and searching for videos. The platform also said it has removed millions of videos containing dangerous content.

The research raises questions about the effects of TikTok's algorithm on young users, particularly the way it can steer them into narrow channels of content. Exposure to such intense, polarized media at an early age could shape how they perceive complex global issues and could affect their mental wellbeing.

TikTok offers family-control features that let parents filter content, but the study suggests these safeguards are not enough. The findings could also draw the attention of regulators, given growing concern about the effects of social media on young minds.