Should your company hitch itself to the TikTok wagon?
There is growing evidence to suggest that communicators should proceed with caution when it comes to the latest player to grace the social media stage.
TikTok content, differentiated from other social platforms by its flood of lip-syncing and dance "challenges" and short-form comedy videos, has widely been viewed as a refreshing diversion. Marketing and communications professionals have seen the app's participatory format as a potential antidote to the passive viewership typical of social media. Perhaps more importantly, TikTok's audience is sizable and growing: the app has been downloaded more than 1.5 billion times and counts 800 million monthly active users.
In stark contrast to the platform's sunny outlook, a new study from the University of Haifa in Israel found that TikTok is fast becoming a cache of extremist sentiment and hate speech. The study, conducted from February through May 2020, revealed far-right extremist content promoting fascism, racism, anti-Semitism, chauvinism and xenophobia, researchers said. From encouraging violence to promoting conspiracy theories and glorifying terrorist organizations, the 200 posts studied mirror the harmful content that Facebook, Twitter, YouTube and others have been grappling with for years.
This isn't the first time the app has come under scrutiny. Since the platform, owned by Chinese tech company ByteDance, began gaining steam among global users in early 2018, concerns around security and surveillance have swirled, and its use has been banned by the U.S. military.
The attraction of social media platforms to bad actors is nothing new. Just as hate and discrimination are an undeniable part of human history, hateful discourse has long spread on platforms built to amplify content to broad audiences. The Haifa study calls the phenomenon the "democratization of communications driven by user-generated content," which has increased extremists' awareness of social media's potential for weaponization.
What makes TikTok a particular risk for the general public (and by proxy, PR practitioners), according to the study, is its attraction to young users. Close to half of TikTok's users are between the ages of 16 and 24, and "although its Terms of Service prohibit users under age 13, many users who appear in videos are clearly younger. This creates an environment of vulnerability that is exploited by extremist groups," said the researchers.
While TikTok's algorithm for selecting content to promote in viewers' feeds is a closely guarded secret, BuzzFeed's Lauren Strapagiel conducted an experiment that found the app tended to exclude content made by Black creators, particularly around the #BlackLivesMatter hashtag. This may be partially due to the app's "bubble effect," in which the algorithm rapidly serves up similar content after a viewer likes or otherwise engages with a TikTok video. TikTok issued an apology and said it is taking steps to promote diversity and inclusivity, including establishing a "diversity task force," but the platform has become a repeat offender in allowing racist content to spread. Some users have nevertheless managed to turn the tide, leveraging the platform for activism in solidarity with #BlackLivesMatter; most recently, a veritable army of K-pop fans allegedly helped ensure an underwhelming turnout at a Trump rally in Tulsa, Oklahoma.
It remains to be seen whether TikTok will get its problems under control any time soon, though it seems unlikely, given that more established platforms continue to struggle under the weight of toxic content. Regardless, communicators would be well-served to keep weighing the platform's risks against its potential rewards as they determine whether launching or continuing a presence on TikTok is advisable for their brand.
Follow Sophie: @SophieMaerowitz