
Not just Kanye – it’s an online coalition of hate

TikTok and the Dark Web don’t merely host haters; they merge and amplify their messages, feeding a malignant mix to impressionable online audiences
Illustrative. Hacker. (via Twitter)

The fusion of hate that is mounting online is bolstered by an environment that, to borrow the embattled rapper Ye’s own song lyrics, makes hate groups “harder, better, faster, stronger.”

Now, having been reinstated on Twitter following a temporary ban, Ye continues to deliver antisemitic rants that ignite a conversation showing no signs of abating anytime soon. Yet the widespread discourse on the artist formerly known as Kanye West ignores the highly concerning trends on social media that brought us to this very point in the first place.

Unencumbered by censorship, hate groups are free to raise money, coordinate activities, sell and purchase weapons, and set up a virtual library for their propaganda. Take the neo-Nazi website The Daily Stormer, which moved to the Dark Web when its open web registration was revoked. Its founder, Andrew Anglin, remains very much active and, unsurprisingly, is a fan of Ye’s rhetoric despite his open contempt for Black people.

That alliance of hate is a telling one. On the surface, a white racist should not support a Black man, but their shared hatred of Jews has them speaking the same language. With bigotry so easy to spread online, hate groups band together as strange bedfellows to form a disturbingly potent coalition of hate.

Shifting to less restricted platforms

After more than 20 years of research, my students and I have identified five main trends surrounding the behavior of antisemitic and other hate groups: migrating to other social media platforms, using the Deep Web, storing and disseminating propaganda in the cloud, developing a secret language of hate speech, and capitalizing on the power of algorithms. 

Hate groups have shifted from mainstream platforms to less restricted ones such as TikTok and Telegram. As networks like Facebook, YouTube, and Instagram began to crack down on hate speech and disinformation, placing many offenders in “Facebook jail,” and as those platforms’ demographics naturally skewed older, many hate groups migrated to Chinese-owned TikTok – the world’s fastest-growing social media platform. There, hate speech is virtually unregulated and available to predominantly young audiences. While China has clamped down on freedom of expression in its own country, TikTok is allowed to spew hateful content worldwide.

A recent study I co-authored found antisemitic comments on TikTok surged an alarming 900% from 41 in 2020 to 415 in 2021. Usernames with antisemitic titles such as “@holocaustwasgood” or “@eviljews” increased from only four in 2020 to 59 in 2021. Considering that a third of TikTok’s more than 138 million active users are 10-19 years old, the fact that they are exposed to persistent misogynistic, racist, and antisemitic content is highly disturbing. Exacerbating the problem, TikTok’s algorithm feeds its users similar posts to the ones they already engage with, sending them down an endless rabbit hole of hate. 

Taking hate more seriously

Some dismiss TikTok as an innocuous forum for children who want to be creative. Yet TikTok’s pattern of catering to young, impressionable, naïve audiences, combined with the impact of bad-faith actors who post hateful content, must be taken more seriously. Despite claims that TikTok and other platforms are monitoring content, a new variety of antisemitism has emerged in which hatred is articulated through “dog whistles” or coded language used for a specific audience. Jews, for example, are referred to as Skypes (to rhyme with kikes). Black people are “Googles,” Latinos are “Yahoos,” and Muslims are “Skittles.”  

Current concerns also extend to more mainstream platforms like Twitter, whose acquisition by Elon Musk casts doubts on whether the social media giant will engage in any form of content moderation – even when it comes to hate speech.

But it is on the Dark Web where antisemitic content truly thrives and festers. Inaccessible via what’s known as the “Surface Web,” where you and I search for restaurants, order books and play Wordle, the Dark Web operates in the vast walled-off realm of the Deep Web. It’s a lawless and faceless environment where hateful groups find a comfortable home not only on their own but more concerningly together, as a coalition that amplifies their individual and collective impact.

While it may be tempting to shrug off Ye’s defenders as a hateful nuisance, it is crucial to remember that violent terrorist groups spew similar rhetoric and also have access to the Deep Web. ISIS had to use cloud storage when navigating mainstream platforms became impossible. Thousands of films from Al-Qaeda, ISIS, Hamas, and Hezbollah are floating in internet archives.   

In an ideal world, Ye’s words should not matter. However, they reflect the cesspool of online hate that translates to violence on the streets. The Anti-Defamation League has documented a rise in antisemitic incidents in the US from 927 in 2012 to a record-high 2,717 in 2021. That is no coincidence.

We are missing a vital opportunity to call out not just Ye the individual, but the chronic trend of Jew-hatred itself. And what starts with Jews never ends with them. It spreads to other groups and reflects a decay in the moral fiber of society.

Let’s not talk about Ye. Let’s redirect the conversation toward forming a new coalition that counters the fusion and coalition of hate. While the Dark Net presents a tough challenge and there is no way to regulate it, it can be studied. Because words – whether they are uttered by an anonymous source or a celebrity – can and do kill.

About the Author
Dr. Gabriel Weimann is a professor of communication at the University of Haifa, a visiting professor at the University of Maryland, and the author of nine books, including “Terrorism in Cyberspace: The Next Generation.”