Deplatforming the Far-Right Could Accelerate an Extremist Underground 

Deplatforming far-right figures like Alex Jones is just the tip of the iceberg. Photographs provided by Jared Holt. Graphic designed by Joanna Andreasson for Paradox.

The short-term benefits of deplatforming far-right actors and extremist content present a paradox when weighed against the long-term consequences. If the only collective answer to web-fueled extremism is to sweep it into the internet’s gutters while deeper systemic issues go neglected, the far-right’s most dangerous elements will likely accelerate.

Removing voices on social media that peddle disinformation, conspiracy theories, and extremist rhetoric can drastically reduce those voices’ prominence in national political discourse. It’s something large platforms can do quickly, and the move is usually a positive one in terms of protecting a free and open society. In some cases, this kind of action even thwarts attempts by foreign governments to exploit polarizing issues on social media in hopes of destabilizing the United States, as was the case when Facebook last year removed two networks of fake accounts linked to Iran and Russia. Removing bad actors as they flail on stage fosters healthier dialogue and strengthens the democratic tradition.

However, these toxic voices and their corresponding communities of support don’t often disappear altogether after a banning. Instead, they migrate to smaller, less moderated platforms. Dozens of social media platforms promising little-to-no “censorship” have existed for years on standby, ready to welcome these digital nomads in search of places to unleash their toxic beliefs and false claims. Several such platforms have explicitly advertised themselves to the far-right.

In my work as a professional researcher and reporter focused on political extremism and the internet, I often slide into the virtual shoes of an extremist-internet power user, consuming content at a rate that would put even the most dedicated fan to shame. My work does not stop after deplatforming occurs; time and again, what I find afterward is more concerning.

Some of the earliest adopters of these smaller, free-for-all platforms were extremists banned from mainstream social media in the wake of Unite the Right, the 2017 white supremacist gathering that wreaked havoc in Charlottesville, Virginia, and left Heather Heyer murdered in the street at the hands of a neo-Nazi. Then it was the Proud Boys. Then it was QAnon believers. Then it was unlawful militia organizations. Then it was election conspiracy theorists. And so on.

As more extreme figures and audiences from different sects of the far-right join these smaller platforms, they enter a new kind of media environment that lacks strong opposition to radicalism and fosters cross-pollination of egregious political ideologies. Without “libs” to own on the new platforms, I’ve observed several prominent figures delve further into the most hateful ideologies animating the far-right: anti-Semitism, racism, and xenophobia, to name a few. And when they spiral down the rabbit hole, they often bring their audiences along with them.

You don’t have to take my word for it: It’s already happened.

Alex Jones, the infamous conspiracy theorist behind the Infowars media outlet, was never a particularly level-headed broadcaster. Jones’ hallmark is his obsessive and outlandish promotion of a New World Order conspiracy; he believes that a shadowy ring of business, political, and media leaders is quietly orchestrating the slow-motion demise of the Western world.

Deplatformed far-right figure Alex Jones captured at the Richmond Gun Rally by researcher Jared Holt.

In 2018, Jones was banned from nearly every tech platform one can imagine. I played a role, albeit a small one, in making that so. He found himself unable to broadcast his increasingly extreme claims and ruthless harassment to the masses, and Infowars was forced to move to alternative platforms to host its content. (Eventually, Infowars would build much of its own digital hosting infrastructure.) The outlet became less dangerous to the public, and I would not change what happened even today.

But as the world turned on Infowars and the outlet went semi-underground, at least with respect to the mainstream social media ecosystem, Jones and his co-hosts began flirting with ideologies rarely welcomed on his program before. White nationalist youth activists began appearing regularly on air (one even took a temporary turn as a B-list host) and anti-Semites received friendly interviews. Jones spoke more about his willingness to die for his cause and argued that his fans should feel the same.

Perhaps whatever incentives for self-moderation once existed at Infowars dissolved. Maybe Jones always supported those ideas to some degree. Either way, out of the view of the masses, he became noticeably more vocal about these radical ideas. His audience lapped it up.

Milo Yiannopoulos, another outlandish and noxious presence in national politics, underwent a similar transformation after he was banned from some major platforms in early 2017. Yiannopoulos has catered more openly to white nationalists in recent years, going so far as to create a recommended reading list for young movement members that contained several works of literature favored by violent extremists.

It’s difficult to know for sure whether Yiannopoulos genuinely held these sympathies before his deplatforming (an exposé published by BuzzFeed News in 2017, built on leaked emails and video recordings, suggests he did). But in the transition to new platforms, Yiannopoulos abandoned all pretense of plausible deniability, just as Jones did.

When these audiences jump to new platforms, they find their beloved figures among others who are even more extreme. Those radical ideologues can then use their standing on the smaller platforms to pressure and sway the bigger influencers.

Far-right rally goers at the Richmond Gun Rally. Captured by researcher Jared Holt.

And when figures like Jones and Yiannopoulos tilt further into extremism, their audiences, numbering in the tens or hundreds of thousands, often do the same. Online followers can develop parasocial relationships with the internet personalities they adore, establishing unearned degrees of trust and camaraderie with the creators whose content they consume. That means at least some portion of those audiences will follow them from platform to platform and from ideology to ideology, no matter how dangerous.

There is little doubt about deplatforming’s efficacy in reducing the potential harm bad actors pose to the general public. But the action isn’t a one-step solution to the broader drivers of extremism online, and it ultimately treats the symptoms of radicalization rather than its causes.

Jared Holt is a resident fellow at the Atlantic Council’s Digital Forensic Research Lab, focused on extremism and the internet. Follow him on Twitter @jaredlholt.