How social media became a breeding ground for far-right movements IRL

The internet has become a breeding ground for hate, and the insidious rise of discrimination online is now spilling over from the underground into the real world, with tangible consequences.

Now that social media is so intrinsic to our everyday lives, it’s difficult to imagine there was a period when every thought couldn’t be shared with millions of people at the click of a button, with little to no consequences at all. Thanks to the anonymity that being online provides, there’s no real reason to hide your views any more, no matter how heinous or hateful. Gone are the days when individuals had to keep their prejudices to the confines of their minds out of fear of backlash, losing employment or being ostracised.

“People with hateful and discriminatory beliefs won’t bump into like-minded groups strolling down the streets, in social settings, or in their workplaces – that’s because there are real-life consequences for embracing such ideologies,” says Sabby Dhalu, co-convenor of Stand Up to Racism and joint secretary of Unite Against Fascism. “The internet certainly kept these groups satisfied for a period, but over time these communities have spread like wildfire, and now they’re leaving the confines of social media and gaining the confidence to take their beliefs into the real world.”

These far-right groups are thriving like never before, gaining prominence not only on the internet, but in the mainstream, on TV and in politics. “This kind of bigotry is nothing new, but it seemed that at one point we were finally making progress in combating it – now we have the likes of [the home secretary] Suella Braverman using derogatory terms in front of millions,” Dhalu adds. So how exactly have these beliefs once again become prominent and acceptable enough to be shared in the real world?

“The internet is a double-edged sword. We all agree that we should have as much privacy as possible. However, with privacy comes the confidence to say almost whatever you desire,” says Amarnath Amarasingam, assistant professor at the School of Religion at Queen’s University in Ontario, Canada. “There are regulations, terms of service and guidelines to help combat hate speech and discrimination, but with so many users, it’s near impossible to police every single person on these huge platforms. The current systems put in place work to an extent, but there’s far too much slipping through the cracks.”

One social media platform where these hateful ideologies have been able to thrive is 4chan, a message board with more than 22 million unique monthly visitors that is notorious for spreading the influence of the far right. In 2022, 4chan user Payton Gendron murdered 10 people in a predominantly black neighbourhood of Buffalo, New York state, detailing his motivation in a 180-page manifesto filled with pseudo-scientific racism plagiarised from other extremists on the site. On Discord (an instant messaging social platform), chat logs believed to have been written by Gendron include the words: “I only really turned racist when 4chan started giving me facts.” Gendron’s manifesto borrowed heavily from another written in 2019 by Brenton Tarrant, who killed 51 people at two mosques in Christchurch, New Zealand. According to a government report, Tarrant was also a frequent user of 4chan and its sister board, 8chan. Tarrant, who uploaded his manifesto to 8chan before his attack, had in turn significantly plagiarised that of Anders Breivik, who murdered 77 people in Norway in 2011 in an anti-immigration-motivated attack.

“The furious nihilism, racism and angst of 4chan has become deeply worrying over the years,” Amarasingam says. “The number of dangerous ideologies that have been birthed on that platform and led to real-world racially motivated attacks is something that just doesn’t get enough attention in mainstream media for whatever reason.” According to Amarasingam, among the driving forces behind these attacks are conspiracies like the great replacement theory – a white nationalist far-right belief that white people are being culturally and demographically replaced by non-white people. “It’s playing a greater role in mobilising individuals to violence because it has a somewhat unique ability to foster a sense of emergency. You can hear it all over the Buffalo shooter’s manifesto – a deep sense of urgency that there is an imminent collapse of white people and white culture,” the professor continues.

While we may view 4chan as the most extreme example of racism finding a home on the internet, fuelling real-world violence, as time has gone on, extreme levels of hate are becoming more prominent on traditional platforms. “Reddit gives rise to toxic subcultures, YouTube to a network of reactionary-right racist influencers, and co-ordinated harassment is pervasive on Twitter,” says Callum Hood, head of research at the Center for Countering Digital Hate (CCDH), a British non-profit with offices in London and Washington. “There’s a part that all of these platforms have to play when it comes to furthering far-right ideologies, and it seems to only grow more prominent as time progresses. Every few years we see one of these platforms rise as the go-to for these types of communities.” And many within these communities come with a face: rather than feeling protected by anonymity they are finding fame because of their ability to voice this hate.

“Take YouTube, for example, which only recently has seen a number of far-right commentators build huge fan bases,” Hood says. “Of course these individuals, like [the political commentator] Steven Crowder, often operate through dog whistles, but as of late, it’s almost like they’ve stopped trying to hide it. And now it seems as though the far right have adopted Twitter as their next breeding ground for hate.”

Following “free speech absolutist” Elon Musk’s $44 billion acquisition of Twitter in October 2022, hate speech on the platform has increased substantially. Before Musk bought Twitter, according to the CCDH, slurs against Black Americans showed up on the social media service an average of 1,282 times a day. After the billionaire became Twitter’s owner, the number jumped to 3,876 times a day. These changes are “alarming”, Hood says, adding that the CCDH had “never seen such a sharp increase in hate speech, problematic content and formerly banned accounts in such a short period on a mainstream social media platform”.

Clearly these extremist views are no longer restricted to the dark corners of the internet. “It’s now throughout all of the mainstream platforms too, providing the illusion that these views are somewhat acceptable,” Hood continues. “If someone is beginning to delve into the rabbit hole of the far right, they do not have to dig deep on the internet to enter that pipeline any more. These groups are right there in your face on the most popular sites in the world.”

In a sign of the times, alt-right public figures have seen their followings and fame grow online – the movement’s leaders are now almost celebrity-like figures to look up to. Take Nick Fuentes, for example, an avowed white supremacist livestreamer who has often called for the subjugation of Jewish people and the embrace of Christian nationalism in his home country of America. He regularly attacks African Americans, declaring at one point that Jim Crow segregation “was better for them, it’s better for us, it’s better in general”. Incredibly, this individual was catapulted further into the spotlight after he caught the eye of Kanye West during the rapper’s antisemitic tirades, introducing him to a much larger and more mainstream audience than he would ever have been capable of reaching on his own. “It’s no surprise that individuals with these damaging ideologies gain more and more confidence with their views as prominent figures in the space rise,” says Caroline Orr, a postdoctoral research associate at the University of Maryland studying misinformation, who has tracked far-right figures such as Fuentes and their networks for years. “There’s a certain level of validation that comes alongside that – suddenly it’s not just those in your community that agree with you, but celebrities and politicians with millions of followers and massive influence.”

That strength in numbers has led to far-right groups coalescing in the real world, assuming that their worldview is more accepted than ever; the values they hold are no longer underground in the traditional sense. As previously mentioned, in the UK, we’re even witnessing bigotry at the top levels of government. Braverman has regularly made discriminatory comments about migrants in the UK, describing the number of asylum seekers in the country as an “invasion”. Meanwhile, in April, the home secretary said grooming gangs were almost entirely made up of British-Pakistani men who “hold cultural attitudes completely incompatible with British values”. Dhalu believes that these attacks by politicians will only contribute to more discrimination from the public. “When you see some of the most powerful figures in the country throwing around xenophobic terminology, with no regard for the influence that can have, what kind of message does that send out to the rest of the country? With regard to Braverman’s numerous comments, she’s practically giving the far right in the nation a free pass to behave the same way. If top MPs can say it, why can’t they? That’s how these individuals will view it.”

What makes those comments even more shocking is that just a few months prior, in February, far-right protesters clashed with police at a hotel in Knowsley, near Liverpool, that was housing asylum seekers. There were reports that the protest was organised by the far-right Patriotic Alternative – a group welcomed back onto Twitter following Musk’s takeover before being banned a few months later – but the group denied this in social media posts. Knowsley’s Labour MP Sir George Howarth said the demonstration was triggered by an “alleged incident posted on social media” and criticised misinformation about refugees being “feather-bedded” at the hotel. In a tweet, Stand Up to Racism blamed the violence on the government’s “scapegoating of refugees”, which Dhalu claims is validating far-right groups’ beliefs. “There’s no doubt that the normalisation of hatred against immigrants contributed to the incident that occurred in Knowsley, and if it continues, we’ll reach a boiling point that could lead to more and more of these protests. When politicians use language that you’d expect from someone like Tommy Robinson, you just have to wonder how exactly we got to this position, why almost nothing is being done about it, and how far it can go.”

Overall, the far-right pipeline is more accessible than ever, and those susceptible are always just one click away from heading down the rabbit hole. “Sadly, it’s impossible to police the entire internet, so these views will always have a home on the web,” says Sofia Koller, senior research analyst at the Counter Extremism Project (CEP). “However, clearly not enough is being done to combat these far-right groups, especially on mainstream social media platforms, and if we allow their ideologies to thrive and continue to spread, it will only lead to more horrific attacks both on and away from our screens.”

So what can be done? “There are organisations like ours that urge anti-racists to stand up to these ideologies through marches and peaceful protests that are great for getting our voices heard,” Dhalu says. “There are charities like Stand Against Racism & Inequality and Stop Hate UK, who are also doing great work in the community to fight against discrimination. However, there is so much more that needs to be done, especially online. Hatred is like a virus. We’ve already let it spread more than it ever should have. Just because you might not see it doesn’t mean it’s not happening. We must actively search out these groups and take a stand. Otherwise we’ll see the virus continue to spill out into our everyday lives.”

Writer: Chris Saunders
Banner Image Credit: GB News