
‘What we’re trying to teach is empathy’: The grassroots strategies to de-radicalize the far-right

The Trump administration has fallen woefully short in dealing with the threat. That's where these groups come in.

Tech companies and the Trump administration have fallen woefully short in dealing with the threat of far-right extremism. But successful strategies are out there -- and grassroots activists have more than a few solutions. (Photo credit: Chip Somodevilla/Getty Images)

In the end, it was the violence that finally got to Scott Ernest.

Beginning in 2004 and continuing for the better part of a decade, Ernest was an avowed white nationalist and a regular user of the neo-Nazi website Stormfront. He posted there over 50,000 times, moderated one of its numerous forums, and helped make white nationalism “seem reasonable” for new recruits.

But things started to change for Ernest, who was vocally anti-violence despite his white nationalist beliefs, after July 2011. That’s when Norwegian far-right extremist Anders Behring Breivik — also a prolific Stormfront user — killed 77 people, most of them teenagers at a summer camp.

“I objected to Stormfront deleting posts [praising or justifying Breivik’s actions] — I was raging about it and constantly going into threads and yelling at people who were supporting it,” Ernest told ThinkProgress, explaining that leaving such posts intact would have exposed those users’ true nature. “Jack Boot, the Stormfront EIC at the time, gave me a temporary ban and said, ‘You are ruining our deniability by pointing out all the support.’ That was the first time I ever questioned the path.”


Despite his anger, Ernest kept his blinders up. “I still continued to go ‘the people committing these crimes were not the same as me,’” he said. “I’d deflect, deflect, deflect, and then there’d be another shooting and another.”

The final straw was in 2014 when neo-Nazi David Lenio started making threats to kill schoolchildren in Montana — Ernest’s home state at the time. It was then that he began to fully realize how poisonous the ideology was that he’d embraced.

Ernest initially planned to pull away from the movement quietly, but then his personal information got leaked to Andrew Anglin, who runs the infamous neo-Nazi website the Daily Stormer. “That ticked me off,” Ernest said. “At that point I said, ‘screw being quiet.’” He met with Christian Picciolini, a former skinhead who renounced the movement to run the grassroots peace group Life After Hate and toured through Montana speaking out against white nationalism. Ernest now lives in Florida, where he is pursuing a degree in public health.

“A lot of times… somebody will start off as some sort of conservative, regular and traditional. Then they start getting hit by all this propaganda.”

Ernest’s experiences highlight two crucial factors often overlooked in a time of increased fear over the resurgent far-right. First, the movement is not a monolith of avowed racists unable to change their ways. Many who tumble down the far-right rabbit hole are, much like recruits to gangs and other extremist groups, lost, alienated young men, corrupted by far-right propaganda and searching for meaning and camaraderie. That doesn’t excuse their vitriol, but it does present the opportunity to de-radicalize them and, perhaps, stop them from committing an act of far-right violence.


“Part of it is community. They’re afraid of being left outside and they want to be included,” Ernest said. “But on the other hand, they start linking up with all the people around them who have the same opinions and sometimes more extreme ones. Then you go deeper and you may go from someone who watches Alex Jones to someone listening to [the white nationalist podcast] The Right Stuff or reading the Daily Stormer, that’s kind of what I see.”

“A lot of times… somebody will start off as some sort of conservative, regular and traditional,” he added. “Then they start getting hit by all this propaganda — it might be [far-right YouTubers like] Stefan Molyneux, it might be Steven Crowder, it might be Dave Rubin. They start getting hit by all this propaganda and they start getting upset. Then they start seeking out more extreme people and then they go from there.”

Daryle Lamont Jenkins, a veteran anti-racism activist with the group One People’s Project who recently appeared in the documentary Alt-Right: Age of Rage, described a similar profile of young men caught up in the far-right.

“The kind of people that I see gravitate to these groups are the same people I see gravitate to cults, gangs, to saying they want to join ISIS,” Jenkins said in an interview with ThinkProgress. “They’re already lonely, they’re already disenfranchised as individuals, they have a lot of psychological damage, and they are trying to find a way to make their lives worth a damn. The wrong people simply got to them.”

The second crucial point that Ernest’s experience demonstrates is that it is possible for those involved with the far-right to de-radicalize and walk away. The problem is that de-radicalization is rarely the focus. Instead, efforts are mostly concentrated on a top-down approach, asking Big Tech corporations and law enforcement to do more. While that pressure is undoubtedly important, the mixed signals from those groups about combating the far-right and white nationalism mean that it is a fraught prospect to rely solely on them to curb far-right extremism.

In early March, for instance, in the wake of the far-right attack in Christchurch, New Zealand, Facebook announced it would be banning white nationalist and white separatist content from its site. A month later, HuffPost reported that content posted to the platform by Faith Goldy, a prominent white nationalist, had been left up because it did not violate the company’s policies, even though it repeatedly advanced far-right talking points.

The incident raised questions about the effectiveness of Facebook’s policy.

“The litmus test was, would [Facebook] remove Faith Goldy and would they do it quickly?” said Evan Balgord, executive director of the Canadian Anti-Hate Network, speaking with HuffPost earlier this month. “If they’re unwilling to do so under their new rule, then I don’t see any meaningful change.”


Evidence from YouTube, arguably the biggest culprit in pushing viewers down a conveyor-belt to far-right extremism, points to an even more blasé attitude. In March, Neal Mohan, YouTube’s product chief, told The New York Times that he didn’t buy the idea that the site was pushing its users toward extremist content. Four days later, Bloomberg reported that five senior YouTube staffers had left the site since 2017 over its inability or unwillingness to tackle extremist content.

Experts who spoke with ThinkProgress agreed that tech companies shared a notable portion of the blame for creating an environment for resurgent far-right radicalization — and then doing little to stop it.

“Darren Osborne attacked Muslims at Finsbury Park mosque after radicalizing himself,” said Matthew McGregor, campaign director of the British anti-racism group HOPE not Hate. “He went from watching a documentary about grooming gangs to doing his own research, from alt-lite to alt-right, in a very delineated, conveyor-belt type way.”

“The way that Facebook is handling moderation is that they’re subcontracting it out, and part of the problem there is not only trauma for the moderators but the threat of their own radicalization,” said Jessie Daniels, Ph.D., a sociology professor at Hunter College and a Data & Society fellow. “There’s this whole new set of problems we have no solution to yet. This is not being thought out by tech companies.”

“There are always going to be true [far-right] believers, unfortunately, but if there’s an alternative to that movement a lot of people will take it.”

A similar problem exists with law enforcement. FBI Director Christopher Wray warned in April that white nationalists posed a “persistent, pervasive threat.” At the same time, however, the Department of Homeland Security shuttered an intelligence unit specifically designed to combat white nationalism, and there have been multiple examples of law enforcement officers discovered to have far-right allegiances.

For Ernest and Jenkins, those inconsistencies and the over-reliance on law enforcement to deal with the far-right are losing propositions, especially bearing in mind the current administration. “The current government is doing whatever it can to enable these extremists, and it’s obvious to anyone who used to be an extremist,” Ernest said. “They’re noticing these people before they commit the crimes, dangerous people are being pointed out, and nothing is being done.”

“Insofar as the feds and Facebook will act, it will only happen through some insistence on our part, and our insistence comes from our actions. Charlottesville indicated that,” Jenkins said separately. “Everything you saw people do after Charlottesville means that, a) people can do something about it and, b) to the degree that we can’t it’s because people are in our way.”

This mixed response from law enforcement and Big Tech presents an opening for smaller groups and individuals to offer their own solutions to those doubting their involvement with the far-right.

“There are always going to be true [far-right] believers, unfortunately, but if there’s an alternative to that movement a lot of people will take it,” Jenkins said.

Life After Hate is a notable example of this type of activism. In late March, the group announced it was partnering with Facebook to help draw individuals away from white nationalism and far-right extremism. HOPE not Hate also runs anti-racism workshops for schools and colleges in the U.K., which aim to stamp out bigotry before it can metastasize into more serious extremism.

“What we’re trying to teach is empathy, and understanding the impact of discrimination on people,” McGregor explained. “I think we do have to have lots and lots of those conversations trying to rebuild kindness in a world where algorithms are promoting and rewarding hatred and violence.”

A more unusual but undoubtedly encouraging example of individual activism is the YouTube channel Faraday Speaks. Run by a former member of the far-right, it posted an inaugural video on March 21 documenting the account owner’s descent into extremism and imploring those still in the movement to leave the “decentralized cult.” It has since been viewed nearly 300,000 times.

Both McGregor and Daniels also underscored the need for renewed tech literacy, so people can spot the warning signs that an individual is being radicalized. The Baltic states of Latvia, Lithuania, and Estonia have adopted a similar approach in an effort to counter Russian digital disinformation. Such literacy matters all the more given that Ernest, by his own admission, regularly pushed talking points when he was active on Stormfront, like the idea of White Genocide, which he knew to be false but which helped attract recruits.

“Everyone said in the run-up to Charlottesville that ‘if you just ignore them they’ll go away.’ That is by far the biggest mistake you can make with this crowd.”

“There’s a fine line between trying to counter radicalization and tracking people on the internet but I do think that digital literacy needs to be a lot better and people need to spot when they’re being sold a bill of goods,” McGregor said. “The Swedes had a good program ahead of their elections so they could look out for bots and fake news, we need to be teaching people how to question what they’re seeing.”

For Daniels, this sort of tech literacy even applied to tech companies themselves. “The most challenging part of this problem is that most tech companies do not have anyone on staff, in the office or on contract, who understands white supremacy,” she said. “They’re starting from below zero to try and prevent something they themselves do not understand.”

Daniels added that there were some relatively minor changes that platforms like Facebook and YouTube could make, like removing the autoplay and autosuggest features on videos. This in turn could make it easier for activists to pressure these platforms by focusing on specific changes as opposed to broader, ill-defined demands like cracking down on white nationalism.

“I think it’s very important to stop this radicalization at its earliest stages,” Daniels said. “There’s something about the algorithm of autoplay and auto-recommendations that fuel that radicalization, and that, from a tech angle, is really easy to solve. That’s something that tech companies could address right now which would be helpful.”

It would be a mistake to think that these efforts could be easily accomplished, especially when the Trump administration is eagerly pulling any sort of federal funding for resources designed to support these grassroots efforts. Just days after the far-right shooting at the Tree of Life synagogue in Pittsburgh last November, the Trump administration pulled funding for the Countering Violent Extremism Grant Program, which was intended to fund local law enforcement and organizations combating far-right extremism.

In March, when asked in the wake of the far-right mosque shootings in Christchurch whether he saw white supremacy as a rising global threat, Trump responded, “Not really.”

According to Jenkins, that lack of official response from people in power is precisely what the far-right wants, because it allows them to continue to slowly but surely creep into public life.

“Apathy is the biggest thing,” he said. “The biggest threat to us and the greatest momentum given to them comes when people on our side don’t want to deal with it.”

He added, “Everyone said in the run-up to Charlottesville that ‘if you just ignore them they’ll go away.’ That is by far the biggest mistake you can make with this crowd.”