Demystifying the problem
Global, Connected and Decentralized
To understand violent white supremacy, we need to learn from former white supremacists, abandon myths about lone actor violence, and investigate the informal networks through which this movement spreads.
Violent white supremacism is flourishing through robust communication networks with weak social ties [1]
Over the course of two years, Jigsaw interviewed dozens of former white supremacists, a number of whom have never shared their stories publicly. We believe that conducting research with former extremists offers valuable perspectives on violent movements, including how people join and how, eventually, they leave. From granular insights on the chat apps they use to personal catalysts for leaving extremist groups, every story offers nuanced insights for countering extremism. We recognize that each individual’s story is just that—an individual story.
While some researchers question the credibility of formers’ insights, formers have made important contributions in the past and we feel it is our responsibility to listen [2][3][4] to better understand the human side of the challenge. We have sought to approach this research ethically, prioritizing confidentiality, consent, and an understanding of when and when not to engage vulnerable people [5]. Names used are pseudonyms wherever requested.
In tracing these formers’ paths into and out of extremism, we draw lessons that inform our understanding of modern white supremacy and dispel common myths. This effort has reinforced four insights put forward by other researchers.
In March of 2019, Brenton Tarrant stormed a mosque in Christchurch, New Zealand and murdered 51 people.
A few months later, Patrick Crusius entered a Walmart in El Paso, Texas and murdered 23 people.
Neither terrorist’s manifesto claimed allegiance to a formal white supremacist group. Each man appeared to be acting alone, and the characterization of these events as the unhinged behavior of self-radicalized “lone wolves” became a prevailing narrative.
The “lone wolf” label is both glamorizing and misleading. It is true that people like Tarrant and Crusius commit violence outside of a material support group or formal terror cell [6]. More sophisticated analyses therefore distinguish a growing category of lone actor terrorists: people who perpetrate their attacks alone but are active participants in discursive networks.
In fact, it is lone actors’ posts in online communities that afford law enforcement and other experts vital signals for countering violent extremism [7][8][9][10].
Not only were Tarrant and Crusius not "lone wolves," they were part of the same distributed online network. Both wrote similar warning messages to their online communities before their attacks, using similar rhetoric and in-group language to broadcast their intent [11]. Both posted manifestos on 8chan describing similar supremacist ideals of a white ethnostate. Both described a perceived existential threat to whites rooted in the same conspiracy theory: a “great replacement” by immigrants. And both described the weapons they used in their attacks as “gear,” as though they were players in a video game.
Their online communities described the death tolls of each attack as “high scores,” treating the attacks as events in the same apocalyptic, live-action role-playing game [12]. The very first sentence of Crusius’ manifesto is “I support the Christchurch shooter and his manifesto” [13].
Section I
The mythology of the lone wolf
Brenton Tarrant and Patrick Crusius are part of a violent white supremacist movement that is growing online. Attacks like theirs may be carried out by individuals, but those individuals radicalized within a highly interactive network [14]. As J.M. Berger notes, “terrorism and extremism are inherently social activities, usually carried out by individuals because they dramatically overvalue their membership in a particular social grouping.” [15]
These attacks are part of a distributed, global movement
Terrorist attacks can inspire others within the movement to commit “copycat” atrocities elsewhere, providing a role model and attack blueprints for them to follow [16][17]. We spoke with a former member of Feuerkrieg Division, a violent neo-Nazi organization, who described how the Christchurch attack was a source of motivation and inspiration. The group frequently shared Tarrant’s manifesto in online chat forums and encouraged others to “top Tarrant’s score,” a common video game reference among violent white supremacists encouraging even higher death tolls in future mass casualty attacks [18]. This gamification of violence by white supremacists online is common, and it is incentivized either top down, where group leaders offer members badges, titles, or other affordances for hateful acts, or bottom up, where individual forum members create online competition around a distributed virtual scoreboard tallying deaths or other acts of violence [19]. Regardless of the incentive structure, white supremacist violence has social motives: achieving in-group belonging and status by participating in shared, gamified goals.
Several formers cite the allure of shared community and a sense of belonging as the key drivers of their radicalization. Interacting with members of the in-group, which can be as simple as replying to tweets or a comment thread, is vital to learning their sub-cultural norms and reinforcing mental walls against perceived out-groups. For white supremacism, this radicalization process is known as “red pilling” and entails learning the coded language, foundational conspiracy theories, and violent ideals of racial supremacy [20][21]. While this radicalization increasingly occurs sitting alone in front of a computer screen, these individuals are busy building and inhabiting highly interactive communities.
“Social camaraderie, a desire for simple answers to complex political problems, or even the opportunity to take action against formidable social forces can coexist with, even substitute for, hatred as the reason for participation in organized racist activities.”
After returning home to the States from tours with the Marines, James was looking for a new mission, and a new band of brothers.
He first joined the KKK chapter near where he was living, but when that group didn’t share his orientation for action, he went searching online for alternative white supremacist communities.
He went “hate group shopping” by browsing the SPLC’s hate map, which tracks the locations of hate groups, ironically using it to find groups near him that he could join.
James proceeded to join and leave three increasingly exclusive and violent neo-Nazi groups over the next few years before leaving the movement altogether.
He became convinced that hate prevented people from building the healthy, supportive community he sought.
“Red pilling” is the process of buying into an extremist ideology, and entails learning the coded language, conspiracy theories, and ideals of the movement.
Signaling allegiance to a group or ideology often becomes an all-consuming project for extremists. Even once an extremist has been formally admitted into a group or online forum, formers described the need to prove themselves as “down for the cause” or “white enough” by committing more and more time and energy. This performance of dedication often escalates in a competitive fashion, resulting in hate speech and violence.
“When people plant bombs, burn crosses, or run their cars into crowds of peaceful protesters, they are reinforcing their place in a community by inflicting terror.”
This grotesque emphasis on performing an attack for one’s community is exemplified by the terror attack on a synagogue in Halle, Germany in October 2019. During the attacker’s trial, he recalled discovering mid-attack that his live-stream had stopped and remarked, "That's bad, because the stream was more important than the attack itself" [22]. Experts note that this seeming contradiction between physical isolation from society and online attention seeking is particularly pronounced for lone actor extremists [23]. In a study of lone actor extremists that was not confined to violent white supremacists, the National Center for the Analysis of Violent Crime found that 96% of lone actor extremists “produced writings or videos intended to be viewed by others,” including videos, blogs, or manifestos online [24].
As a result of these social dynamics, white supremacist attacks often serve dual purposes: inflicting terror to advance the movement’s agenda and signaling to the in-group that the attacker belongs to, supports, and reinforces the community’s worldview. By treating these attacks as isolated incidents by rogue actors, we fail to see and address the deep roots of the problem. This has important implications for interventions, as well as for law enforcement and intelligence gathering.
Today’s white supremacist movement is moving away from formal groups
Part of the reason for the persistence of the lone wolf myth is that white supremacists are not always part of visible, formal groups. The internet lowers barriers for those curious about a supremacist idea to anonymously learn about it, lurk in supremacist spaces online, and eventually interact with others as part of loose, informal networks. This enables supremacists to pick and choose which aspects of supremacist ideology resonate and engage selectively with those ideals. In other words, supremacists no longer have to find a group with which they fit; there is less friction to joining the distributed movement because they can retain idiosyncratic beliefs. This distributed network of extremism has been described as a “post-organizational paradigm,” where white supremacists increasingly radicalize and operate together without easily recognizable branding, membership or hierarchy [25].
Attacks are seldom coordinated by a central authority
This decentralized organizing principle predates the internet. Today’s white supremacist movement builds in part on the concept of “leaderless resistance,” which numerous other political extremist movements have also used. In 1983, white supremacist Louis Beam argued that like-minded individuals should operate independently, without central coordination, in order to evade law enforcement and create “an intelligence nightmare” [26]. While Beam advanced this strategy in the context of existing, organized terror cells of the KKK and Aryan Nations, many recent violent white supremacist attacks, particularly by groups like Atomwaffen Division or the Base, draw on leaderless resistance. In many cases there is no evidence that perpetrators are taking orders from a source of authority; instead they act on their own interpretation of the movement’s ideals and broadcast those ideals back through manifestos and live-streamed attacks [27].
Section II
The Rise of Informal Groups
Radicalization is accelerating as informal online extremist groups lower the barriers to entry
The rise of informal white supremacist communities online over the last 20 years has made it easier to find, join, and become radicalized into white supremacist groups. Whereas groups in the past might have recruited at white power rock shows or sympathetic bars, online radicalization no longer requires this investment of in-person time and physical vetting. Today a person can spend hours online moving through increasingly radical content and discussion forums. The immersive nature of these online forums and communities can create a “new normal” that contradicts offline social norms [28]. This also means that younger children and teens, who could not be recruited at bars or white power concerts in the past, can more easily join white supremacist groups at impressionable ages. One former interviewed had entered a white supremacist forum at age 13 and, drawn to the provocative memes and secretive chat forums, had adopted national socialist beliefs by 15. By 18, he was part of a neo-Nazi terrorist organization proscribed in the UK.
“Membership” rarely implies paying your annual KKK dues as it once did; today, it more often entails accessing a hyperlink into a private online forum.
The internet not only facilitates easier entry into radical circles; it may also accelerate the radicalization process. In one study of US Islamist extremists, those who radicalized after 2010 and cited the internet as a source of radicalizing material engaged in extremist actions in less than half the time of those who radicalized with little internet use [29]. While more research is needed to understand the internet’s impact on the speed of radicalization, a number of the former white supremacists we interviewed who radicalized online went from discovering white supremacist ideas to openly advocating for them in just three to six months.
“Every time I went online it was like putting on a mask, one where you're shielded from empathy, from consequences... I’d say all sorts of horrible things. And then I’d get offline and hang my mask up and go back to my family”
Formal and informal extremist groups often coexist online, playing complementary roles
While much of today’s white supremacist movement operates in looser, less formal networks, there are certainly still white supremacist groups that operate with rigid hierarchies and formal membership. In fact, our interviews with formers suggest that formal group membership is not necessarily declining as these informal groups swell in numbers. Rather, group membership is a fluid concept and groups are not mutually exclusive. Formers have reported that participating in formal hate groups and informal online groups served complementary functions for recruitment and radicalization.
A former white supremacist, Cat, says her journey from suburban housewife to recruiter for a white supremacist group took her from mainstream social media into private chat groups. Before she’d even met an extremist in person, she was vetted and invited into Identity Evropa. Her radicalization took under six months.
Section III
White supremacy is a globally networked movement
The white supremacist movement has always been global. But the internet allows white supremacists to exchange ideas through immediate, low-cost communication channels and to recruit from anywhere. White supremacist influencers in particular have built dedicated followings around the world, empowered by mainstream and fringe social networks. They exploit a romanticized ideal of “the West,” unifying supremacists across Europe and North America around transferable racist ideas, such as the claims that migrants pose a threat or that Islam is incompatible with democracy.
Alex’s online communities were global from the first time she logged on at age 16, exchanging memes and ideas with white supremacists from Australia to Estonia. She had never traveled abroad, but the connections she made in online white supremacist forums ultimately led her to move from the US to Europe. There, she was able to help set up a branch of one of the deadliest neo-Nazi terrorist organizations in the world.
Formal hate groups can operate like international franchises
Formal hate groups often adopt familiar entrepreneurial business tactics to expand their reach; they start small, brand themselves to grow, rebrand when there is a hitch, and internationalize their efforts once they have gained enough traction locally. They may use a franchise model to grow more rapidly, decentralize leadership, and extend overseas. Some groups even gather at annual leadership conferences to exchange best practices and update their branding, much as a corporate franchise might. Seeding spin-off groups internationally is particularly popular with openly violent groups that operate as terrorist cells, such as Atomwaffen Division, which has headquarters in the US, Germany, and Estonia, and more divisions across Canada, Russia, and Europe [30]. (Atomwaffen Division in the US announced its disbanding in March 2020 following an FBI crackdown, then re-emerged in July 2020 under the name National Socialist Order [31].)
One subset of the white supremacist movement that is not as openly violent, the self-styled “Identitarian movement,” employs slick branding tactics and strict franchising rules in an effort to thinly veil its white supremacist messaging [32]. These groups have crafted extensive guidelines to unify their marketing and branding efforts and create an image of mainstream acceptability. Generation Identitaire in France pioneered this approach, sharing its anti-migrant and Islamophobic messaging through gauzy videos and fliers that mask the inherent violence of white supremacy under the guise of “European heritage.” [33]
Collaborating across borders helps organizations evade detection and enforcement
An American identitarian group built to mirror its European counterparts, the American Identity Movement (AIM, formerly known as Identity Evropa) maintains transnational ties. Former members report speaking regularly with their European contemporaries, including Martin Sellner in Austria and Arktos Media in Sweden. These groups exchange ideas in private messaging apps, swapping stories of how to attack their political enemies and learning recruitment tactics from one another. While these groups aspire to grow large enough to establish nationwide white ethno-states, former members noted that in-person activities were often limited to cities where at least 80% of the population was white.
Recent research by scholars at George Washington University identified multilingual hubs for the white supremacist movement. They identified the encrypted chat app Telegram as the major vector for multilingual white supremacist discourse, hosting groups with explicitly extreme names (e.g. “SoupNazis”) as well as known neo-Nazi terrorist groups like the Nordic Resistance Movement. Importantly, the researchers observed that some of these channels appear to express more radical or hateful speech in local languages, particularly German and Russian. When the same channels switched to English, their messages were less hateful, suggesting self-censorship to prevent detection and/or modulation to attract foreign members [34].
Section IV
The Alt-Tech Ecosystem
Evading detection on mainstream platforms is a challenging task. So as white supremacists faced increasing removals from social media platforms, they began to design or co-opt alternative online platforms to host their communication. This constellation of new online spaces, unified by a philosophy of free speech absolutism and a desire to self-govern in ways that enable hateful activity, is known as the “alt-tech ecosystem” [35]. Alt-tech platforms tend to be less hierarchical by design, allowing decentralized control over an infinite number of niche sub-groups [36].
Alt-tech platforms facilitate dangerous, unmoderated spaces for white supremacy to flourish
White supremacists use alt-tech platforms for a range of functions from recruitment to coordination. The ecosystem comprises two types of platforms that are used in tandem:
Alt-tech social networks
These platforms—Gab, Bitchute and Voat, among others—were designed to mirror mainstream social media platforms, with easily shareable visuals and memetic content, except that they take an absolutist approach to free speech. These social media lookalikes host unmoderated extremist discourse and offer entry points for recruitment [37][38].
Closed-community communication apps
Messaging apps and gaming platforms that facilitate private group discussions, including Telegram, Discord, and Parler, can allow dangerous conversations and behavior to go undetected. Messaging apps are used when hate actors are more concerned about privacy and rapid response, such as when organizing events or coordinating harassment. The messaging apps that have gained popularity with white supremacists offer common security features like end-to-end encryption and coordination capabilities like audio, video, and multi-person channels [39].
Using open social networks and private messaging in tandem has made hate communities more resilient to removals by mainstream platforms with stricter policies against hate, allowing them to regroup and return quickly.
Alt-tech platforms are used as complementary back-ups to mainstream social media
While white supremacist influencers from mainstream social media have encouraged their followers to migrate to these new platforms, adoption has been slow.
Because much of the alt-tech ecosystem embraces privacy and anonymity for its members, these groups are also fairly inaccessible to casual observers. Some even go so far as to require vetting and interviews to gain entry.
At 16, Taylor began spending time on online forums discussing “ironic” hate speech, making and sharing “edgy” memes.
Taylor had never felt racist or misogynist, but her new online friends piqued her curiosity by contradicting mainstream liberal views on race science and feminism. (Scientific racism is an attempt to co-opt the authority of science to justify racial prejudice. Race is a product of culture and human imagination; it has no scientific basis. The ideas presented by proponents of scientific racism are designed to make racism seem scientific and acceptable, when in reality it is neither.) As their discussions devolved into unironic support for extremism and violence, they moved to communities on Discord, Wire, and Telegram.
In these private channels, Taylor befriended more radical extremists. They groomed her as a “prospect,” prompting her for her thoughts on fascist ideologies and sharing texts ranging from well known white supremacist tracts to niche “accelerationist” manifestos. (In the case of white supremacists, the accelerationist set sees modern society as irredeemable and believes it should be pushed to collapse so a fascist society built on ethnonationalism can take its place. To put it most simply, accelerationists embrace terrorism. Source: splcenter.org)
Taylor didn't know what she was a prospect for, but she was eager to accept the alluring private chat invitations in order to access the “hidden knowledge” these groups claimed to possess. She gained status by reading and making like-minded comments.
Before long, Taylor was the one reaching out to new prospects, running “honeypot servers” to vet potential recruits, and creating propaganda materials for fascist groups.
Across ideologies, former white supremacists described getting “vetted into” online groups based not only on friendships, but also on hard skills they could offer to the group’s cause, like software engineering or military training.
Women were often vetted in as recruiters because some in the movement believe women have stronger people skills for persuading new recruits.
Taylor ultimately got vetted into a neo-Nazi group, Atomwaffen Division, where she became a recruiter herself.
On Discord, administrators of white supremacist chat groups (“servers” in Discord parlance) often vet new entrants through a rigorous two-step process. New entrants must answer a series of written questions, such as “what is the solution to illegal immigration?”, and explain how they would apply relevant ideologies, like fascism or accelerationism [40]. If the administrator deems the written answers sufficient, they may further vet the entrant through an audio interview to assess personality fit and weed out imposters.
Because of these barriers to widespread adoption, alt-tech platforms are sometimes used as “back-ups” by white supremacists who expect that they might be removed from mainstream social media platforms. Influencers often link to their alt-tech profiles in their mainstream social media profiles to connect redundantly with their followers in anticipation of deplatforming.
A former neo-Nazi, Luke, explained that YouTube was where he found “normie content” like punk rock videos, whereas he used YouTube’s alt-tech lookalike, Bitchute, to find white supremacist content like white power music. In this way, alt-tech served complementary functions to mainstream platforms by circumventing their restrictions against hateful and extremist content.
Extremists self-censor and encode their language on mainstream platforms
To remain on mainstream platforms, extremists share advice with one another about what words to use and avoid in their public-facing messages.
The terminology white supremacists use changes rapidly to avoid detection. They often choose dual-use words that have legitimate purposes; for example, white supremacists popularized “jogger” as a derogatory replacement for the n-word in the aftermath of Ahmaud Arbery’s murder in February 2020 [41].
They also use this tactic in the naming of groups and fan pages, rapidly rebranding to mislead or evade enforcement on tech platforms. A notable example is the Boogaloo movement, which faced a crackdown by Facebook in June 2020, when the platform removed 220 accounts, 28 pages, and 106 groups that advocated for violence in affiliation with the Boogaloo. Many Boogaloo pages returned to Facebook shortly thereafter with duplicitous names like “CNN bois,” having rebuilt their groups from their redundant networks on alt-tech platforms [42].
“We used smaller private platforms to coordinate alongside bigger platforms to pull people in...”
White supremacists use a hierarchy of platforms to communicate
Former recruiters from Identity Evropa (IE) have described using layers of communication channels. The conversations, they explain, become less sanitized and more openly violent as people move to deeper, core layers.
One former Identity Evropa recruiter described being first invited via a direct message on Twitter into a text message group for “Alt-right women” where they discussed “women’s issues” from childcare to their favorite celebrity videos online.
As she gained the trust of IE’s leadership, she was invited into smaller and smaller online groups with names like “Women of the West”, moving to more self-governed platforms like Discord and Slack.
Once she entered this private inner circle, the facade of respectable pro-Western rhetoric on their public social media pages fell away, revealing that IE’s leadership held neo-Nazi beliefs and expressed a thirst for violence against minorities.
The organizers of the Charlottesville rally exemplify this tactical doublespeak across different communication platforms. They advertised a rally for “white pride” in their sanitized promotions on mainstream platforms, while in their private chats on Discord—which were leaked in 2019—they encouraged violence and glorified fascist ideals [43].
Sam is a former member of a group that helped organize the Charlottesville rally. She remarked that these groups are “wolves in sheep’s clothing,” with an intense focus on managing their public personas, which contrasted starkly with their private identities.
She explained that identitarian leaders explicitly disavowed violence at events in public communications...
...yet in the private chats she was a part of, they talked openly about what weapons they planned to bring and what violent acts they hoped to see at Charlottesville.
Some extremist groups and forums were so concerned with crafting a “respectable” public reputation that they published rules on what members could post. In one of Identity Evropa’s Discord servers, leadership forbade members from posting “extremism,” “ethnic chauvinism,” and “vulgar language,” despite the inherent contradiction of being an extremist, white chauvinist group. IE’s leadership were often the first to break these rules [44].
For the foreseeable future, alt-tech platforms will not replace social media so much as complement it to facilitate the spread of hate in parallel, interconnected and self-governed online spaces.
New perspectives
The Way Out
Much of the conversation about white supremacy today focuses on the radicalization process. But the way out can be perilous and requires insights from those who've made the journey.
References
Hamm, Mark S, and Ramon Spaaij. “The Age of Lone Wolf Terrorism.” Columbia University Press, 22 Feb. 2017, https://cup.columbia.edu/book/the-age-of-lone-wolf-terrorism/9780231181747
Tapley, Marina, and Gordon Clubb. “The Role of Formers in Countering Violent Extremism.” ICCT, 12 Apr. 2019, https://icct.nl/publication/the-role-of-formers-in-countering-violent-extremism/
“The Role of the Internet in Facilitating Violent Extremism, Part I: Insights from Former Right-Wing Extremists.” VOX-Pol, 22 July, https://www.voxpol.eu/the-role-of-the-internet-in-facilitating-violent-extremism-part-i-insights-from-former-right-wing-extremists/
Scrivens, Ryan, et al. “Former Extremists in Radicalization and Counter-Radicalization Research.” Radicalization and Counter-Radicalization, vol. 25, Emerald Group Publishing, 2020, https://www.emerald.com/insight/content/doi/10.1108/S1521-613620200000025012/full/html
Galloway, Brad. “The Ethics of Engaging Former Extremists to Counter Violent Extremism Online.” Moonshot CVE, 5 Sept. 2019, https://moonshotcve.com/ethics-of-engaging-formers/
Hamm, Mark S, and Ramon Spaaij. “The Age of Lone Wolf Terrorism.” Columbia University Press, 22 Feb. 2017, https://cup.columbia.edu/book/the-age-of-lone-wolf-terrorism/9780231181747
“HSDL: The Nation’s Premier Collection of Homeland Security Documents.” Homeland Security Digital Library, https://www.hsdl.org/c/lone-wolf-terrorism/
United States, Federal Bureau of Investigation, Behavioral Analysis Unit. Lone Offender: A Study of Lone Offender Terrorism in the United States, U.S. Department of Justice, 2019, https://www.fbi.gov/file-repository/lone-offender-terrorism-report-111319.pdf
Byman, Daniel L. “Can Lone Wolves Be Stopped?” Brookings, Brookings, 15 Mar. 2017, https://www.brookings.edu/blog/markaz/2017/03/15/can-lone-wolves-be-stopped/
Hamm, Mark S, and Ramon Spaaij. NCJRS, 2015, Lone Wolf Terrorism in America: Using Knowledge of Radicalization Pathways to Forge Prevention Strategies, https://www.ncjrs.gov/pdffiles1/nij/grants/248691.pdf
Macklin, Graham. “The El Paso Terrorist Attack: The Chain Reaction of Global Right-Wing Terror.” Homeland Security Digital Library, CTC Sentinel, 2019, https://www.hsdl.org/?view&did=832570
Feldman, Matthew. “Terrorist ‘Radicalising Networks’: A Qualitative Case Study on Radical Right Lone-Wolf Terrorism.” SpringerLink, Palgrave Macmillan, Cham, https://link.springer.com/chapter/10.1007/978-3-319-65566-6_2
Berger, J.M. “The Strategy of Violent White Supremacy Is Evolving.” The Atlantic, Atlantic Media Company, 7 Aug. 2019, https://www.theatlantic.com/ideas/archive/2019/08/the-new-strategy-of-violent-white-supremacy/595648/
Nacos, Brigitte L. “Revisiting the Contagion Hypothesis: Terrorism, News Coverage, and Copycat Attacks.” Perspectives on Terrorism, vol. 3, no. 3, 2009, pp. 3–13. JSTOR, https://www.jstor.org/stable/26298412?seq=1#metadata_info_tab_contents
Langman, Peter. “Different Types of Role Model Influence and Fame Seeking Among Mass Killers and Copycat Offenders.” SAGE Journals, 2 Nov. 2017, https://journals.sagepub.com/doi/abs/10.1177/0002764217739663
Schlegel, Linda. “Points, Rankings & Raiding the Sorcerer’s Dungeon: Top-down and Bottom-up Gamification of Radicalisation and Extremist Violence.” GNET, 27 Mar. 2020, https://gnet-research.org/2020/02/17/points-rankings-raiding-the-sorcerers-dungeon-top-down-and-bottom-up-gamification-of-radicalization-and-extremist-violence/
Schlegel, Linda. “Points, Rankings & Raiding the Sorcerer’s Dungeon: Top-down and Bottom-up Gamification of Radicalisation and Extremist Violence.” GNET, 27 Mar. 2020, https://gnet-research.org/2020/02/17/points-rankings-raiding-the-sorcerers-dungeon-top-down-and-bottom-up-gamification-of-radicalization-and-extremist-violence/
Knight, Ben. “Germany: Anti-Semitic Attack Suspect Shows No Remorse in Court.” DW.COM, 21 July 2020, https://www.dw.com/en/german-synagogue-attack-trial-starts/a-54241893
Comerford, Milo. “Confronting the Challenge of 'Post-Organisational' Extremism.” ORF, 19 Aug. 2020, https://www.orfonline.org/expert-speak/confronting-the-challenge-of-post-organisational-extremism/
Byman, Daniel L. “Can Lone Wolves Be Stopped?” Brookings, Brookings, 15 Mar. 2017, https://www.brookings.edu/blog/markaz/2017/03/15/can-lone-wolves-be-stopped/
Byman, Daniel L. “Can Lone Wolves Be Stopped?” Brookings, Brookings, 15 Mar. 2017, https://www.brookings.edu/blog/markaz/2017/03/15/can-lone-wolves-be-stopped/
Gaudette, Tiana, et al. The Role of the Internet in Facilitating Violent Extremism: Insights from Former Right-Wing Extremists. https://www.tandfonline.com/doi/full/10.1080/09546553.2020.1784147?scroll=top
“Atomwaffen Division.” Wikipedia, Wikimedia Foundation, 22 Oct. 2020, https://en.wikipedia.org/wiki/Atomwaffen_Division
Makuch, Ben. Neo-Nazi Terror Group Atomwaffen Division Re-Emerges Under New Name, 2020, https://www.vice.com/en_us/article/wxq7jy/neo-nazi-terror-group-atomwaffen-division-re-emerges-under-new-name
Ebner, Julia. “Going Dark.” Bloomsbury Publishing, 2020, https://www.bloomsbury.com/uk/going-dark-9781526616784/
Zúquete, José Pedro. “The Identitarians.” Notre Dame University Press, 18 July 2019, https://undpress.nd.edu/9780268104214/the-identitarians/
Velásquez, N., et al. “Hate Multiverse Spreads Malicious COVID-19 Content Online beyond Individual Platform Control.” ArXiv.org, 21 Apr. 2020, https://arxiv.org/abs/2004.00673
Malter, Jordan. “Alt-Tech Platforms: A Haven for Fringe Views Online.” CNNMoney, Cable News Network, 2017, https://money.cnn.com/2017/11/10/technology/culture/divided-we-code-alt-tech/index.html
Roose, Kevin. “The Alt-Right Created a Parallel Internet. It's an Unholy Mess.” The New York Times, The New York Times, 11 Dec. 2017, https://www.nytimes.com/2017/12/11/technology/alt-right-internet.html
Dearden, Lizzie. “Inside the UK-Based Site That Has Become the Far Right's YouTube.” The Independent, Independent Digital News and Media, 22 July 2020, https://www.independent.co.uk/news/uk/home-news/bitchute-far-right-youtube-neo-nazi-terrorism-videos-a9632981.html
Donovan, J., Lewis, B., & Friedberg, B. (2018). Parallel Ports. Sociotechnical Change from the Alt-Right to Alt-Tech. Post-Digital Cultures of the Far Right. Retrieved 2020, from https://transcript.degruyter.com/view/book/9783839446706/10.14361/9783839446706-004.xml
Owen, Tess. White Supremacists Have a Disgusting New Code for the N-Word After Ahmaud Arbery's Death, 2020, https://www.vice.com/en_us/article/bv88a5/white-supremacists-have-a-disgusting-new-code-for-the-n-word-after-ahmaud-arberys-death
“Boogaloo, Rebranded.” Anti-Defamation League, 13 July 2020, https://www.adl.org/blog/boogaloo-rebranded
“#Rules (Discord ID: 405211244023382016) in Nice Respectable People Group, Page 1.” #Rules (Discord ID: 405211244023382016) in Nice Respectable People Group, Page 1 | Unicorn Riot: Discord Leaks, https://discordleaks.unicornriot.ninja/discord/channel/301