Countermeasures in Practice

There is no one solution to dismantle white supremacy. But a range of countermeasures exist—organizational, political and technological.

Violent extremism draws people into increasingly radical circles. Interventions must break this dynamic to create lasting change.

There are a number of organizations working to curtail white supremacy by providing interventions that meet people where they are and help reintegrate them into society. It will take the work of many organizations and leaders to create meaningful, lasting change. Below, we profile a few of the teams doing impactful work in this space to highlight different prevention and disengagement strategies at work today.

Organizations making change

This list is by no means exhaustive; it is a sample of organizations that have taken on the challenging work of social intervention, in which every case is different.

  • Prevention

    Preventive efforts aim to disrupt the radicalization process and curb trajectories towards violent extremism.

  • Disengagement

    Disengagement efforts help individuals already involved in extremist movements to leave those groups and reintegrate into society.


Moonshot CVE

Moonshot CVE is a social enterprise specialising in countering violent extremism and other online harms. Moonshot designs new methodologies and technologies to enhance the capacity of partners to respond effectively to the threats posed by harms such as violent extremism, disinformation, and gender-based violence, online and offline. Moonshot’s work ranges from software development and digital capacity building to leading global counter-messaging campaigns.

Case study

Redirect: Canada

The Redirect Method was deployed for over a year across Canada’s thirteen provinces and territories, with additional localised campaigns in Canada’s six largest cities. The campaigns safeguarded individuals searching for harmful content related to jihadist and violent far-right ideologies, reaching a combined audience of over 170,000 at-risk individuals. Canada Redirect aimed to match content that aligned as closely as possible with the at-risk users’ original search intention, tapping into a range of ecosystems including music, gaming and literature. Content specific to the Canadian context was also utilised to increase its relevance and impact on the at-risk audience. During the final phase of the project, Moonshot conducted experiments to test innovative methods for increasing audience engagement using different messaging styles, the results of which identified best practices for future CVE digital campaigns in Canada.


Bundesverband Mobile Beratung (Mobile Advisory Network)

In Germany, a self-organized network of Mobile Advisors (Mobile Beratung) offers localized advice against right-wing extremism. Regional mobile advisory teams exist nationwide, from Berlin to Bavaria, and are professionally supported by a central association (Bundesverband Mobile Beratung)1. The mobile advisory teams work with those concerned about an instance of right-wing extremism to develop contextualized solutions and identify allies with relevant expertise.

The requests that the mobile advisory teams respond to are diverse: a planned demonstration by a right-wing extremist party, a racist trainer at the gym, swastika graffiti on bus stops, hate-speech stickers at school. Each regional team offers counsel by phone, email, video conference, or on-site visit to those interested in working against right-wing extremism. Advice is given confidentially, free of charge, and anonymously if requested. While they provide one-off assessments and advice, the mobile advisors can also develop long-term strategies for municipalities and civil society to respond to right-wing extremism. In some regions, they offer specialized counsel or programming for targets of extremist violence, local educators and associations, local political groups or companies, and relatives and friends.


Life After Hate

Life After Hate is committed to helping people leave the violent far-right, connect with humanity, and lead compassionate lives. It was founded in 2011 by former violent extremists with a singular cause: making sure that anyone wanting to leave a hate group would never have to do it alone. They know that hateful ideology is not a prerequisite for joining a hate group. Alienation, trauma, shame, and abuse are often pre-existing factors, and exiting a hate group means tackling these issues holistically.

To facilitate this process, they launched ExitUSA™, an intervention program operated by experienced, empathic formers. It provides around-the-clock tailored support, education, and referral services. It can be challenging to quantify the impact of social intervention programs like those provided by Life After Hate because every individual’s case is different. Since the deadly “Unite the Right” rally in Charlottesville, VA in August 2017, they have helped more than 500 individuals and their families to leave white supremacy.


Institute for Strategic Dialogue

Founded in 2006, ISD is a UK-based ‘think and do’ tank dedicated to understanding and innovating real-world responses to the rising tide of polarisation, hate and extremism of all forms. They combine anthropological research, expertise in international extremist movements, and an advanced digital analysis capability that tracks hate, disinformation and extremism online with policy advisory support and training for governments and cities around the world. ISD also works with civil society organizations worldwide, helping them understand the fractures in their communities and the paths to reconciliation. Their programmes operate in contexts as diverse as Kenya, Macedonia, France, Lebanon, and the UK.

In 2011, ISD built the first global network of former extremists dedicated to countering extremism in their communities. The 550 members of the Against Violent Extremism Network (AVE), the world’s largest network of former extremists and survivors of attacks, continue to provide interventions that address the broad range of underlying factors that can lead young people into extremism2. One targeted intervention ISD tested used direct messaging on Facebook as a means of connecting former extremists and counsellors with over 800 extremists on the platform. The programme tested the effectiveness of a variety of direct intervention approaches, finding that nearly two out of three (64%) right-wing extremists who responded engaged with intervention providers in a sustained way3.

Tech platforms can create immediate and lasting impact

The internet enables white supremacy to spread at an alarming rate, but technology can also help combat violent extremism. While more collaboration and more solutions are needed, platform policy changes and stricter enforcement over the last three years have produced some marked improvements.



Over the past few years, YouTube has introduced new methods for detecting and removing violative content, combined with tougher stances toward videos with supremacist content.



Facebook has made a concerted effort to combat violent white supremacy on its platforms, providing a case study that explores the impacts of deplatforming white supremacists.

It will take the work of many organizations, policymakers and technology companies to counter violent white supremacy.



1. “Mobile Beratung Gegen Rechtsextremismus.” Bundesverband Mobile Beratung e.V.

2. “Against Violent Extremism (AVE).” ISD, 27 Aug. 2019.

3. Davey, Jacob, et al. “Counter-Conversations: A Model for Direct Engagement with Individuals Showing Signs of Radicalisation Online.” ISD, 23 Oct. 2018.

4. Nouri, Lella, et al. “Following the Whack-a-Mole: Britain First’s Visual Strategy from Facebook to Gab.” RUSI, 4 July 2019.

In 2017, YouTube introduced a new treatment for videos with supremacist content, limiting recommendations and disabling features like commenting and the ability to share the video. While the videos remained on YouTube, these steps reduced views of those videos by 80 percent on average.

In June 2019, the company went a step further by updating its hate speech policy with new language prohibiting content that alleges any group is superior in order to justify discrimination based on qualities like race, religion, or sexual orientation. Thousands of the videos that were previously stripped of features were removed altogether. In the following three months, YouTube saw a fivefold increase in the number of videos removed and channels terminated for hate speech.
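The two-tier treatment described above, limiting features on borderline videos while removing outright policy violations, can be sketched as a simple policy function. This is a hypothetical illustration only; the classification labels and the fields of the treatment record are assumptions, not YouTube’s actual system:

```python
# Hypothetical sketch of a tiered moderation policy: borderline
# supremacist content keeps its watch page but loses recommendations,
# comments, and sharing; clear policy violations are removed entirely.

from dataclasses import dataclass

@dataclass
class Treatment:
    visible: bool            # video remains watchable on the site
    recommendable: bool      # eligible for recommendation surfaces
    comments_enabled: bool
    sharing_enabled: bool

def apply_policy(classification: str) -> Treatment:
    if classification == "violative":       # removed altogether
        return Treatment(False, False, False, False)
    if classification == "borderline":      # limited-features state
        return Treatment(True, False, False, False)
    return Treatment(True, True, True, True)  # no action
```

The 2019 policy change can be read, in these terms, as reclassifying thousands of previously “borderline” videos as “violative”.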

Facebook has employed a range of moderation efforts to combat abuse at scale across its platforms (Facebook, Instagram, and WhatsApp). While much more work remains, a case study from 2018-19 highlights the impact of deplatforming white supremacists and suggests that Facebook’s deplatforming of influential organizations in particular is a step in the right direction.

The researchers found that Britain First’s move from Facebook to Gab resulted in a dramatic decrease in activity: one year after creating its official Gab page, the group had attracted less than 1% of its previous Facebook following. Engagement with posts on Britain First’s page also dropped precipitously, with image posts on Gab receiving close to 1% as many comments and shares/reposts as image posts had received on Facebook. Once on Gab, Britain First’s posts also changed qualitatively, containing more extreme anti-LGBTQ messages and threats to dox individuals, content it had not previously posted on Facebook, likely due to that platform’s stricter policies and enforcement. This suggests that while deplatforming can reduce extremist community size and engagement rates, it may also enable subsets of these communities to engage with more extreme content on alternative platforms4.