Facebook has employed a range of moderation efforts to combat abuse at scale across its platforms (Facebook, Instagram, and WhatsApp). While much work remains, a case study from 2018-19 highlights the impact of deplatforming white supremacists and suggests that Facebook’s efforts to deplatform influential organizations in particular are a step in the right direction.
In March 2018, Facebook removed Britain First, a UK-based extremist group known for violent anti-migrant positions. Britain First rapidly re-emerged on Gab, an alt-tech platform with few restrictions on hate speech and a significantly smaller user base. Researchers analyzed the impact of this deplatforming on Britain First, providing some of the only empirical evidence on the broader effects of platform enforcement actions.
The researchers found that the group’s move from Facebook to Gab resulted in a dramatic decrease in activity: one year after Britain First created its official Gab page, it had attracted fewer than 1% of the followers it had amassed on Facebook. Engagement also dropped precipitously, with image posts on Gab receiving roughly 1% of the comments and shares/reposts that image posts had received on Facebook. The group’s content also changed qualitatively in the transition: once on Gab, Britain First posted more extreme anti-LGBTQ messages and threats to dox individuals, material it had not posted on Facebook, likely because of Facebook’s stricter policies and enforcement. This suggests that while deplatforming can reduce the size and engagement rates of extremist communities, it may also push subsets of those communities toward more extreme content on alternative platforms4.