The Problem

Toxicity is Damaging our Digital Public Spaces

With an increasingly connected world, the internet has become the primary place where people go to share their thoughts, ideas, and opinions. But as more voices join the conversation, toxic comments can crowd out healthy dialogue. Hateful speech and online harassment make people less likely to participate online and more likely to leave platforms altogether, or retreat to filter bubbles. Toxicity is a global problem, and has potentially dire consequences for freedom of expression.

While toxicity is pervasive online, it disproportionately impacts the most vulnerable members of society. People of color are more likely to be harassed online, as are women, religious minorities and people who identify as LGBTQ+. By silencing underrepresented voices and discouraging participation from specific targeted groups, toxicity reduces diversity of thought in conversations online.

“While at best, comment sections became places for dynamic conversation and exchange, they could also become irrelevant or loaded with spam and vitriol.”

-The New York Times

Marginalized groups are most affected by toxicity

According to a study conducted in December 2018 by the Anti-Defamation League, marginalized groups experienced the most toxicity online. Identity-based harassment was most common against LGBTQ+ individuals, with 63% of LGBTQ+ respondents experiencing harassment online because of their sexual orientation. Race-based harassment was also found to affect 30% of Hispanics or Latinos, 27% of African-Americans, and 20% of Asian-Americans. Women reported harassment disproportionately, with gender identity-based harassment affecting 24% of female-identified respondents, compared to 15% of male-identified respondents.

According to 2015 data from the United Nations, women are 27 times more likely to experience cyberviolence than men

Source: The New York Times

In 2018, 63% of LGBTQ+ respondents experienced harassment online

Source: Anti-Defamation League

Research on online toxicity and the effects of prolonged exposure to hateful content is a nascent field of study, but research on online discrimination consistently shows that it can lead to negative outcomes. For example, 10% of female journalists have considered leaving the profession out of fear of harassment, while others have avoided certain coverage areas to mitigate that risk. These losses are especially dire given that research from 2020 points to the need to retain diverse journalists in newsrooms.

Toxicity makes it harder for independent journalism

While toxicity affects the internet as a whole, publishers are especially burdened by moderation and toxic content. Independent and local news sites pride themselves on offering spaces for community dialogue, and see healthy conversation as a vital part of civic participation. The New York Times says it aims to “provide a safe platform for diverse communities to have diverse discussions and allow readers’ voices to be an integral part of nearly every piece of reporting.” With a more local view, The SouthEast Missourian says, “We believe a community that is positively and robustly engaged is a better place to live.”

“A publisher's ability to provide a safe space for people to engage with each other online is paramount to our success. Audience development through engagement has shown to provide sustainable results; increasing traffic, loyalty, and subscriptions, all necessary components for a publication to succeed today.”

-Adi Jain, Vice President, Product at Disqus

Many publisher business models rely on reader engagement to drive ad revenue. As toxic conversations pervade their forums, reader engagement falls and that ad revenue diminishes. This can mean that smaller publishers can’t afford the increased costs of providing high-quality moderation on their platforms, putting further strain on local news organizations that are already struggling. The value of local news sources to their communities is hard to overstate, and the loss of these sites of community dialogue has far-reaching consequences. Toxicity directly imperils independent journalism, and makes it difficult for publishers to host healthy dialogue in their comment sections.

“At Coral, we focus on making the essential work of moderators easier and faster. Perspective helps us by identifying comments in several languages that require immediate attention. The Perspective team are terrific collaborators in this mission.”


Fighting a losing battle against your own comments section can be costly

Battling toxicity creates real costs for publishers. To provide high-quality moderation on their forums, some publishers hire additional staff or recruit volunteer moderators to keep their comments clean. Others ask authors to spend hours moderating their own columns, taking valuable time away from people who could be writing instead.

“[Toxicity] had forced us to close comments on stories sooner than we would like simply because we didn’t have the resources to sort through them all.”

-The New York Times

Facing this uphill battle against toxicity, many publishers, including Vice, The Atlantic, and Reuters, decided to shut down their comment sections altogether, at the expense of reader engagement, advertising revenue, and healthy public dialogue. Even The New York Times, prior to partnering with Jigsaw to address the issue, said that the challenge of moderating its forums “has forced us to close comments on stories sooner than we would like simply because we didn’t have the resources to sort through them all. Many of our best stories are never opened for comments at all.”


What would the internet look like if it were less toxic?

Reducing toxicity online could help restore a healthier, more vibrant internet for everyone. If publishers could offer more spaces for their users to share and express ideas, people could engage and connect with each other more freely. Children would be less at risk of exposure to bullying, harassment, or toxic comments at an impressionable age. Platform employees would be freed to work on more productive projects instead of patrolling comments for hate and toxicity.