Online Harassment: Testing the Limits of Free Speech

July 26, 2016

Few places on the Internet can claim to be free from harassment. Newspaper websites, social media platforms, online sports forums, gaming sites, and television fan comment boards deal with this problem. All corners of the web are impacted in some way by the scourge of vile and degrading speech aimed at public and private people alike. Eradicating this speech and creating an Internet that can be confidently used by everyone is a challenge that must be confronted by all sectors of society.

Importantly, this is not about stifling rigorous debate, nor is it about the right to not be offended. The content that we are concerned with is abusive, violent, sexist, racist, and dangerous: words that carry with them the real threat of offline violence, psychological harm, and, more generally, the chilling of online speech.

Swatting, doxing, stalking, and serious reputational harm are all real-world consequences of what often begins as online harassment. Swatting (causing a police SWAT team to be deployed to a victim’s house under false pretenses) and doxing (tracing someone’s real-world identity from information gathered online and then using it to abuse them further) are two practices that have grown in recent years alongside the increase in abusive online behavior.

None of this is to say that good, supportive, and beneficial communities have not been established online. Parenting forums, puzzle groups, local neighborhood sites, and self-help message boards all provide advice, encouragement, friendships, and respectful debate to regular users. The challenge is to discover how these supportive communities are able to develop, and to foster civility across the Internet using similar approaches.

Online abuse is far from a problem that impacts only one demographic or part of society. Kids on gaming sites can become victims of cyberbullying, adults on parenting message boards can be threatened with physical violence, and journalists can be verbally assaulted for doing their job. Criticism of elected officials can go far beyond their politics and performance, with comments quickly descending into misogyny, racism, and, frequently, threats of violence.

The causes of online anti-social behavior are debated at length by all those who are trying to encourage positive and productive online discussions. Questions about anonymity, community standards and terms of use, moderation and digital citizenship are frequently raised by those trying to address harassment on the Internet.

Anonymity has long been considered a cornerstone of a free Internet. With online anonymity, citizens living under oppressive regimes are able to criticize their government and discuss issues that they could not if their names were associated with their commentary. Teenagers are able to explore their sexuality and identity without fear of repercussions. And those suffering from mental health issues can find support online, confident that their employer or others will not find out.

However, anonymity brings its own challenges. The absence of the accountability that comes with being personally and openly associated with what you say or write can result in more offensive and abusive language, tone, and positions. Sites with a real-name policy have tried to solve that issue, but enforcing a true real-name policy can require verifying identity, including through government identity documents, and negates the advantages of an anonymous web. Some services have explored requiring users to create a permanent profile to which their comments are then tied, which creates more accountability without removing anonymity.

Community standards and terms of use are very important. They set the expectations for the conversation and allow for the regulation of posts or comments. The terms should be easy to understand, and companies should make them prominent on the site. Reporting functions for breaches of the standards are vital, as is moderation where it is needed. Users should be able to report abuse easily and be given information about what reporting means and how the site may handle their reports.

For certain forums and message boards, human moderators are necessary. Moderation will be needed for topics of particular sensitivity, for example where young children are regular users or where news reports are particularly emotive. Moderators should be well trained and supported, and their decisions should be subject to an appeals process. This helps ensure that those who are regulating conversation do so in an appropriate way, and that those who have their comments removed are able to understand why. In turn, this creates a better, more transparent online environment.

Digital citizenship education is vital for children and needs to be an ongoing effort into adulthood. The individual responsibility that comes with posting and participating in online discussions should be reinforced by parents, teachers, website operators, and the wider online community. Conversations about civility should start when a child first gets online and continue through their exploration of social networks as teenagers, their reading of newspapers as young adults, and their use of parenting sites as adults.

Responding to the issue of online harassment, and its potential offline implications, has resulted in numerous initiatives, laws, and policies. A fascinating exploration of the issue as it relates to online newspapers was carried out by The Guardian through its “The Web We Want” series in 2016. The reporting looked into the history of comments on the site and examined how to end online abuse and have better online conversations. The Coral Project, a collaboration between the Mozilla Foundation, The New York Times, and The Washington Post, funded by a grant from the John S. and James L. Knight Foundation, seeks to build better communities around journalism through open-source software. Industry is also working to take a stand against harassment through the Hack Harassment initiative.

Online harassment is a particular challenge for news sites, and at times it has become a workplace safety concern for journalists. Constructive debate should be encouraged and facilitated by these websites, but the abuse and hate that is often spewed prevents discussion and puts people at risk. Those who are subjected to daily abuse have the right to expect support and protection from their employers. In the most extreme circumstances, the services and assistance given to those who respond to reports of child sexual abuse material may serve as a model for journalists and forum moderators.

In the United Kingdom, a group of MPs have launched a campaign entitled “Reclaim the Internet.” It is designed to challenge abuse online and brings together a broad spectrum of people and organizations to crowd-source solutions. They are looking at the role of companies, individuals, and police in creating a better Internet.

The serious offline implications of online harassment have also resulted in legislative proposals. For example, in the United States, Congresswoman Katherine Clark introduced a bill to specifically criminalize swatting. Other online and offline crimes associated with harassment are already covered by existing legislation.

A real multi-stakeholder approach is needed, and it should extend far beyond the technology sphere. Addressing online harassment requires both technological tools and behavioral change. Lessons need to be learned from the ways in which offline racism and sexism have been countered, and those techniques need to be applied to Internet abuse. At the same time, the online world needs to advance beyond the limitations of the offline world, where hatred and misunderstanding remain. The Internet community needs to try to find responses and solutions to all forms of harassment and abuse.

The right to freedom of expression is owed to all Internet users, not just those who feel that their ability to express divisive comments is being infringed upon by moderators, terms of service, or restrictions on postings. It also extends to those who are too afraid to participate in online activities and debates for fear of harassment. Their right to express themselves must be protected too.

Written by

Emma Morris

As Global Policy Director, Emma brings a global perspective and expertise to the broad spectrum of Internet privacy and safety issues. With a particular focus on the United States, Europe, Oceania, and parts of the Middle East, she is able to interpret domestic actions and place them in an international context.