Social Media Purges are Actually a Human Rights Victory

By: Nicole Himel

It was the error message heard around the world. 

The former US President was not able to log in to his favorite methods of mass communication, and the internet was just a little quieter.  Although Facebook and Twitter receive criticism for not upholding a theoretical obligation they may have, as quasi-public utilities, to facilitate and protect free speech, they in fact do not care about the First Amendment.  If anything, they would prefer to keep misinformation and violence on their platforms, because such content drives engagement, and engagement is the cornerstone of their business.

Purging users who misuse the platform is not the moral actualization of a soulless company that has found the light.  Nor is it the end of freedom and the herald of a new age in which we must forever fear and obey our corporate overlords.  It was the result of action by human rights advocates who have been working diligently behind the scenes for years to change the operations of incredibly profitable and powerful companies.

An advantage of the private sector is its ability to innovate; companies can move fast and in any direction. One thing social media companies innovated in the last few years was an algorithm that surfaces the most engaging content and then connects people around it. The consequence is that many people who got connected to certain content also got disconnected from friends, family, and reality.

The Government of Myanmar found the anti-Muslim, ultranationalist rhetoric in the country very engaging, and so it used Facebook to reinforce that rhetoric and restrict independent reporting.  In the United States, leading government officials have amplified engaging content that encourages people to kidnap or murder their political opponents.

So, what do we expect Facebook and Twitter, as private companies, to do about it?  Well, civil society organizations like the Dangerous Speech Project think they should make dangerous speech, which is language meant to persuade a group to dehumanize another and commit violence against them, either less abundant or less convincing. The Change the Terms policies, created by a coalition of human rights and consumer protection organizations, offer corporate policy recommendations that range from adjustments to terms of service, to managing ongoing evaluation even when state actors and bot campaigns abuse these products, and, of course, to when to block a user from the platform.

Human rights can be a tricky field, and when it comes to getting businesses to respect human rights, sometimes you need to think a little creatively.  The important thing to remember is that there are more stakeholders than just the government and the company; everyone who is affected by these decisions should be able to share how they're affected and demand change.

Despite the availability of expert recommendations, what made Facebook finally act was a growing asset in the human rights advocacy toolbox: the investor letter. Citing concerns about content management, investors representing $390 billion in assets under management urged Facebook and Twitter to “reduce the amplification of false and divisive information used to incite violence” and “address its business model and reliance on algorithmic decision making, which has been linked to the spread of hate and disinformation online.”

It can be frustrating that investors have had more success than the government in changing Facebook's promotion of dangerous speech, and it's true that social media companies need appropriate government regulation. But what is remarkable is that, due to the continuous hard work of human rights experts, who actually are concerned with freedom of speech, these recommendations got through the morass of both corporate and government bureaucracy and effected actual changes to a problem that threatens people’s lives. They should be celebrated for that.

Social media companies use their concern about “free speech” as a way to justify their inaction on difficult human rights issues. We should not buy that excuse. We should not think of these social media purges as a threat to our freedom but as the result of smart people working incredibly hard for years to effect change within a complicated system and move the world in a better direction. That's the kind of content we should be sharing.

Nicole Himel is pursuing an MPA at Columbia University and is a graduate consultant on projects concerning the private sector and human rights.
