How Facebook should proceed in managing censorship on the site
May 16, 2019
Last week, Facebook banned a number of right-wing and anti-Semitic hate figures from the website. This is an improvement in Facebook’s moderation, but the company should be mindful to respect the Constitution in its censorship. Some might say that it is dangerous for Facebook to censor anything at all, but I would argue that is a slippery slope fallacy, given how unequivocally hateful some of the individuals banned were.
For example, one of the people banned from Facebook was Alex Jones, a right-wing conspiracy theorist who runs InfoWars, a news outlet of dubious integrity. One of the reasons Facebook banned Jones was that he promoted the conspiracy theory that the Sandy Hook shooting was a hoax. In response, InfoWars fans sent hateful mail to family members of the victims for their alleged participation in the “hoax.” That sort of online activity is unacceptable no matter what you think of Alex Jones and InfoWars, and Facebook made the right move in removing him from its platform.
In April, Mark Zuckerberg made a statement indicating a willingness to further improve how censorship on the site is managed. He proposed a private, third-party committee that would be responsible for moderating Facebook’s content and responding to user appeals. He did not elaborate on who that third party would be, how it would operate or how its policies would be crafted in a constitutional manner. In his defense, the ambiguity is understandable: creating a constitutionally acceptable form of censorship in the United States is quite difficult, so it makes sense that he would walk on eggshells when discussing this sort of thing.
The last industry that tried to leave censorship to a third party was the movie industry, and it failed pretty hard. That censorship regime was called the Hays Code, and it is the reason why, in old movies, married couples sleep in separate beds, no one uses even the mildest profanity and you never see so much as an exposed belly button. When film censorship reached the US Supreme Court, movies were rightly declared a protected form of art, and the Code eventually collapsed. Granted, pre-Code Hollywood was a pretty filthy place (not to mince words, it was an era of pornographic theaters and other raunchy material), so maybe an overly restrictive code for Facebook could be beneficial insofar as it would allow Facebook to gauge how far the United States will let it go in censoring content. On the other hand, Facebook is an international forum with users across the planet, fundamentally different from the Hollywood movie industry, so it will not be easy to craft a sensible censorship plan that appeases every government. Policy-wise, this may lead Zuckerberg to stick with European censorship standards, which is probably a safe move, considering that those standards seem to me to be fairly reasonable.
One place where Facebook should really focus on improvement is its staff. As it stands, Facebook’s moderators are collectively expected to review millions of posts per week, many of which are morally reprehensible. Expecting individual moderators to go through so many terrible, potentially scarring photos borders on an unethical business practice in my mind, and considering how wealthy Facebook is, it should hire more moderators to share the burden of censoring content on the website. It would be a safe move that can only help, since a larger moderation staff does not have to mean a change in moderation policy, which is the main concern.
It’s important for Facebook to improve its moderation. It is one of the largest websites in the world, and as an institution, Facebook can offer a lot to the public by helping to disseminate information. In social movements like #NoDAPL, Facebook can be a key factor in circumventing government censorship internationally, allowing progressive activists to organize effectively and push for social change. As such, Facebook owes it to the public to do its best to eliminate misinformation and hateful content from the website while maintaining a respectful relationship with the United States Constitution. With careful deliberation, this is an attainable goal for the website.