September 26, 2017
Rachel Librach
Last month, Google removed the Gab app from Google Play because it violated Google’s hate speech policy.
Gab, an alternative social networking service that seeks to promote free speech, filed an antitrust lawsuit against Google on Sept. 18, according to BBC News.
While it is within Google’s rights, as a private company, to be selective about who uploads content to its platform, I can’t help but see a glaringly obvious double standard here.
Google’s hate speech policy states that the company does not support speech that promotes violence based on race, ethnicity, gender, religion and disability, among other classifications.
“This can be a delicate balancing act, but if the primary purpose is to attack a protected group, the content crosses the line,” according to Google’s User Content and Conduct Policy.
I understand that the Gab app’s users, who have allegedly been affiliated with the so-called “alt-right,” were participating in conversations that were in violation of Google’s hate speech policies.
But the app itself was not in violation of these policies; its users were.
If a less influential company were enforcing these policies, I probably wouldn’t care as much.
But this is Google.
Google maintains that it cannot be held responsible for user-generated content across its pages, forums, blogs, YouTube channels and outside feeds. It seems to believe that the creators of that content should be held responsible instead.
According to a CNET article, AdSense, Google’s advertising service, implemented policies that seemed to conflict with Google’s own content guidelines.
One of these policies states that “[the uploader is] responsible for ensuring that all of [their] content, including user-generated content, such as forum posts, blog comments or outside feeds, is in compliance with AdSense policies on any page or site for which [they’ve] enabled AdSense ads.”
In March 2007, Viacom sued Google for copyright infringement over videos that had been uploaded to YouTube.
However, Google argued that the Digital Millennium Copyright Act’s safe harbor provision protected it and other Internet service providers from being held responsible for copyright infringements committed by users, according to CNET.
Google argued then that it should not be held responsible for what outside users post on YouTube, yet it is now holding Gab’s creators responsible for the content their users posted.
Is it right for Google to hold content creators to a higher standard than it holds itself?
Google is a globally used search engine. The average user turns to Google search results for questions and keywords on a wide range of topics.
If a company this big begins censoring content it deems inappropriate, how far can it go before it completely disregards users’ First Amendment rights?
This double standard doesn’t just apply to Google. Social media sites like Twitter and Facebook have taken down content or deleted accounts on the grounds that users violated a hate speech policy.
But it does not take much to see the hypocrisy and the political agenda behind these decisions.
It seems that, for the most part, people who post “hate speech” directed at conservatives or police officers face no real consequence.
For example, Michael Isaacson, a professor at the John Jay College of Criminal Justice and one of the leaders of Antifa, tweeted this on Aug. 23: “Some of y’all might think it sucks being an anti-fascist teacher at John Jay College, but I think it’s a privilege to teach future dead cops.”
How is this not categorized by Twitter as “hate speech”?
This is a clear attack on a specific group of people; it all but threatens that the students sitting in his class will be killed if they become police officers, and he calls witnessing that a “privilege.”
While this professor was placed on administrative leave, his tweet and his Twitter account remain active, with no true consequences.
Don’t get me wrong; I don’t want Google or Twitter to remove Isaacson’s or anyone else’s content. I believe that would be a violation of the First Amendment.
But if companies choose to censor certain content and use “hate speech” as the justification, then they should apply that policy consistently in all areas, not selectively based on a political agenda.