Social media outlets fail in responses to cyberbullying

April 6, 2015

April Wefler

One of my good friends was recently harassed on YouTube.

Her harasser then stalked several channels my friend subscribed to, spamming viewers with comments insulting a fictional character on a popular soap opera. But then it became very real.

My friend was told to kill herself, and one of her friends was subjected to derogatory comments about her race. All of this over differing opinions on a television show.

But what’s even more disheartening is that YouTube didn’t do anything about the situation. We all blocked the user and reported the comments.

Nothing happened.

YouTube’s Safety Center page recommends deleting comments or blocking the user. If that doesn’t work, the next course of action is to turn off comments entirely.

My friend did this. She even unlinked her Google account from YouTube so no one could reply to her comments, but the harassment continued.

Finally, if nothing has worked, YouTube says to report the harassment.

On its Policy Center page, YouTube claims “accounts dedicated to harassing a particular user or the community at large will be terminated.”

This user has harassed not only my friend on numerous occasions but also many other people, all over the same subject.

Her account has not been terminated.

YouTube clearly has issues enforcing its termination policy. Accounts are terminated daily under strict copyright enforcement, but when it comes to harassment, nothing is done.

The user’s harassment went beyond insulting a character on a television show. It escalated to racist remarks and death threats: blatant cyberbullying.

Under “Hate Speech” on its Policy Center page, YouTube states that it is “not okay to post malicious hateful comments about a group of people based solely on their race.”

But apparently it’s fine to post malicious hateful comments about an individual based solely on her race.

Then the user started harassing my friend on Twitter.

On the Online Abuse page of its Support Center, Twitter recommends blocking an abusive user. The site states that “abusive users often lose interest once they realize that you will not respond.”

My friend eventually contacted her local police to ask whom she could talk to about cyberbullying. Twitter recommends this step if the others fail.

While police are well equipped to deal with these situations, the lack of discipline for cyberbullying on both YouTube and Twitter is appalling.

Facebook is a little better when it comes to cyberbullying, and YouTube and Twitter should take notes. Facebook’s Statement of Rights and Responsibilities includes policies against bullying, intimidation and harassment, and hateful or threatening speech.

According to the Facebook Help Center, when a user reports something to Facebook, it is reviewed and removed if it violates the terms.

Many states have laws on cyberstalking, including California, where YouTube and Twitter are headquartered.

Clearly, state laws aren’t enough.

The Internet is so widely used that it is imperative for websites to adopt stricter cyberbullying policies and to take reports of cyberbullying seriously.