Things You Should Know About Hate Speech in YouTube Videos

By Thomas Glare on October 15, 2019

After several changes to the world’s most popular video streaming platform, it surprises no one that YouTube became more restrictive in 2019. However, unlike most recent changes, which focused on ads, YouTube is now taking steps toward eliminating hate speech and other forms of harassment. What does that mean for the average YouTube user? Find out in the article below.

What Defines Hate Speech in a YouTube Video?

We all know how hard it is to police the Internet, especially when millions of users post original content every day. However, even the most viewed YouTube video could be targeted by the upcoming changes. YouTube is constantly trying to control the type of content on its platform and, since June, has had hate speech in its cross-hairs. That means any video featuring supremacist views or racially motivated attacks, and even YouTube videos denying the existence of well-documented violent events. For example, a YouTube video that questions the Holocaust can now easily be flagged as violating YouTube’s policy.

Constant Changes in Policy

This new take on hate speech comes after a year filled with policy updates from YouTube. In April, they updated the harassment policy because of the constant harassment going on between various content creators on the platform. Such a huge platform is obliged to stay up to date and adapt to various issues that may appear as people share their work, ideas, and passions in a free environment.

Despite all the constant changes, regular YouTube users and content creators should have nothing to fear, as long as what they create does not touch sensitive subjects such as harassment or hate speech.

What’s the Context for all the Changes?

Looking strictly at YouTube, the platform has had all kinds of slip-ups in recent times, which brought a lot of pressure to enforce and reform its policies, including the ones mentioned above. One incident in particular stirred controversy about YouTube’s ability to control the content posted on it. Back in March, the Alphabet-owned company struggled to keep copies of videos depicting the mass shootings at mosques in New Zealand off its platform. Even though it became one of the most disliked videos on YouTube, it stayed online for plenty of time and was even shared on other channels.

Everyone agrees that stricter control over the content posted on YouTube and other social media platforms is needed. With that in mind, YouTube also announced it could make big changes to content addressed to children after a federal investigation. That could very well mean we will have an easier time when we want to remove videos from YouTube. Facebook, too, announced it would remove all self-harm images on its platform to limit exposure to sensitive subjects like suicide and abuse of any kind.

What do you think? Should YouTube continue to improve its policies regarding sensitive subjects like harassment, hate speech, or self-harm? Or should the Internet be left without censorship, since it was designed as a free environment after all? We will be glad to hear your opinion on this very controversial issue!

About Thomas Glare

Thomas Glare is passionate about all things tech and Internet. He’s been a freelance writer for several years now and loves to tackle subjects on social media and the way it can shape the future.
