YouTube clarifies 'hate speech' rules
YouTube has clarified its rules on hate speech to help video-makers understand which content it considers to be "advertiser-friendly".
The video-sharing website said it would not allow adverts to appear alongside "hateful" or discriminatory content.
YouTube said it was making the changes to "address advertiser concerns around where their advertisements are placed".
But some bloggers say the rules are too strict and will affect their income.
In August, many YouTube stars complained that their videos had been flagged as "not advertiser-friendly" and were no longer earning ad revenue.
The latest announcement clarifies in detail the type of content that will not be able to earn money on the website.
It describes "hateful" content as any video that promotes discrimination or "disparages or humiliates" people on the basis of their race, ethnicity, nationality, religion, disability, age, veteran status, sexual orientation, gender identity, or "other characteristic associated with systematic discrimination".
The website will also refuse to place ads next to videos using "gratuitously disrespectful language that shames or insults an individual or group".
The guidelines also discourage film-makers from making "inappropriate" parody videos using popular family entertainment characters.
Previously, some video-makers had taken children's cartoon characters and created explicit videos with them. While such parody videos may not fall foul of copyright law, they will not be able to earn advertising revenue.
'Double standard'
YouTube added that even videos judged not to be "advertiser-friendly" could remain on the website as long as they did not fall foul of its guidelines.
However, the announcement has met with some criticism.
One user, Captain Sauce, pointed out that the algorithm used to detect whether a video may contain inappropriate content was not perfect.
"Context around many words is incredibly important and needs to be addressed," the user wrote.
Another pointed out that mainstream news networks often post inflammatory studio debates that could be judged "incendiary and demeaning", while music videos often push the boundaries of sexually explicit content, yet both still carry advertisements.
"Why punish the little guy, but not the big networks? This is a double standard," wrote Eugenia Loli.
YouTube said it would continue to "work to improve the ecosystem for creators, advertisers and users".
- Published 24 April 2017