YouTubers attract enormous numbers of viewers with their content and generate millions of dollars in ad revenue on a platform that claims not to tolerate harmful or hateful videos. YouTube, the world’s largest video platform, in turn earns billions of dollars from this ad revenue.
However, some of the moderators who review problematic content on the website say that YouTube lets top creators get away with such content more easily than smaller channels.
Eleven current and former moderators who have worked on content decisions at YouTube say the platform doles out more lenient punishments to the top creators who bring in the most money for the company.
According to these moderators, popular accounts “often get special treatment in the form of looser interpretations of YouTube’s guidelines prohibiting demeaning speech, bullying and other forms of graphic content.”
The moderators said that the platform made exceptions for big YouTubers such as Logan Paul, Steven Crowder, and PewDiePie. YouTube, in response, has denied these claims and says that it enforces its rules equally upon all of its content creators.
YouTube has repeatedly come under fire in the past for letting harmful content linger on its website. The company has tried to address the issue several times, with efforts such as manually reviewing millions of videos and removing violations outright. It has also repeatedly updated its guidelines and tweaked its recommendation system to stop suggesting harmful content to users.
In just the third quarter of 2018, YouTube removed over 58 million videos that were in violation of its guidelines.
Whether or not these accusations are true, they put a big question mark over YouTube’s internal practices. It is clearly time for the platform to take responsibility and be fully transparent about what goes on behind the scenes.