YouTube Starts Crackdown Against Controversial Videos With New Moderation Policy

2018 could hardly have started on a worse note for YouTube, which saw one of its biggest stars embroiled in a major controversy. Now, the platform is moving to ensure its contributors act more responsibly and that the quality of their content meets its standards.

Preferred Creators Program

Having previously relied almost exclusively on AI to moderate content, YouTube is hiring a team of 10,000 workers to manually review and approve each video in its premium Preferred Creator program. Hopefully, the new protocol will prove more robust, especially since the Logan Paul video itself was alleged to have passed manual moderation.

Such a step is necessary, as the Preferred Creator program sells itself on safe, engaging content that appeals to teenagers and young adults. Since ads sold through this program command a premium over those on regular content, ensuring the credibility of that content is essential.

Dark Mode for Mobile

YouTube will also improve its UI in the near future, adding a dark mode that covers the entire mobile app rather than just the player, as well as a new gesture for conveniently dismissing video ads.

In the aftermath of the Logan Paul controversy, in which the star, backed by 15 million subscribers, was filmed mocking a corpse in Japan's so-called suicide forest, YouTube removed Paul from its preferred ad program and cancelled his YouTube Red projects, though not before he published a hollow apology that hardly addressed the core issue.

That such a video was published and monetized in the first place does much to jeopardize the platform's future. Hopefully, similar incidents will be handled better going forward, raising the benchmark for what to expect from YouTube's prime creators.