YouTube has officially revealed how it will deal with videos that don’t violate any of its policies but still contain offensive religious or supremacist content: hide them and make sure they can’t make any money.
The news comes as a status report on the promises made by Google general counsel Kent Walker in a June Financial Times op-ed, which announced YouTube was taking several steps to inhibit extremist videos. These steps included investing in machine-learning technology to help identify videos associated with terrorism, increasing the number of “Trusted Flaggers” to identify content that can be used to radicalize terrorists, and redirecting potential extremist recruits to watch counterterrorism videos instead.
In a post today, YouTube provided a better sense of what that means. Now, when YouTube decides that a flagged video doesn’t break policy but still contains “controversial religious or supremacist content,” the video will be put in a “limited state.” Here, the video will exist in a sort of limbo where it won’t be recommended or monetized. It also won’t include suggested videos or allow comments or likes.
This new approach will apply to desktop versions of YouTube within the next few weeks and on mobile soon after that.
Typically, when YouTube removes a video, it can easily be re-uploaded or spread as copies through other channels. More often than not, the takedown draws extra attention to the video and encourages more people to watch and re-upload it, giving it a wider audience. This is YouTube learning from its mistakes: limiting these videos is meant to curb the spread of offensive content without resorting to outright censorship.
On its expanded roster of Trusted Flagger partners, YouTube wrote: “These organizations bring expert knowledge of complex issues like hate speech, radicalization, and terrorism that will help us better identify content that is being used to radicalize and recruit extremists. We will also regularly consult these experts as we update our policies to reflect new trends.”