Does YouTube need to tighten up its regulations?

Nikhil Bhave
Sunday, 7 January 2018

A ‘snuff video’, in layman’s terms, is a film in which a person is shown actually being murdered or killing themselves. Of course, the likes of ISIS have now shifted the genre from the realm of the hypothetical to scary reality. ‘Fighters’ from conflict zones continue to upload such content, and these videos generally make their way to sites like YouTube despite its best efforts. But what about YouTube’s own contributors, the so-called YouTubers?

Last week saw social media up in arms as a famous YouTuber called Logan Paul ventured into Japan’s infamous ‘suicide forest’, where he came across a hanging corpse.

Then, instead of turning the camera off, he continued filming and shot several close-ups of the dead man’s face (albeit blurred). It wasn’t a snuff video per se, but it came pretty close to being one. This set off a firestorm on social media, which ultimately resulted in Paul taking down the video and apologising. But what about YouTube itself?

Generally speaking, YouTube’s policy guidelines are not clear when it comes to videos depicting death. The video did not contain any violence, but it certainly contained sensitive material. It somehow escaped YouTube’s attention until Paul pulled it down. Or did it?

According to Wired, YouTube takes about 45 per cent of the earnings from a video. And as stated above, its policy norms are not very transparent. In such cases, aspersions are bound to be cast at the content provider. When a user flags content as inappropriate, nobody except Google knows what happens next, or whether a human or an AI reviews the video to decide if it should be taken down. In a step towards more transparency, YouTube could start by publishing a log of the process or by making the guidelines clearer.

Of course, in today’s era of ‘get rich or die trying’, such content is never going to disappear completely. Like cringe pop, it brings in more eyeballs and more profit for both YouTube and the uploader. YouTube seriously needs to up its game if it doesn’t want unwanted publicity like this again.
