Suicide Video on YouTube Highlights Moderator Incompetence
Posted by: Timothy Tibbetts on 01/04/2018 06:30 AM
The internet was outraged by news that YouTube star Logan Paul, who has 15 million subscribers and is part of YouTube’s Red subscription service, posted and later deleted a video that included extensive footage of a suicide victim filmed at Japan’s ‘Suicide Forest’.
Paul deleted the video less than 24 hours after posting it in response to the backlash, but not before some six million people had watched it, and after it had been approved by YouTube's moderation team.
Not only did it get past the moderation team, but it was also in the top 10 trending videos on a YouTube channel that has many subscribers under 18.
While nowhere near as serious as this disgusting video, MajorGeeks has had videos removed and been given strikes (three strikes and you're out) for no stated reason. YouTube won't give a reason, nor listen to an explanation. Two videos, in particular, were IObit tutorials made with IObit's permission, not that permission was needed.
We would call allowing this sort of video to be posted a tragedy, especially given that the video was posted on a viral channel that makes YouTube money.
We're not against making money, but it's time Google and YouTube revamped their moderation policies and made them genuinely useful. Popular channels should not get a pass while smaller channels aren't given a chance to grow. That's the kind of common sense Google used to be excellent at.
TechCrunch reports that Google is working on a moderation AI and adding more moderators. Now, that should be hilarious:
YouTube has pledged to increase its investment in artificial intelligence moderation and to grow its army of content checkers and moderators to 10,000 people, but a more thorough revamp of its approach seems to be needed. There's also plenty of well-justified concern that relying on AI won't be enough, as evidenced by Google's failure to respond to questions and examples aired by the Home Affairs Committee in the UK's Parliament weeks ago.
Source: TechCrunch