It’s the never-ending battle for YouTube.
Every minute, YouTube is bombarded with videos that run counter to its many guidelines, whether pornography, copyrighted material, violent extremism or dangerous misinformation. In recent years, the company has refined its artificially intelligent computer systems to prevent most of these so-called violative videos from ever being uploaded to the site, yet it continues to come under scrutiny for failing to curb the spread of dangerous content.
To demonstrate its effectiveness in finding and removing videos that break its rules, YouTube released a new metric on Tuesday: the Violative View Rate. This is the percentage of total views on YouTube that come from videos that violate its guidelines, counted before those videos are removed.
In a blog post, YouTube said violative videos accounted for 0.16 to 0.18 percent of all views on the platform in the fourth quarter of 2020. In other words, out of every 10,000 views on YouTube, 16 to 18 were of content that violated YouTube's rules and was eventually removed.
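The arithmetic behind those figures is straightforward. A minimal sketch, using the percentages YouTube reported for the fourth quarter of 2020 (the function name and the 10,000-view sample size are illustrative, not YouTube's internal methodology):

```python
# Sketch of the arithmetic behind the Violative View Rate (VVR).
# The 16-18 views per 10,000 come from YouTube's Q4 2020 figures;
# everything else here is purely illustrative.

def violative_view_rate(violative_views: int, total_views: int) -> float:
    """Return views of later-removed violative videos as a percentage of all views."""
    return 100.0 * violative_views / total_views

total = 10_000       # illustrative sample of views
low, high = 16, 18   # violative views per 10,000, per YouTube's report

print(f"{violative_view_rate(low, total):.2f}%")   # prints 0.16%
print(f"{violative_view_rate(high, total):.2f}%")  # prints 0.18%
```

Expressing the figure as a rate rather than a raw count is what lets YouTube compare quarters of very different sizes, a point the company returns to below.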
“We’ve made a lot of progress, and it’s a very, very low number, but of course we want it to be lower,” said Jennifer O’Connor, a director on YouTube’s trust and safety team.
The company said the violative view rate had improved over time, down from 0.63 to 0.72 percent in the fourth quarter of 2017.
YouTube did not disclose the total number of times problematic videos were viewed before they were removed. That reluctance underscores the challenge facing platforms like YouTube and Facebook that rely on user-generated content: even as YouTube makes progress in intercepting banned content – its computers detect 94 percent of problematic videos before they receive a single view – the platform is so large that the absolute number of views of violative videos remains significant.
YouTube chose to report a percentage rather than a total, Ms. O’Connor said, because it helps contextualize how significant the problematic content is relative to the platform as a whole.
YouTube released the metric, which the company has been tracking internally for years and which is expected to fluctuate over time, as part of a quarterly report detailing how it enforces its policies. In the report, YouTube tallied the number of objectionable videos (83 million) and comments (seven billion) it had removed since 2018.
While YouTube cites such reports as a form of accountability, the underlying data rests on YouTube’s own judgments about which videos violate its guidelines. If YouTube deems fewer videos to be violative – and therefore removes fewer of them – the violative view rate could fall. And none of the data is subject to independent scrutiny, although the company has not ruled that out going forward.
“We’re starting out by just making these numbers public and we’re providing a lot of data,” said Ms. O’Connor. “But I wouldn’t take that off the table yet.”
YouTube also said it counts views generously. For example, a view counts even if the user stopped watching before reaching the offensive part of the video, the company said.