In a new report on Monday, YouTube said it took down more than 8 million videos between October and December 2017 for violating its community guidelines. Most of the removed videos were spam or attempts to upload “adult content.”
The information was included in YouTube’s first quarterly report on how it’s enforcing its community guidelines. “This regular update will help show the progress we’re making in removing violative content from our platform,” the video-sharing site said in a blog post.
According to the report, computers detect most of the videos that end up getting taken down. It said 6.7 million videos were first flagged for review by machines, not humans. Of those, 76% were taken down before receiving any views from users.
YouTube has recently faced complaints from critics and advertisers who say the company struggles to tackle offensive content on its site.
YouTube also said it will add more details to the quarterly reports by the end of the year, such as information about comments, the speed of removals and the policy reasons behind them. The company also announced a “Reporting History” dashboard where users can check the status of videos they’ve flagged for review.

Separately on Monday, Google’s parent company, Alphabet, said profits hit $9.4 billion in the first three months of 2018, a big jump from the $5.4 billion it reported a year earlier.