Monday, December 23

YouTube Took Down 58 Million Videos that Violated its Policies

YouTube, the world's biggest video streaming platform, has reportedly taken down more than 58 million videos and 224 million comments during the third quarter for violations of its policies.

Government officials and interest groups in the United States, Europe and Asia have been pressuring YouTube, Facebook and other social media services to quickly identify and remove extremist and hateful content that critics say incites violence.

The European Union has proposed that online services face steep fines unless they remove extremist material within one hour of a government order to do so.

An official at India’s Ministry of Home Affairs, speaking on condition of anonymity, said that social media firms had agreed to handle authorities’ requests to remove objectionable content within 36 hours.

This year, YouTube began issuing quarterly reports about its enforcement efforts. As with past quarters, most of the removed content was spam, YouTube said.

Automated detection tools help YouTube quickly identify spam, extremist content and nudity. During September, 90 percent of the nearly 10,400 videos removed for violent extremism, and of the 279,600 videos removed for child safety issues, received fewer than 10 views, according to YouTube.

But YouTube faces a bigger challenge with material promoting hateful rhetoric and dangerous behavior.

Automated detection technologies for those policies are relatively new and less effective, so YouTube relies on users to report potentially problematic videos or comments. This means such content may be viewed widely before it is removed.

Google added thousands of moderators this year, expanding to more than 10,000, in hopes of reviewing user reports faster. YouTube declined to comment on growth plans for 2019.