YouTube said on Wednesday it would start removing content that falsely alleges widespread fraud changed the outcome of the U.S. presidential election, a shift from its more hands-off stance on videos making similar claims.
The update, which applies to content uploaded from Wednesday, comes a day after “safe harbor,” a deadline set by U.S. law for states to certify the results of the presidential election.
YouTube said it would start enforcing the policy in line with its approach towards historical U.S. presidential elections.
Online platforms have been under pressure to police misinformation about the election on their sites.
YouTube, owned by Alphabet Inc’s Google, was widely seen as taking a more hands-off approach than Facebook Inc and Twitter Inc, which began labeling content containing election misinformation; YouTube, by contrast, applies labels to all election-related videos.
After the November election, Reuters identified several YouTube channels making money from ads and memberships that were amplifying debunked accusations about voting fraud.
Last month, a group of Democratic senators asked YouTube to commit to removing content containing false or misleading information about the 2020 election outcome and the upcoming Senate run-off elections in Georgia.
Asked how the policy would apply to the Georgia elections, a YouTube spokeswoman said it applied only to the presidential election.
YouTube said in a blog post on Wednesday that since September it had removed over 8,000 channels and thousands of misleading election-related videos for violating its existing policies.
The company said more than 70% of recommendations on election-related topics came from authoritative news sources.
YouTube also said that since Election Day, fact-check information panels had been triggered over 200,000 times on election-related search results.
…