YouTube on Thursday said it’s overhauling its system for verifying users on its platform, as the Google-owned video site faces intense controversy over the content it pushes to users.
The company said the policy changes, which will go into effect next month, will move away from using subscriber counts to determine verification. Instead, the company will prioritize verifying "prominent channels that have a clear need for proof of authenticity." (Currently, any channel with 100,000 subscribers or more is eligible for verification.)
YouTube also said it's changing the way its verification badges look. The site will now display a gray background behind a creator's name, instead of a checkmark or music note.
“Through our research, we found that viewers often associated the checkmark with an endorsement of content, not identity,” Jonathan McPhie, a YouTube product manager, said in a blog post. He said the company was making the change to “reduce confusion about what being verified means.”
The change, announced Thursday, drew outrage from some of YouTube's millions of creators, who said their verified statuses were being revoked because of the new requirements. "No one lost a verification badge today," YouTube tweeted in response to the complaints. "If you received an email that your channel will no longer be verified, this was just an advanced notice & you can appeal."
The new policy comes as YouTube faces an onslaught of scandals, including blowback for recommending content related to extremism and child exploitation. The video site isn’t the only big tech platform rethinking its verification policies. Twitter CEO Jack Dorsey pledged last year to improve the site’s famous blue checkmark system.