Topline
YouTube on Thursday became the latest tech platform to restrict QAnon content, but the Alphabet unit didn’t outright ban posts about the conspiracy theory as Facebook did last week.
Key Facts
YouTube updated its policy to “prohibit content that targets an individual or group with conspiracy theories that have been used to justify real-world violence.”
Content that “threatens or harasses someone by suggesting they are complicit in one of these harmful conspiracies, such as QAnon or Pizzagate” would be banned, YouTube said.
Previously, YouTube only removed QAnon content when it violated other policies related to hate or harassment and tried to de-amplify “borderline” content that didn’t explicitly break its rules.
Since 2019, YouTube’s efforts to de-amplify QAnon content have resulted in an 80% drop in views of prominent QAnon channels through recommendations, the company said.
A number of prominent QAnon channels have already been removed as a result of the crackdown, though some remain active on Twitter, where they are lamenting the ban.
Key Background
As QAnon has surged in popularity, social media platforms are taking more aggressive action. Facebook banned QAnon earlier this month. Twitter blocked QAnon topics from its Trending section in July. Etsy, Peloton and Pinterest, too, have recently taken steps to restrict the fantastical conspiracy theory, which posits that President Donald Trump is secretly fighting a child sex-trafficking ring run by the Deep State and Satan-worshiping global elites.
What To Watch For
Even those who are banned may still have a presence on YouTube. QAnon followers often co-opt legitimate topics, such as #SaveTheChildren, use dog whistles and disguise their terminology to evade filters. QAnon still has a significant presence on Twitter in particular, despite the company’s policies.