YouTube said Thursday it was tightening its rules on the spread of conspiracy theories, notably targeting the QAnon movement, which Twitter and Facebook have already moved to limit.

The Google-owned video-sharing service said it was expanding its policies on hate and harassment "to prohibit content that targets an individual or group with conspiracy theories that have been used to justify real-world violence."

This could mean removing videos that threaten or harass people by suggesting they are complicit in a conspiracy such as Pizzagate, the debunked claim that a child sex trafficking ring linked to former Democratic presidential candidate Hillary Clinton operated out of a Washington pizzeria.

QAnon has grown sharply during the pandemic, acting as a binding force that mixes its core anti-Semitic and white supremacist tropes with long-running conspiracy theories about vaccines and 5G mobile technology, as well as far-right and libertarian politics.

YouTube said it had previously removed "tens of thousands of QAnon videos" and terminated some channels used by the movement, notably those that explicitly threaten violence or deny the existence of major violent events.

Earlier this month Facebook banned QAnon-linked accounts on its core social network and on Instagram. Twitter began a crackdown on QAnon earlier this year.

The latest move by YouTube comes amid heightened tensions over misinformation spreading on social media, even as some conservatives accuse the platforms of bias in taking down content.