Oakland, California: Following Twitter and Facebook, YouTube says it is taking more steps to limit QAnon and other baseless conspiracy theories that can lead to real-world violence.
The Google-owned video platform said Thursday that it will now prohibit material that targets a person or group with conspiracy theories that have been used to justify violence.
One example would be videos that threaten or harass someone by suggesting they are complicit in a conspiracy such as QAnon, which depicts Donald Trump as a secret warrior against a supposed child-trafficking ring run by celebrities and “deep state” government officials.
Pizzagate, another Internet conspiracy theory and essentially a predecessor of QAnon, falls into the prohibited category. Its promoters claimed that children were being harmed at a pizza restaurant in Washington, D.C. A man who believed in the conspiracy entered the restaurant in December 2016 and fired an assault rifle. He was sentenced to prison in 2017.
YouTube is the third of the major social platforms to announce policies cracking down on QAnon.
Twitter announced its crackdown on QAnon in July, although it stopped short of banning its supporters from the platform outright. It did ban thousands of accounts associated with QAnon content and blocked URLs linked to it from being shared. Twitter also said it would stop highlighting and recommending tweets related to QAnon.
Facebook, meanwhile, announced last week that it will ban groups that openly support QAnon. It said it will remove pages, groups and Instagram accounts representing QAnon, even if they don't promote violence.
The social network said it will consider a variety of factors in deciding whether a group meets its criteria for a ban, including the group's name, its biography or “about” section, and discussions within the page, group or Instagram account.
YouTube said it had already removed tens of thousands of QAnon videos and terminated hundreds of channels under its existing policies, particularly those that explicitly threaten violence or deny the existence of major violent events.
“All of this work has been pivotal in curbing the reach of harmful conspiracies, but there’s even more we can do to address certain conspiracy theories that are used to justify real-world violence, like QAnon,” the company said in a blog post Thursday.
Experts said the move shows that YouTube is taking the threat of violent conspiracy theories seriously and recognizes the importance of limiting their spread. But with QAnon increasingly penetrating mainstream politics and American life, they wondered whether it comes too late.
“While this is an important change, for the last three years YouTube has been a primary site for the spread of QAnon,” said Sophie Bjork-James, an anthropologist at Vanderbilt University. “Without the platform, Q would likely remain an obscure conspiracy. For years YouTube provided this radical movement with an international audience.”