YouTube on Thursday became the latest social media giant to take steps to stop QAnon, the sprawling pro-Trump conspiracy theory community whose online fantasies about a cabal of satanic pedophiles running the world have spilled over into offline violence.

The company announced in a blog post that it was updating its hate-speech and harassment policies to prohibit “content that targets an individual or group with conspiracy theories that have been used to justify real-world violence.”

The new policy will prohibit content promoting QAnon, as well as related conspiracy theories such as Pizzagate, which falsely claims that top Democrats and Hollywood elites are running an underground sex-trafficking ring from the basement of a Washington pizza restaurant.

Other social networks have also taken steps to curb the spread of QAnon, which has been linked to violence and vandalism.

Last week, Facebook hardened its rules on QAnon content, comparing the movement to a “militarized social movement” that was becoming increasingly violent. This week, several smaller platforms, including Pinterest, Etsy and Triller, also announced new restrictions on QAnon content.

Under YouTube’s new policy, which went into effect Thursday, “content that threatens or harasses someone by suggesting they are complicit” in a harmful theory like QAnon or Pizzagate will be banned.

News coverage of these theories, as well as videos that discuss them without targeting individuals or groups, may still be allowed.

The QAnon movement began in 2017 when an anonymous poster under the handle “Q Clearance Patriot,” or “Q,” began posting cryptic messages on 4chan, the notoriously toxic message board, claiming to possess classified information about a secret battle between President Trump and a global cabal of pedophiles.

QAnon believers, known as “bakers,” began discussing and decoding these messages in real time on platforms including Reddit and Twitter, weaving them into a modern rebranding of centuries-old anti-Semitic tropes that falsely accused prominent Democrats, including Hillary Clinton and the liberal financier George Soros, of pulling the strings of a global sex-trafficking conspiracy.

Few platforms played a bigger role in moving QAnon from the fringes to the mainstream than YouTube. In the movement’s early days, QAnon followers produced YouTube documentaries that offered a crash course in its core beliefs. The videos were posted on Facebook and other platforms, where they were often used to draw in recruits. Some were viewed millions of times.

QAnon followers also started YouTube talk shows to discuss new developments related to the theory. Some of these channels amassed large audiences and made their owners prominent voices within the movement.

“YouTube has a huge role in the Q mythology,” said Mike Rothschild, a conspiracy theory debunker who is writing a book about QAnon. “There are major figures in the Q world who make videos on a daily basis, getting hundreds of thousands of views and packaging their theories in slick clips that are a world away from the straight-to-camera rambles so prominent in conspiracy theory video making.”

YouTube has tried for years to curb the spread of misinformation and conspiracy theories on its platform and to tweak the recommendation algorithm that was sending millions of viewers to what it considered low-quality content.

In 2019, the company began demoting what it called “borderline content,” videos that tested its rules without quite breaking them outright, and reducing the visibility of those videos in search results and recommendations.

The company says these changes have reduced the number of views that borderline content receives from recommendations by more than 70 percent, though that figure cannot be independently verified.

YouTube also says that among a set of pro-QAnon channels, the number of views coming from recommendations dropped more than 80 percent after the 2019 policy change.

Social media platforms have been under scrutiny for their policy decisions in recent weeks, as Democrats accuse them of doing too little to stop the spread of right-wing misinformation, and Republicans, including Mr. Trump, paint them as censorious menaces to free speech.

YouTube, which is owned by Google, has thus far stayed mostly out of the political fray despite the platform’s enormous popularity — users watch more than a billion hours of YouTube videos every day — and the surfeit of misinformation and conspiracy theories on the service.

Its chief executive, Susan Wojcicki, has not been personally attacked by Mr. Trump or had to testify before Congress, unlike Jack Dorsey of Twitter and Mark Zuckerberg of Facebook.

Vanita Gupta, the chief executive of the Leadership Conference on Civil and Human Rights, a coalition of civil rights groups, praised YouTube’s move to crack down on QAnon content.

“We commend YouTube for banning this harmful and hateful content that targets people with conspiracy theories used to justify violence offline, particularly through efforts like QAnon,” Ms. Gupta said. “This online content can result in real-world violence, and fosters hate that harms entire communities.”

Mr. Rothschild, the QAnon researcher, predicted that QAnon believers who were kicked off YouTube would find ways to distribute their videos through smaller platforms.

He also cautioned that the movement’s followers were known for trying to evade platform bans, and that YouTube would have to remain vigilant to keep them from restarting their channels and trying again.

“YouTube banning Q videos and suspending Q promoters is a good step,” he said, “but it won’t be the end of Q. Nothing has been so far.”