Twitch, the livestreaming site beloved by video gamers that has become a key internet communications service, said on Thursday that it had created new rules aimed at clamping down on egregious purveyors of misinformation.
The company, acknowledging the real-world harms that could come with its rapidly expanding influence, said it would prohibit “harmful misinformation superspreaders who persistently share misinformation on or off of Twitch.”
Twitch will take down only channels that meet a handful of criteria. To run afoul of the new policy, users must persistently share harmful misinformation that has been widely debunked, the company said, adding that it had “selected these criteria because taken together they create the highest risk of harm, including inciting real-world harm.”
The site is “taking this precautionary step and updating our policies to ensure that these misinformation superspreaders won’t find a home on our service,” Angela Hession, Twitch’s vice president of trust and safety, said in a statement.
Twitch started 15 years ago as a tiny start-up called Justin.tv. It was acquired by Amazon in 2014 and has grown from a hub for video game players into a major internet platform where celebrity content creators and ordinary people broadcast every moment of their daily lives.
About 31 million people visit Twitch each day, according to company data, and more than eight million broadcast each month. Most of the content is associated with video games, with streams about popular titles like Call of Duty and Fortnite.
The company framed the new policy as a move to get ahead of surges of misinformation that could afflict the platform, rather than as a response to current issues.
Twitch has argued that misinformation is less of a problem on its service than on other social platforms, in part because the lengthy but fleeting nature of livestreams makes it harder for falsehoods to be taken out of context and go viral. The new policy will initially affect fewer than 100 channels, the company said.
But misinformation experts it consulted, like the Global Disinformation Index, a nonpartisan team of researchers, told Twitch that a handful of users could account for a preponderance of online lies.
Misinformation covered by the new policy includes content related to dangerous medical treatments, lies about Covid-19 vaccines, falsehoods “promoted by conspiracy networks tied to violence,” content that “undermines the integrity of a civic or political process” — including lies about election fraud — and content that could harm people during emergencies like wildfires and shootings.
The policy will also apply to Russian state-controlled media channels that spread misinformation, Twitch said, adding that it had found only one such channel so far, with very little activity.
Twitch generally has had stricter rules than other social media platforms about what views its users can express. But in 2020, after platforms like YouTube and Twitter clamped down on far-right conspiracy theorists promoting false theories about the presidential election, Twitch saw an uptick in such streamers, who used it as a new place to earn money and spread lies.
Followers of the baseless QAnon conspiracy theory — which posits that former President Donald J. Trump is fighting a cabal of Democratic pedophiles — were particularly well represented among this group of several dozen Twitch users.
Last April, Twitch told The New York Times that it was developing a misinformation policy. It said it would “take action against users that violate our community policies against harmful content that encourages or incites self-destructive behavior, harassment, or attempts or threatens to physically harm others, including through misinformation.”
Also last year, the company announced a policy that would allow it to suspend the accounts of people who have committed crimes or severe offenses in real life or on other online platforms, including those who engaged in violent extremism or were members of a hate group.
QAnon content, though, was still allowed, because Twitch said at the time that it did not consider QAnon to be a hate group. A Twitch spokeswoman said the new policy included QAnon as a conspiracy theory that promoted violence.
Some channels have also spread health-related conspiracy theories: One belonging to a man who goes by Zak Paine, with more than 17,000 followers, pushes debunked theories about vaccines and cancer while also promoting the QAnon conspiracy theory.
In one stream, he and a guest encouraged his viewers to drink a bleach solution that is promoted as a cure for cancer but that the Food and Drug Administration has said is dangerous. Other streamers, like Terpsichore Maras-Lindeman, have fought to overturn the result of the 2020 presidential election.
Minutes after the new policy took effect, the channels belonging to Ms. Maras-Lindeman and Mr. Paine disappeared. Mr. Paine, who was livestreaming at the time, was cut off in the middle of a recorded advertisement. Both channels were replaced with a message: “This channel is currently unavailable due to a violation of Twitch’s community guidelines or terms of service.”
Neither could immediately be reached for comment.