THUMBS DOWN ON UP NEXT

YouTube’s algorithm apparently helped a Chinese propaganda video on Hong Kong go viral

A watchdog site says YouTube’s algorithm heavily promoted this video.
Image: CGTN/YouTube

A Chinese-made propaganda video about the Hong Kong protests went viral, apparently thanks to YouTube’s algorithm. The video—called “Who’s behind Hong Kong protests?”—argues that US agents are stirring up the protests in Hong Kong, and has more than half a million views on YouTube. It was created by China’s state broadcaster, China Global Television Network (CGTN).

On August 24, YouTube recommended it six times more often than the average video when users searched for “Hong Kong protestors,” according to AlgoTransparency, making it one of the most-recommended videos on the subject. The most-viewed videos on the Hong Kong protests from the Wall Street Journal, the BBC, and the New York Times each had fewer views as of October 31, 2019.

The Chinese government has repeatedly claimed that the US is behind the protests, though it has failed to present any evidence. Attempts to cast the Hong Kong protests as an American-instigated uprising often involve outright falsehoods, such as Chinese media reports claiming that a toy weapon was a US army grenade launcher.

AlgoTransparency, created by former YouTube engineer* Guillaume Chaslot, analyzes the videos recommended across thousands of channels daily. Its data is based on blank profiles; because YouTube tailors recommendations to each user’s watch history, different individuals may see different recommendations. Overall, YouTube’s algorithm was built to optimize for watch time, which, Chaslot has shown, often leads YouTube to recommend ever more extreme videos in a bid to capture attention. “You can go from more radical video to more radical video,” says Chaslot. “There’s a rabbit-hole effect.”
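
To make a figure like “six times more than the average video” concrete: a crawler of this kind tallies how often each video appears in the recommendation panels shown to a blank profile, then divides each video’s tally by the average tally across all videos seen. The Python sketch below is a hypothetical illustration only; AlgoTransparency has not published its pipeline in this story, and the function and video names here are invented.

    from collections import Counter

    def recommendation_ratios(crawled_recommendations):
        """For each video, return its recommendation count divided by the
        average count across all videos observed in the crawl."""
        counts = Counter(v for recs in crawled_recommendations for v in recs)
        average = sum(counts.values()) / len(counts)
        return {video: n / average for video, n in counts.items()}

    # Hypothetical crawl: each inner list is one "Up next" panel seen
    # while browsing from a logged-out (blank) profile.
    crawl = [
        ["cgtn_whos_behind", "bbc_explainer", "wsj_report"],
        ["cgtn_whos_behind", "nyt_video"],
        ["cgtn_whos_behind", "bbc_explainer"],
    ]
    print(recommendation_ratios(crawl)["cgtn_whos_behind"])  # ~1.71x the average video

A real crawler would also have to discover watch pages and repeat the crawl daily, but divide-by-average is the kind of metric the reported ratio describes.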

YouTube said it disagreed with AlgoTransparency’s methodology, data, and conclusions, and was unable to reproduce its results. “The AlgoTransparency tool was created outside of YouTube and does not accurately reflect how YouTube’s recommendations work, or how users watch and interact with YouTube,” said YouTube spokesman Farshad Shadloo. “We’ve designed our systems to help ensure that content from more authoritative sources is surfaced prominently in search results and watch-next recommendations in certain contexts, including when a viewer is watching news-related content on YouTube.”

After publication of this story, Google posted a six-tweet thread rebutting it.

In an email to Quartz, Shadloo cited two Vox videos and an Economist video with more views than the Chinese video. He also referenced a New York Times video which, while it had more views than the Times video cited above, still had fewer views than the Chinese video.

The Chinese network’s video also now includes a warning that it “may be inappropriate for some users”:

Screenshot: “Who’s behind Hong Kong protests?” now carries the notice “This video may be inappropriate for some users.”
Image: YouTube

Google challenged AlgoTransparency’s methods in much the same way when Chaslot exposed comparable behavior earlier: in April, he showed that YouTube had recommended a Russia Today video on the Mueller report more than 400,000 times. The algorithm’s goal of maximizing watch time makes it vulnerable to election interference, says Chaslot: “We have no guarantee that it cannot be gamed.”

Chaslot said that when he worked at Google, prior to 2013, his YouTube colleagues did not prioritize making the recommendation algorithm difficult to abuse. More recently, Chaslot has criticized his former employer for failing to release data on recommendations; he discussed some of these concerns this month at MozFest, hosted by Mozilla. The algorithm has a massive impact: 70% of videos watched on YouTube are recommended by AI. Though YouTube said earlier this year that it would stop recommending conspiracy theories, Chaslot says the problem of the algorithm recommending extremist content remains. “There was no radical change,” he adds.

This story was updated at 10:45 am on Nov. 1, 2019 to include YouTube’s tweeted rebuttal, information from Shadloo’s email to Quartz, and the appearance of the warning on the Chinese video.

* UPDATE Nov. 6, 2019: YouTube acknowledges Chaslot was an engineer for its parent company, Google, and YouTube spokespeople Farshad Shadloo and Ivy Choi declined repeated opportunities to dispute Chaslot’s assertion to Quartz that he worked on a YouTube project. Shadloo and Choi took issue with calling Chaslot a former “YouTube engineer,” saying it was incorrect because he was not employed by the Google subsidiary.

Let us know if you know more.

This story is from Quartz’s investigations team. Here’s how you can reach us with feedback or tips:

Email (insecure): investigations@qz.com
Signal (secure): +1 929 202 9229
SecureDrop (secure & anonymous): qz.com/tips