On February 1 the Guardian ran two pieces on alleged pro-Trump, pro-conspiracy, anti-government bias in YouTube’s “up next” algorithm.
The first – “‘Fiction is outperforming reality’: how YouTube’s algorithm distorts truth” – is a profile of “Guillaume Chaslot”, a “French computer programmer” and “ex-YouTube insider” who allegedly “reveals how its recommendation algorithm promotes divisive clips and conspiracy videos.”
“Did they harm Hillary Clinton’s bid for the presidency?” demands the standfirst, setting the agenda from the off.
To sum up, for those who don’t want to plough through the bloated text: the article splices the author Paul Lewis’s own uninteresting personal experiences of the YouTube algorithm taking him “on a journey of its own volition” to “a video of two boys, aged about five or six, punching and kicking one another,” with broad, entirely unchecked and uncorroborated claims from Chaslot about what he says the algorithm reveals about YouTube bias.
The ‘meat’ of the story, if there is any, is that the public-spirited M. Chaslot was allegedly fired by Google in 2013, and has now – as a maverick outsider – built a program that can monitor the secret algorithms YouTube uses to select recommended content for its viewers.
Using this program Chaslot has allegedly discovered rampant bias in favour of the usual suspects. In fact Guillaume claims his software detected that YouTube’s selection algorithm had an 80% bias in favour of Trump over Clinton during the election.
The second Guardian piece from Feb 1, “How an ex-YouTube insider investigated its secret algorithm” (they are really pushing this), pretty much reiterates these claims from a slightly different perspective. In this one two Guardianistas (Paul again and someone called Erin) checked the list of “8,052” videos Guillaume’s software produced as evidence of bias. The pair seem very excited about their ‘research’, but since they have made no effort to examine the program itself or verify its balance or objectivity, their results are more or less worthless. GIGO always applies. Until we know exactly how Chaslot’s code works, and until it has been verified by independent parties, its conclusions remain moot at best.
The idea that YouTube has a bias in favour of conspiracy theories seems fairly unlikely, and will probably come as a very big surprise to all those whose “conspiracy-theory” accounts have been targeted for demonetization, or suffered “banning… and de-trending… for posting hateful, fake or inappropriate content that challenges or mocks progressive narratives” since the previous clampdown on free thought by Google. In fact, given all these established facts, at this stage some of you inveterate sceptics and “conspiracy-theorists” out there might even be wondering how reliable M. Chaslot’s software actually is.
There’s no real information about this in either of the articles, beyond the unverified claim that the program selects only the top recommendations on each pass. Chaslot’s website doesn’t seem to shed any light either.
The most potentially useful avenue to explore is to simply do your own searches using Chaslot’s professed method. Do you find the same overwhelming preponderance of conspiracy videos and pro-Trump videos as he and the Guardianistas claim? Let us know your results.
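To make the professed method concrete: as described, the program starts from a seed video and repeatedly follows the top “up next” recommendation(s), logging everything it lands on. Here is a minimal sketch of that crawl logic in Python. Note that `fetch_top_recommendations` is a hypothetical stand-in – YouTube publishes no “up next” endpoint, so a real crawler would have to scrape the watch page or approximate it via the Data API; the toy graph below merely stands in for live data.

```python
from collections import Counter

def crawl_recommendations(seed_video, fetch_top_recommendations, depth=5, top_n=1):
    """Follow the top recommendation(s) outward from a seed video.

    `fetch_top_recommendations(video_id)` is a caller-supplied (hypothetical)
    function returning a list of recommended video ids. Returns a Counter of
    how often each video was surfaced during the crawl.
    """
    seen = Counter()
    frontier = [seed_video]
    for _ in range(depth):
        next_frontier = []
        for vid in frontier:
            # Keep only the top-N recommendations on each pass,
            # as the articles claim Chaslot's program does.
            for rec in fetch_top_recommendations(vid)[:top_n]:
                seen[rec] += 1
                next_frontier.append(rec)
        frontier = next_frontier
    return seen

# Toy usage with a canned recommendation graph (stands in for real scraping):
fake_graph = {
    "seed": ["conspiracy_1", "news_1"],
    "conspiracy_1": ["conspiracy_2"],
    "conspiracy_2": ["conspiracy_1"],
}
counts = crawl_recommendations("seed", lambda v: fake_graph.get(v, []), depth=3)
```

The point of the toy graph is to show why the sampling choice matters: with `top_n=1` the crawl never sees `news_1` at all, so any skew in what counts as the “top” recommendation is amplified on every pass. That is exactly the kind of methodological detail one would want verified before trusting the “8,052” videos as a representative sample.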
Another helpful thing is a link the Guardian provides to Chaslot’s code on GitHub.
If any of you out there are coders, why not check it out while it remains available? See if Chaslot’s results can be duplicated. Is his program telling us a surprising truth, or is it flawed and unrepresentative?
This isn’t a trivial question, because today’s Guardian article on the same subject – “Senator warns YouTube algorithm may be open to manipulation by ‘bad actors’” – makes it absolutely clear these claims are going to be used as the basis for fresh, and probably draconian, censorship, which may see the end of any place on YouTube for opinions that even mildly question those sanctified “progressive narratives” of militarism, Russia-hate, endless war and global austerity.