By: David Herron; Date: March 5, 2018
A cool thing about YouTube is that as you browse the site, it recommends videos based on your viewing history. YouTube's engineers have carefully tuned recommendation algorithms to track what you watch and suggest more of the same. Browse the YouTube home page and this is obvious: the selections match what you've recently watched. YouTube does this to increase viewing time on the site, and success at matching recommendations to a viewer's history directly impacts Google's revenue. But does the algorithm skew its results in order to increase viewing time?
A former YouTube engineer claims the algorithm clearly favors conspiracy videos, because they increase viewing time.
He created a website to showcase that claim: https://algotransparency.org/
Supposedly the website performs YouTube searches as if you were a brand new user with no history on YouTube. Sure enough, the searches he shows tend towards conspiracy videos.
If I open YouTube in an Incognito Window (for instructions, see Bypassing the NY Times paywall, and read NY Times content for free), it should behave the way this guy claims a brand new user's browser would, since the identifying markers (cookies, history, account login) are missing.
In an Incognito window I typed in search phrases he showcases on the AlgoTransparency website. The results I get are different from those shown on that site, and are not so heavily skewed towards conspiracy videos.
In other words, using an Incognito Window I cannot replicate what's shown on AlgoTransparency. Maybe this means AlgoTransparency is making a bogus claim. But I'm not convinced of that, since there are many more variables at play, and maybe an Incognito Window isn't sufficient to simulate a brand new user.
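The "brand new user" assumption can be stated concretely: an Incognito window sends a search request with no cookies and no session state. Here's a minimal Python sketch (using only the standard library; the function name is my own) of what such a cookie-less YouTube search request looks like at the HTTP level:

```python
import urllib.parse
import urllib.request

def prepare_anonymous_search(query):
    """Build a YouTube search request with no cookies or session state,
    approximating what an Incognito window sends."""
    url = "https://www.youtube.com/results?" + urllib.parse.urlencode(
        {"search_query": query}
    )
    # A bare Request object carries no Cookie header and no history.
    return urllib.request.Request(url)

req = prepare_anonymous_search("is the earth flat")
print(req.full_url)
# To actually fetch it: urllib.request.urlopen(req). Note that YouTube
# renders results client-side, so parsing the response takes more work.
```

Of course, a real browser also sends a User-Agent, screen size, and other fingerprintable details, which is exactly why "Incognito" may not be a perfect stand-in for a brand new user.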
For the next stage of testing, I started Firefox. I normally use Chrome and rarely use Firefox, so my Firefox instance has probably never visited YouTube. I then typed in a search phrase shown on AlgoTransparency, and the results matched what I'd gotten in Chrome's Incognito Mode rather than the results on AlgoTransparency.
In other words, the evidence is building that something isn't quite right with AlgoTransparency's claim. That doesn't make it 100% clear their claim is bogus, just that something smells a little fishy.
AlgoTransparency has a description of the site: https://algotransparency.org/en/demarche.html?candidat=Francois%20Fillon&file=ytrecos-presidentielle-2017-06-10
More importantly the source code is here: https://github.com/pnbt/youtube-explore
It's a command-line Python tool that performs the searches shown on the AlgoTransparency website. Someone with more interest than I have should examine the code to see what's going on.
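I haven't dug into that code, but the general shape of such an experiment is easy to sketch. Here's a hypothetical Python outline, using the official YouTube Data API v3 search endpoint rather than whatever scraping youtube-explore actually does; the function names, keyword list, and scoring are placeholders of my own:

```python
import urllib.parse

API_URL = "https://www.googleapis.com/youtube/v3/search"

def build_search_url(query, api_key, max_results=25):
    """Compose a YouTube Data API v3 search URL for a logged-out query."""
    params = {
        "part": "snippet",
        "q": query,
        "type": "video",
        "maxResults": max_results,
        "key": api_key,
    }
    return API_URL + "?" + urllib.parse.urlencode(params)

def count_keyword_titles(titles, keywords):
    """Count titles containing any of the given keywords; a crude
    stand-in for 'looks like a conspiracy video'."""
    keywords = [k.lower() for k in keywords]
    return sum(1 for t in titles if any(k in t.lower() for k in keywords))

# Illustrative scoring on made-up titles (not real search results):
titles = [
    "NASA admits the truth they hid",
    "How orbital mechanics works",
    "FLAT EARTH proof they don't want you to see",
]
print(count_keyword_titles(titles, ["flat earth", "they hid", "hoax"]))  # 2
```

An important caveat: the Data API returns its own ranking, which may differ from what the youtube.com search page or the sidebar recommendations show a browsing user, so even this wouldn't settle whether AlgoTransparency's numbers are right.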