How YouTube's algorithm radicalizes folks into Alt-Right views

; Date: Wed Jul 10 2019

Tags: YouTube

YouTube's algorithm for selecting content you're likely to watch is amazing, and very useful. It's easy to just head to the YouTube home page, and voila, there's a bunch of recommended content, some of which is surely of interest. Google is investing a lot of money in AI algorithms to drive content recommendation on YouTube, and YouTube's popularity is largely driven by those recommendations. But it's possible to rabbit-hole into specific content areas, and then suddenly the entire body of information you see is within that topic bubble, and it tends to affect one's thinking.

The attached video is an interview with a young man who describes his radicalization into the Alt-Right circle of ideas, and then his path back out of it to a progressive, liberal circle of ideas.

YouTube is full of channels presenting different ideas, from cooking shows to sewing shows to computer repair shows to politics shows.

Of course our society is highly concerned about political points of view, and whether one's politics leads to good choices or bad ones. And of course there is plenty of wiggle room around what one can "take" as truth or fiction when it comes to political reporting.

But the YouTube algorithm doesn't seem to care whether it is selecting political shows or computer repair shows for you. Maybe it does; from the outside we can't tell. What we can see is that the content selection algorithm tracks the things you watch, and then selects videos on similar topics for you.

What this means is - if you start watching Conservative politics videos, YouTube will start showing you more videos of that ilk. Since I watch a lot of computer repair videos and DIY Power Wall videos, YouTube selects more videos of that sort for me. And I watch enough "liberal politics" videos that the attached video came up as a recommendation.
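
To make that concrete, here is a minimal, entirely hypothetical sketch of a "more of the same" recommender. This is not YouTube's actual algorithm - that is proprietary and far more sophisticated - it just illustrates the feedback loop described above: topics you watch get weighted more heavily, and watching the recommendations amplifies that weighting.

```python
# Minimal sketch (NOT YouTube's real system): a "more of the same" recommender.
# It scores topics by how often they appear in the watch history, then
# recommends videos weighted toward the highest-scoring topics. Watching the
# recommendations feeds those scores back, so the feed narrows over time.

import random
from collections import Counter

TOPICS = ["computer repair", "diy powerwall", "cooking",
          "liberal politics", "conservative politics"]

def recommend(watch_history, catalog, k=5):
    """Sample k videos, weighted toward topics already in the history."""
    interest = Counter(video["topic"] for video in watch_history)
    weights = [1 + interest[video["topic"]] for video in catalog]
    return random.choices(catalog, weights=weights, k=k)

# A toy catalog: 20 videos per topic.
catalog = [{"id": f"{topic}-{i}", "topic": topic}
           for topic in TOPICS for i in range(20)]

# Start from a history containing just a couple of videos...
history = [{"id": "conservative politics-0", "topic": "conservative politics"},
           {"id": "cooking-3", "topic": "cooking"}]

# ...then, session after session, watch whatever gets recommended.
for session in range(10):
    history.extend(recommend(history, catalog))

print(Counter(video["topic"] for video in history))
# The counts skew toward whichever topics dominated the early history,
# and the skew grows with every session: the rabbit hole.
```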

What's described in the interview is a person whose political views were not firmly set. He tended away from racism and towards a more liberal point of view, but those views were not deeply held. Once he watched a few Libertarian and then Conservative videos, YouTube started showing him more and more of the same.

Watching these videos affected his thinking so much that he became a Trump supporter before the 2016 election.

But then, by the same process that radicalized him - the choices of the YouTube content selection algorithm - he flipped over to progressive shows and a more progressive point of view. Namely, YouTube inserted some content into his recommendations that got him to see something different. Before long he returned to his earlier liberal stance, and now watches a bunch of liberal/progressive shows.

YouTube content recommendation algorithm - harming society?

In part the culprit here is YouTube's content selection algorithm. Because of this algorithm we all see a filtered view of the total possible content we could be seeing. YouTube does this to encourage us to keep coming back, because we learn that YouTube can offer us things of interest.

There is a whole world of content we could watch on YouTube that we are not being shown by YouTube. YouTube doesn't show us that stuff because we've never shown an interest. What if we actually would be interested in videos about fly fishing along the Columbia River, but YouTube doesn't know this because we never asked to see such videos?
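
One plausible reason, sketched below, is that a recommender which ranks only by existing interest gives zero weight to topics you have never watched. Recommender systems commonly mix in a little random exploration to get around this; whether and how much YouTube does that, we can't see from the outside. The topics and numbers here are made up for illustration.

```python
# Sketch of why never-watched topics never surface: ranking purely by
# existing interest gives "fly fishing" zero weight forever. Mixing in a
# small exploration chance (an epsilon-greedy style choice, my assumption,
# not a documented YouTube behavior) lets new topics occasionally appear.

import random

interest = {"computer repair": 12, "diy powerwall": 7, "liberal politics": 5,
            "fly fishing": 0}   # never watched, so weight zero

def pick_topic(interest, epsilon=0.0):
    if random.random() < epsilon:                 # exploration: random topic
        return random.choice(list(interest))
    topics, weights = zip(*interest.items())      # exploitation: weight by interest
    return random.choices(topics, weights=weights, k=1)[0]

# With epsilon=0, "fly fishing" can never be picked (weight 0).
print(sum(pick_topic(interest) == "fly fishing" for _ in range(1000)))        # 0
# With epsilon=0.1, it occasionally appears, letting the profile discover it.
print(sum(pick_topic(interest, 0.1) == "fly fishing" for _ in range(1000)))   # roughly 25
```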

If YouTube's content selection algorithm leads us into rabbit holes, how do we avoid that fate?

What if we were to find content by searching for it rather than blindly trusting YouTube's content selection algorithm?

Testing the claims about YouTube's content recommendation algorithm

How do you know that what I've said in this post is accurate? Does YouTube have a content selection algorithm? If so, what does it do?

The test I'll propose is to:

  1. Set up multiple Google Accounts,
  2. Use each account to access YouTube,
  3. For each account purposely select different content to watch.

I've done this a little bit - and have seen how YouTube recommends different things in the different accounts.
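
To make the comparison less anecdotal, you could record the video IDs each account's home page recommends and measure how much the lists overlap. The sketch below assumes you've copied those IDs in by hand; the IDs shown are placeholders, not real videos.

```python
# Compare the home-page recommendations seen by two accounts.
# The recommendation lists are recorded manually; these IDs are placeholders.

def jaccard_overlap(recs_a, recs_b):
    """Fraction of recommendations the accounts share: 0 = disjoint, 1 = identical."""
    a, b = set(recs_a), set(recs_b)
    return len(a & b) / len(a | b)

account_one = ["vid-pol-1", "vid-pol-2", "vid-pol-3", "vid-repair-1"]     # fed political videos
account_two = ["vid-repair-1", "vid-repair-2", "vid-sew-1", "vid-cook-1"] # fed DIY videos

print(f"overlap: {jaccard_overlap(account_one, account_two):.2f}")  # 0.14
# A low overlap after a few days of divergent watching is evidence that
# the recommendations are being personalized per account.
```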

It's not just YouTube that has a content selection algorithm

Facebook, Twitter, Google News, and on and on all have content recommendation algorithms. Each service builds up a profile of what it thinks your interests are, and uses that profile to select content for you.

This sort of personalized service is supposed to be convenient in the ways already described.

The question is whether the society around us is being trained into separate bubbles of conversation.

In the bubble I see on Facebook the conversation circles around electric vehicles and charging stations. For someone else the conversation circles around glass bead making.

How YouTube "Radicalizes" The Alt-Right

About the Author(s)

David Herron (davidherron.com) is a writer and software engineer focusing on the wise use of technology. He is especially interested in clean energy technologies like solar power, wind power, and electric cars. David worked for nearly 30 years in Silicon Valley on software ranging from electronic mail systems, to video streaming, to the Java programming language, and has published several books on Node.js programming and electric vehicles.