
Toxic Recommendations and the 3am YouTube Binge

Note: this was written 18.03.2019 and published in a college magazine. The published version incorporates input from the magazine editors, which is not reflected below; this is my unedited submission. All footnotes are from the present, when I migrated the article to my personal site in 2022.



The Internet is an ocean of algorithms trying to tell you what to consume, from video services like YouTube and Netflix pushing content they calculate you'll watch, to social media platforms filtering and reorganising posts not only in your interest, but in their own. And this can surface toxic, divisive content and misinformation.

Recommendation algorithms collect every iota of data about me: search keywords, watch history, engagement, and a host of other undisclosed data points. This data is then used to push an array of loosely related content for me to consume, wasting my youth in front of a blue light in the early hours of the morning. But beyond wasting time, algorithmic suggestions have been found to amplify conspiracy theories, pseudoscience, and other content traditionally associated with the alt-right.

Last year [1], Jonathan Albright, director of research at Columbia University's Tow Center for Digital Journalism, discovered how a search for “crisis actors” after the Parkland shooting led him to a network of over 9,000 conspiracy videos. This would be tolerable if platforms did not create rabbit-holes for such misinformation to spread. Since algorithms are designed to give users more of what they've been viewing, if I watched a few flat-earth conspiracy videos to make fun of (with my adblocker on, of course), I'd be led down a path of more conspiracies, including the aforementioned Parkland crisis actors. They'd stay in my recommendations for weeks on end, probably until I threw my laptop into the ocean.

This is because the business models of social media platforms seek to maximise the time we spend online. And it works, too. YouTube has reported that more than 70% of its viewing time comes from AI-driven recommendations. The thing about this, though, says Guillaume Chaslot, a former YouTube engineer, is that "AI is optimised to find clickbait". To combat this, he's created a website which documents the YouTube recommendation system's flaws: algotransparency.org [2]. A study by The Guardian using Chaslot's service found that of the top 500 'Up Next' videos recommended after a search for the word "Clinton", 81% were partisan videos favouring the Republican party. Most were slanderous accusations against Hillary Clinton.
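To make the distinction concrete, here's a toy sketch of the problem. This is mine, not YouTube's actual system or anything from Chaslot's research; every title and score is invented. It just shows how the same candidate list ranks differently depending on what the algorithm is told to optimise for:

```python
# Toy illustration (not YouTube's actual system): two ways to rank
# the same candidate videos. All titles and numbers are made up.

candidates = [
    # (title, predicted watch minutes, user-reported satisfaction out of 5)
    ("SHOCKING truth THEY don't want you to see", 14.0, 2.1),
    ("Calm, sourced explainer on the same topic",  6.0, 4.5),
    ("Flat-earth 'debate' compilation",           11.0, 1.8),
]

# Optimising for engagement: clickbait wins, because it keeps you watching.
by_watch_time = sorted(candidates, key=lambda v: v[1], reverse=True)

# Optimising for satisfaction: the measured-but-shorter video wins.
by_satisfaction = sorted(candidates, key=lambda v: v[2], reverse=True)

print("Up next (watch time):   ", by_watch_time[0][0])
print("Up next (satisfaction): ", by_satisfaction[0][0])
```

Both rankers see the exact same videos; only the objective changes. And the objective is a business decision.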

I'm not trying to say 'how dare YouTube favour Trump over Clinton'; it's more that bias and corporate interest exist, and we should be aware of this when we view content online. Oftentimes this bias leans towards right-wing populist videos, as they use clickbait tactics which satisfy the AI recommendation system. The danger is that these videos make sweeping assumptions, based on fear and hysteria and without any evidence, with the aim of radicalisation. You'll know what I'm talking about if you watch a single Alex Jones video. I don't need to go into detail about the rise of white supremacist thinking fuelled by the endless hours of anti-SJW videos recommended to users on YouTube dot gov. This is especially worrying considering more than half of YouTube's audience use the platform for news and information (Pew Research Center, 2018) [3].

And if social media platforms continue to promote toxic recommendations riddled with misinformation and hysteria-fuelled assumptions, we'll continue to see a rise in alt-right and fascist recruitment all around the world.

The only way we can combat the Internet's petri dish of conspiracy, fabrication, and hateful rhetoric is for platforms to focus on viewer satisfaction rather than viewing time. Up-and-coming start-ups like Canopy focus on delivering a small handful of quality items to read or listen to every day, based on data kept on the user's device. Podcast app Himalaya tested a version where users were asked, point blank, what they wanted from recommendations, and tuned them accordingly. Users consumed 30% more of the content they said they wanted.

Some are hopeful that Silicon Valley tech companies will follow start-ups and smaller platforms. Others, like myself, are a little more pessimistic. I'm not sure that big companies are willing to radically change their highly successful business models. I mean, YouTube has a market value of more than $75 billion (CNBC). Mark Zuckerberg alone is worth $67 billion (Variety). Twitter's market cap is $24.7 billion (Forbes) [4].

These platforms aren't market leaders because users are satisfied with the content they host. It's because we're coerced into hours of falling down rabbit-hole after rabbit-hole by data-hungry algorithms that push corporate interests. Even if those rabbit-holes promote harmful alt-right ideologies that have a role to play in terror attacks like those in Charlottesville, Pittsburgh, and now, Christchurch [5].

After all, clickbait rules everything around me.


[1] Since this was written in 2019, I am referring to 2018.

[2] I wrote this for print originally. I decided to update the article with as many hyperlinks as I could, for usability.

[3] Unfortunately, I did not keep any references. Didn't really feel like it, since I was already having to do Harvard citations for every first-year university assignment at the time.

[4] Of course, these figures are out of date, and in the years since they have probably increased. Or decreased, in Facebook's case?

[5] The mass shooting at a mosque in Christchurch was kind of my main reason for writing this article. Look, I know it's not even a drop in the ocean, but this was cathartic for me back then. Maybe because I was super left-wing (in the social justice sense) at the time, it affected me quite deeply. These days I am a little more jaded. The world is also very much more fucked up, which has hardened people, I think, not just me. I still agree that algorithms suck, but not because of politics. People will discover that stuff on their own, whether they want to or not. People are not sheep. Not always, anyway.