Netflix’s dystopian Korean drama Squid Game has become the streaming platform’s biggest-ever series launch, with 111 million viewers watching at least two minutes of an episode. Out of the thousands of programmes available on Netflix globally, how did so many people end up watching the same show? The easy answer is an algorithm – a computer program that offers us personalised recommendations on a platform based on our data and that of other users.
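To see roughly what "based on our data and that of other users" means in practice, here is a toy sketch of collaborative filtering in Python. The viewers, titles and scores are invented and real recommendation systems are far more sophisticated, but the principle is the same: people who watched what you watched are used to guess what you might watch next.

```python
# A minimal, invented sketch of user-based collaborative filtering:
# recommend unseen titles that similar viewers have already watched.
from math import sqrt

# Hypothetical watch histories: 1.0 means the viewer finished the title.
watch_history = {
    "ana":   {"Squid Game": 1.0, "Dark": 1.0, "Narcos": 0.0},
    "ben":   {"Squid Game": 1.0, "Dark": 0.0, "Narcos": 1.0},
    "chloe": {"Squid Game": 0.0, "Dark": 1.0, "Narcos": 0.0},
}

def similarity(a, b):
    """Cosine similarity between two viewers' watch vectors."""
    shared = set(a) & set(b)
    dot = sum(a[t] * b[t] for t in shared)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def recommend(user, history, top_n=1):
    """Score titles the user hasn't seen by how much similar viewers watched them."""
    scores = {}
    for other, titles in history.items():
        if other == user:
            continue
        sim = similarity(history[user], titles)
        for title, watched in titles.items():
            if history[user].get(title, 0.0) == 0.0 and watched > 0.0:
                scores[title] = scores.get(title, 0.0) + sim * watched
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(recommend("chloe", watch_history))  # ['Squid Game']
```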
Streaming platforms like Netflix, Spotify and Amazon Prime have undoubtedly reshaped the way we consume media, primarily by massively increasing the amount of film, music and TV available to viewers.
How do we cope with so many options? Services like Netflix use algorithms to guide our attention in certain directions, organising content and keeping us active on the platform. As soon as we open the app, the personalisation process begins.
Our cultural landscape is now automated rather than simply being a product of our previous experiences, background and social circles. These algorithms don’t just respond to our tastes; they also shape and influence them.
But focusing too much on the algorithm misses another important cultural transformation. To make all this content manageable, streaming platforms have introduced new ways of organising culture for us. The categories used to sort culture into genres have always been important, but they have taken on new forms and new power with streaming.
Classifying our tastes
The possibilities of streaming have inspired a new “classificatory imagination”. I coined this term to describe how viewing the world through genres, labels and categories helps shape our own identities and sense of place in the world.
Fifty years ago, you might have discovered a handful of music genres through friends or trips to the record shop; streaming has brought classification and genre to our media consumption on a grand scale. Spotify alone has over five thousand music genres. Listeners also come up with their own genre labels when creating playlists. We are constantly fed new labels and categories as we consume music, films and television.
Thanks to these categories, our tastes can be more specific and eclectic, and our identities more fluid. These personalised recommendations and algorithms can also shape our tastes. My own personalised end-of-year review from Spotify told me that “chamber psych” – a category I’d never heard of – was my second-favourite genre. I found myself searching to find out what it was, and to discover the artists attached to it.
These hyper-specific categories are created and stored in metadata – the behind-the-scenes codes that support platforms like Spotify. They are the basis for personalised recommendations, and they help decide what we consume. If we think of Netflix as a vast archive of TV and film, the way it is organised through metadata determines what gets discovered within it.
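As a rough illustration of how metadata can steer discovery, here is a small sketch in which invented titles carry hidden genre tags and a simple overlap score decides what surfaces first. This is not how Netflix actually ranks its catalogue, only a schematic of the idea that labels in the metadata, rather than the viewer, do much of the organising.

```python
# Illustrative sketch: titles carry behind-the-scenes genre tags, and a
# taste profile built from past viewing decides what is surfaced first.
# Titles and tags are invented for the example.
catalogue = {
    "Squid Game":     {"korean", "tv thriller", "drama", "survival"},
    "Money Heist":    {"spanish", "tv thriller", "heist"},
    "Midnight Diner": {"japanese", "drama", "food"},
}

# A taste profile inferred from what the viewer has already watched.
taste_profile = {"tv thriller", "drama"}

def rank_by_metadata(profile, titles):
    """Order titles by how many of their metadata tags match the profile."""
    return sorted(titles, key=lambda t: len(titles[t] & profile), reverse=True)

for title in rank_by_metadata(taste_profile, catalogue):
    print(title, catalogue[title] & taste_profile)
```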
On Netflix, the thousands of categories range from familiar film genres like horror, documentary and romance, to the hyper-specific “campy foreign movies from the 1970s”.
While Squid Game is publicly labelled with the genres “Korean, TV thrillers, drama”, there are thousands of more specific categories in Netflix’s metadata shaping our consumption. The personalised homepage uses algorithms to offer you certain genre categories, as well as specific shows. Because most of this classification lives in the metadata, we may not be aware of which categories are being served to us.
Take Squid Game – it may well be that its enormous launch was partly down to the algorithmic promotion of widely watched content. Its success is an example of how algorithms can reinforce what is already popular. As on social media, once a trend starts to catch on, algorithms can direct even more attention toward it. Netflix’s categories do this too, telling us which programmes are trending or popular in our local area.
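A toy simulation makes this feedback loop visible. In the invented example below, the platform promotes whichever title is currently most watched, and the promoted title then attracts most of the new views, so its lead keeps widening.

```python
# Toy illustration of popularity reinforcing itself: each week the
# platform promotes the currently most-watched title, which then draws
# far more new views than the rest. All numbers are invented.
views = {"Squid Game": 120, "Niche Documentary": 110, "Indie Film": 100}

for week in range(1, 5):
    trending = max(views, key=views.get)      # what the homepage pushes
    for title in views:
        # Promoted titles reach many more viewers than unpromoted ones.
        views[title] += 500 if title == trending else 50
    print(f"week {week}: trending = {trending}, views = {views}")
```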
Who is in control?
As everyday media consumers, we are only beginning to understand the workings and potential of these recommendation algorithms. We should also consider some of the potential consequences of the classificatory imagination.
The classification of culture could shut us off from certain categories or voices – this can be limiting or even harmful, as is the case with how misinformation is spread on social media.
Our social connections are also profoundly shaped by the culture we consume, so these labels can ultimately affect who we interact with.
The positives are obvious – personalised recommendations from Netflix and Spotify help us find exactly what we like among an otherwise overwhelming number of options. The question is: who decides what the labels are, what gets put into these boxes and, therefore, what we end up watching, listening to and reading?
This article is republished from The Conversation under a Creative Commons license. Read the original article.