Something’s off, but you can’t quite name it. It’s the moment you get home after staying with friends and an influencer using their exact coffeemaker pops up on your Instagram feed. It’s the split-second after an actor delivers a quippy line on a streaming series, when you try to parse whether the scene has already become a meme or was simply written to court one. It’s the new song you’ve been hearing everywhere, only to discover it’s an ‘80s deep cut, inexplicably trending on TikTok.
There is a name for this uneasiness. It’s called “algorithmic anxiety,” and it’s one of the main subjects of Kyle Chayka’s new book, Filterworld: How Algorithms Flattened Culture. A staff writer for The New Yorker, Chayka charts the rise of algorithmic recommendations and decision-making. He shows how culture has slowly started effacing itself to fit more neatly within our social media platforms’ parameters.
Algorithms, Chayka reminds us, don’t spring from the machine fully-formed. They’re written by humans—in this case, humans employed by the world's biggest tech conglomerates—and their goal is simple: to prioritise content that keeps us scrolling, keeps us tapping and does not, under any circumstances, divert us from the feed.
Filterworld shows us all the ways this can manifest, both online and IRL, into a kind of contentless content. Songs are getting shorter, because it only takes 30 seconds to rack up a listen on Spotify. Poetry has enjoyed an unexpected revival on Instagram, but mostly when it’s universal, aphoristic and neatly formatted to work as image as well as text.
There’s the phenomenon of the “fake movie” on streaming services like Netflix. These cultural artefacts have actors, plots, settings—all the makings of a real film. But they still seem slickly artificial, crowd-sourced and focus-grouped down to nothing.
If our old tech anxiety amounted to well-founded paranoia (“Are they tracking me? Of course they are.”), the new fear in Filterworld is more existential: “Do I really like this? Am I really like this?” Is the algorithm feeding us the next video, the next song, tailored to our unique taste? Or is it serving us the agglomerated preferences of a billion other users? Users who, like us, may just want something facile and forgettable to help them wind down at the end of the day.
Chayka doesn’t give us easy answers at the end of Filterworld. He does, however, offer an alternative to the numbing flow of the feed: taste! Remember taste? We still have it, although the muscles may have atrophied after so many of us have ceded our decision-making abilities to the machines.
Rediscovering our personal taste doesn’t have to be an exercise in high culture or indie elitism. But it does require what Chayka calls the conscientious consumption of culture: seeking out trusted curators, seeking out culture that challenges us and taking the time to share with others what we love.
To go deeper, Esquire sat down with Chayka to talk about the cultural equivalent of junk food, the difference between human and algorithmic gatekeepers, and why “tastemaker” doesn’t need to be a dirty word. This interview has been edited for length and clarity.
ESQUIRE: Let me start with a slightly provocative question. Is there anyone with a bigger grudge against algorithms than journalists?
KYLE CHAYKA: Well, journalists are known to have a grudge against algorithms. I can speak to my own dislike of them. Just because they’ve taken away this filtering, tastemaking function that journalists have had for so long. But through the course of the book, I talk to all sorts of creators who hate algorithms just as much.
It’s the illustrator who got trapped into doing one bit on Instagram because it succeeded all the time. Or the influencer whose hot selfies get tons of likes but whose actually earnest, artistic posts don’t get any attention. In the book, I interview coffee shop founders around the world, and even they are like, “I hate the algorithm because I have to engage with all these people’s photos of my cappuccinos.” Everyone feels kind of terrorised.
Maybe journalists were just part of the first wave to realise this?
I think journalists are often canaries in the coal mine, partly because we complain the loudest about everything. But you could see the impact of algorithmic feeds in the media really early on. We moved from consuming news on cable TV or in a newspaper or even on a website homepage to consuming stories the majority of the time through social media feeds. And that just takes away so much control.
A newspaper front page or a website homepage is a human-curated, thought-through intentional thing that highlights important stuff, along with fun stuff, along with goofy stuff. There was an intention and a knowledge to that, which algorithmic feeds have just totally automated away.
Let’s take it from news to culture, which is really the focus of your book. Filterworld explains that the algorithms driving social media exist to keep us engaged as long as possible. The result is a kind of flattening of culture. Our social feeds privilege content that’s easily digestible so we can keep on grazing. What happens to us when all the culture we consume is flattened like that? And we’re not pushed to seek out new things, or to just try something that makes us uncomfortable? What happens to us when we aren’t getting any nutrients, you could say, from the feed?
It makes me think of the cultural equivalent of junk food. It’s engineered to appeal to you. To engage your senses in ways you might not even like, per se, but it’s just so chemically perfect. I talk a lot about how creators feel pressure to conform in certain ways to the feed. Consumers also have to conform in a way. Algorithmic feeds push us to become more passive consumers, so that we don’t really think about what we’re consuming. We float along on the feed without thinking about our own taste too much. I feel like that makes us into more boring people. It makes the cultural landscape less interesting. But it also takes away this opportunity for us to encounter art that is really shocking or surprising or ambiguous.
Take the example of a Spotify playlist. You start by listening to something that you choose. Then Spotify pushes you along on this lazy river of music that is similar to what you put on and is not going to disrupt your experience but it’s also not going to push you anywhere new. It’s not going to try to disrupt you; it’s not going to try to challenge your taste. In the book I contrast that with an indie radio DJ who is making these intentional choices to put songs next to each other that don’t really fit but have some kind of implied meaning based on their proximity. Algorithmic feeds fundamentally can’t create meaning by putting things next to each other. There’s no meaning inherent in that choice because it’s purely automated, machine choice. There’s no consciousness behind it.
You talk a lot about curators in Filterworld. What else can a curator do for us that an algorithm cannot do? Why should we trust them more than an algorithm?
Curating as a word has this very long history, dating from Ancient Rome through the Catholic priesthood. It has always had this meaning of taking responsibility for something. I feel like curators now take responsibility for culture. They take responsibility for providing the background to something, providing a context, telling you about the creator of something, putting one object next to others that build more meaning for it. So curating isn’t just about putting one thing next to another; it’s all this background research and labour and thought that goes into presenting something in the right way.
That’s true of a museum curator who puts together an art exhibition. It’s true for a radio DJ who assembles a complicated playlist. It’s true for a librarian who chooses which books to buy for a library. But it’s not true for a Spotify algorithmic playlist. The Twitter feed is not trying to contextualise things for you with what it feeds to you. It’s just trying to spark your engagement. TikTok is maybe the worst offender because it’s constantly trying to engage your attention in a shallow way. But it’s absolutely not pushing you to find out anything more about something. There’s no depth there, there’s no context. It actively erases context, actually. It makes it even harder to find.
But we know curators can have their own agendas. What’s the difference between, say, a magazine editor who needs to please their advertisers and a tech company looking after their bottom line? Is there a difference?
There’s this transition that I write about in the book from human gatekeepers to algorithmic gatekeepers, so moving from the magazine editors and the record label executives to the kind of brute mathematics of the TikTok ‘For You’ feed. I think they both have their flaws. The human gatekeepers were biased. They were also beholden to advertisers; they had their own preferences and probably prioritised the people that they knew in their social circles. Whereas the flaw of the algorithmic feed is that while anyone can get their stuff out there, the only metric by which they’re judged is: How much engagement does it get? How much promotion does it merit based on the algorithmic feed?
So they’re both flawed. The question is: which flaws do we prefer? Or which flaws do we want to take with their benefits? The ability of the human gatekeeper was to highlight some voice that would be totally surprising or shocking—to highlight some new and strange thing that totally doesn’t fit with your preconceived notions of what art or music or writing is. The algorithmic feed can’t really do that because it’s only able to measure how much other people already consider it popular.
The advertiser thing—another hobbyhorse of mine is Monocle magazine, which has existed for a decade or two now. It’s a print magazine with a very nice mix of shopping and international news and culture and profiles. That magazine does really well selling print ads because they put print advertising in a good context with good articles. The advertisers appreciate the quality of the content that surrounds it. So that’s a net positive for everyone. Whereas with the internet now, the advertisers are almost in a war with the platforms just as much as the users are. Advertisers don’t want their content appearing willy-nilly, messily next to the crappy content the algorithmic feeds promote, which at this point might be snuff videos or videos of bombings in Gaza. That’s not serving either users or advertisers.
The other night, I was scrolling through this beautiful, curated interiors account and then there was an ad for Ex-Lax, just dropped in the middle of this very aspirational stuff.
That collision to me is the case in point. It’s so useless, and so not productive for either party, that it just feels like a glitch, you know? And that’s because of algorithmic targeting. It’s because these feeds don’t prioritise anything besides engagement.
Places like Monocle, for instance, cater to a relatively small readership. It’s not for everybody; it’s for this smaller subset of people who consider themselves clued-in. We’re getting into a sticky discussion about taste and tastemaking here, but: how do these more niche platforms react against the algorithm?
Tastemaking is a really complicated topic. I think it strikes a lot of people as elitist because you're talking about what people should like and why they should like it, and why I know something that you don’t. “I’m going to tell you something, and it's going to heighten your sensibilities or lead you somewhere different.” That can be intimidating, it can be pretentious, it can be alienating, it can be very biased in class ways, identity ways, all sorts of ways.
But I almost feel like it has to be defended at this point, just because we’re all so immersed in automated feeds. We’re consuming so much through different platforms that we’ve kind of lost touch with the human tastemaker. We all have voices we love following on Twitter or Instagram or TikTok but those voices get lost in the feed. We sometimes lose track of them and we sometimes don’t see their content. Those feeds are also not serving those creators particularly well because the business models are all based on advertising and the creators don’t get access to the bulk of that revenue. Through the book, I propose that one answer to Filterworld, to the dominance of these algorithmic feeds, is to find those human voices. Find tastemakers who you like and really follow them and support them and build a connection with those people.
Thinking about your own taste doesn’t have to be elitist. Fundamentally it’s just about creating a human connection around a piece of culture that you enjoy, and that should be open to anyone. It’s literally telling a friend why you like this specific song, or saying, “We should go see this movie, because I like the director because of XYZ reasons.”
Tastemaking is almost just being more conscientious about cultural consumption, being more intentional in the way that we’ve become totally intentional about food, right? Food is such a source of identity and community, and we take pride in what we eat, what restaurants we go to, what we cook. I would love it if people took more pride in going to a gallery, going to a library, going to a concert series at a concert hall. I think those are all acts of human tastemaking that can be really positive.
And all the things you mentioned are also things outside the house.
Yes. You’re coming together with other people in appreciation of the kind of culture you like to consume. And that’s really good. That helps everyone.
I want to finish by talking about the idea of ambient culture. You clearly appreciate ambient music, and in Filterworld you describe genres like lo-fi hip-hop and Japanese city pop as music that feels almost designed for the algorithm. Our feeds seem to push us toward ambient content: stuff that’s frictionless and easy to ignore. So I’m wondering, is that always a bad thing? When is ambience necessary and when is it detrimental?
I do really enjoy ambient content. My first book was about minimalism, which has a kind of ambient quality. I wrote an essay about Emily in Paris and ambient TV. I’ve written about Brian Eno a lot, the musician who coined the term “ambient music.” That kind of art fulfils a function: to put your brain at rest. It provides a pleasant background at a technological moment when we have a lot of distractions. Ambient TV is maybe the perfect TV to look at your phone in front of. It relies on the presence of that second screen to complement it. The TV show doesn’t have to be that interesting because your phone is interesting.
The problem becomes that through algorithmic recommendations, so much content is pushed towards ambience, and you never want all of your stuff to be ambient. You don’t only want to consume ambient art because then what are you actually paying attention to? If everything exists as a soothing background, what’s actually provoking you? What’s leading you somewhere new?
I think the critique goes back to Brian Eno’s definition of ambient music, which was that the music has to be “as ignorable as it is interesting.” You have to be able to ignore it. It can be in the background, but you should also be able to pay attention to it and be rewarded by your attention to it. I feel like a lot of culture now only falls into that former category. You’re only able to ignore it. Once you start paying attention, there’s nothing really gripping there. Certainly with TikTok and Spotify playlists, there’s this prioritisation of the soothing, numbing quality of ambient content. Functional stimulus in the form of culture is so big these days, whether it’s ambient music or ASMR videos.
Sleep sounds…
So now sometimes, culture exists in a functional context rather than an artistic context. You’re like, “Oh I watch The Office to fall asleep,” or, “I listen to this track while I run because it sustains my exercise.” I personally always want to make an argument for culture for its own sake and for thinking deeply about artistic process and ideas.