What Algorithms Must Destroy to Work & The Flattening of Human Connection
Every digital platform that mediates human connection has to flatten it first, and the cost of that compression is something we still haven't reckoned with.
There’s a thing that happens when you’re standing somewhere genuinely stunning, some landscape that stops you mid-step, and you reach for your phone to take a picture. You frame it, you tap the shutter, you look at the result, and it’s just... not that. The picture might even be good. But it doesn’t look anything like what you’re seeing. There’s something missing, something the camera can’t hold.
I’ve been turning that feeling over for years now, because I think it’s pointing at something much larger than photography. It’s pointing at a fundamental limitation of digital mediation, one that runs through every platform we’ve built to connect people at scale, and one we keep pretending doesn’t exist.
The compression problem
Digital mediation, by definition, quantises human expression. It takes something continuous, something alive with infinite variables, and reduces it to something a machine can process. Something is always taken away. Something is always compressed, always removed.
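That claim, that digitisation is inherently lossy, can be made concrete with a toy sketch. This is my illustration, not anything from the essay itself; the signal and all the parameters are invented. Sampling makes time discrete, quantising makes amplitude discrete, and the round trip never hands back exactly what went in:

```python
import math

def signal(t):
    """A stand-in for something continuous: a pure 440 Hz tone."""
    return math.sin(2 * math.pi * 440 * t)

def digitise(f, sample_rate=8000, levels=16):
    """Sample and quantise a continuous function: the two lossy
    steps every digital medium performs."""
    samples = []
    for n in range(sample_rate):
        t = n / sample_rate
        x = f(t)                               # sampling: time becomes discrete
        q = round((x + 1) / 2 * (levels - 1))  # quantising: amplitude becomes discrete
        samples.append(q / (levels - 1) * 2 - 1)
    return samples

digital = digitise(signal)

# Even measured only at the sample points, the reconstruction is never
# exact: the quantiser can be off by up to half a step, and whatever
# fell between the levels is gone for good.
error = max(abs(signal(n / 8000) - digital[n]) for n in range(8000))
print(f"worst-case loss per sample: {error:.4f}")
```

You can shrink the error by adding more levels or more samples, but you can never make it zero with finitely many of either, which is the whole point: the loss is not an engineering shortfall, it is the mechanism.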
This isn’t a bug. It’s the mechanism. It’s how the thing works.
Consider live music versus a record. A live performance has something a recording can never have, because the record is going to play the same way every single time. There’s no room for expression, no room for anything contingent, anything time-bound, to enter the equation. The performer’s mood, the acoustics of the room, the cough from the third row that shifts the timing of a phrase by half a second: all of that’s gone. The record is perfect in the way that only dead things are perfect. It’s fixed. And in being fixed, it’s lost something essential.
Now scale that same principle to human relationships.
What gets lost in the flattening
The thing is, human relationships have always been mediated. That’s not new. They’ve been mediated by geography, by religious institutions, by family, by workplaces, by mutual friends. But none of those mediators required the same sacrifice that digital ones do.
What makes digital mediation different is the mechanism. To process relationships at scale, algorithms have to flatten people into signals: engagement patterns, keyword matches, behavioural data. Multi-dimensional human beings get compressed into something software can handle. And what’s lost in that compression is far more than we tend to acknowledge.
Take the mutual friend as an example. If someone introduces you to a friend of theirs, that friend can account for nuance. They can say, “Look, he seems quiet at first, but give him twenty minutes and he’s the funniest person in the room.” Or, “She comes across as intense, but that’s because she actually cares about what you’re saying.” A mutual friend can account for the gap between how someone presents in a snapshot and who they actually are. They can account for consciousness.
An algorithm can’t do that. It can’t account for consciousness. It can process signals: how long someone looked at a profile, which photos got the most engagement, what keywords appear in a bio. But the distance between those signals and the actual texture of a human being is enormous. And that distance is where everything that matters about connection actually lives.
The neurological cost
If your brain acclimatises to swiping through TikTok or Instagram Reels, to consuming human beings as ‘content’ at the rate of one every three seconds, how is it supposed to cope with the slowness of real relationships? The patience required. The ambiguity. The discomfort of sitting with someone and not knowing where a conversation is going.
There’s a parallel I keep coming back to, between social media’s effect on our neurology and the documented effects of pornography. Both involve the over-stimulation of reward systems through compressed, mediated versions of something that’s supposed to be experienced in full fidelity. Both create a gap between expectation and reality that makes the real thing feel somehow insufficient. And both operate through the same basic mechanism: take something complex, strip it down to its most stimulating signals, and deliver those signals at a frequency the brain wasn’t designed to cope with.
The infinite scroll is part of this. TikTok, Instagram, all of them: you never have the experience of closing the loop. There’s no moment where you’ve finished. No moment where the thing is done and you can set it down. It’s an unclosed loop, running constantly, and I think that hidden absence of completion is doing more damage to mental health than the content itself.
Creation doesn’t fix it either
Here’s what surprised me. I’ve gone from nearly 100% content consumption to roughly 90% content creation. And I don’t feel any better for it.
I thought the problem was passivity. I thought if I switched from consuming to creating, from scrolling to making, the flatness would lift. But it hasn’t, not fundamentally. Because the medium itself is the issue. Whether you’re consuming compressed human expression or producing it, you’re still operating within a system that can’t hold the full thing. Every time I log into Substack or LinkedIn, it’s just the same stuff. An ocean of content that’s been processed into the same shape, the same cadence, the same optimised structure.
The creation side is arguably worse in some ways, because it adds a layer of performance. You’re not just experiencing the compression; you’re actively compressing yourself. Flattening your own thinking into formats the algorithm can distribute.
The Instagram spot problem
There’s a telling behaviour around those famous Instagram spots, the locations that become popular specifically because they photograph well. People travel to them, but I’m not convinced they’re going to experience the place. They’re going to acquire the asset. The photo. The proof of having been there. It’s a kind of status collection, trophies on a digital shelf.
The experience itself, the actual standing-in-a-place-and-being-stunned-by-it part, is almost incidental. The platform has inverted the relationship between experience and documentation. The documentation is the point. The experience is just the means of production.
And this connects back to the compression problem, because what you end up with is a photograph that doesn’t capture what was there, shared on a platform that compresses it further, consumed by people scrolling past it in under a second. At every stage, something is stripped away. The landscape becomes a rectangle. The rectangle becomes a thumbnail. The thumbnail becomes a data point in an engagement metric.
What previous mediators preserved
Geography mediated relationships by limiting who you could meet, but it didn’t flatten the people you did meet. A church mediated relationships through shared ritual and moral framework, but it didn’t reduce its members to swipeable profiles. A workplace mediated relationships through proximity and shared purpose, but it preserved the full complexity of daily human interaction.
What’s distinctive about algorithmic mediation isn’t that it mediates. It’s what it has to destroy in order to do so. Previous mediators were constraints on access. Digital mediators are constraints on fidelity. They don’t limit who you can reach; they limit how much of a person can travel through the connection.
That’s a fundamentally different trade-off, and I don’t think we’ve properly understood what we’ve given up by making it.
The trust question
There’s a deeper question underneath all of this about whether trust itself can be computed. Blockchain tried to remove trust as a factor entirely, replacing it with verification: ‘Don’t trust, verify’. But I think that gets it backwards. What we actually want, fundamentally, is to encourage trust (not least because trust and economic prosperity are positively correlated; more on this in a forthcoming article). The question is on what basis.
A mutual friend gives you a basis for trust that’s grounded in shared experience and accumulated knowledge of a person. A verified profile gives you a basis for trust that’s grounded in... what, exactly? That someone’s photos match their face? That their employment history checks out? These are signals, but they’re thin ones. They’re the data that survived the compression.
The thing I keep circling back to is that there’s no way to mediate human relationships at scale without flattening them. Without thinning them out. Without removing a lot of the things that are important for human connection. The furniture maker carving wood in a workshop, shaping something purely through consciousness and physical contact with the material, that’s the opposite end of the spectrum. That’s human expression utterly unmediated. And everything we build digitally moves further from it.
I don’t have a clean answer for what to do with this. I’m not sure there is one. But I think the first step is being honest about the trade-off: that every platform promising to connect us is, by the mechanics of how it works, also compressing us. And that compression has a disconnection cost we’re only beginning to measure.
If this piece added something to your week, please consider subscribing :)