When everything is urgent, nothing is
We’re struggling with information overload. Everything online is competing for our attention. You unlock your phone to check your messages and ten minutes have disappeared. You’ve skimmed a war headline, watched a stranger’s morning routine, browsed products in a random store, and forgotten why you even picked up your phone.
The sheer volume of information online needs to be filtered before we can attempt to understand what is happening around us. Unfortunately, we haven’t solved this problem, and the current solutions have insidious business objectives built into them.
This is having a huge impact on us. We’re constantly connected yet feel disconnected, and our rest is being interrupted with endless feeds that produce more fatigue. So in a culture of infinite entertainment, why are many of us experiencing boredom?
For starters, feeds remove context and hierarchy. Other media, like books, newspapers and programs, carry implied hierarchy through layout and timing. Programs follow a logical flow, with beginnings, pauses and endings. Even when trivial content is present, the medium signals how the information is to be understood.
This isn’t the case with feeds. You will likely see a news story beside a meme, a personal experience next to an irrelevant fact, or a child in a war-torn country next to a looksmaxxing tutorial. Major events share the same space as something trivial. A thoughtful essay and a viral clip compete for the same few seconds of your attention. You’re learning about an overseas tragedy when an advertisement for a new car suddenly appears.
The context that helps us interpret the information we are seeing is fading, and we’re left with many fragments of unrelated narratives competing for our attention. This isn’t abundance at play, it’s compression. All this different information is arriving at your screen with the same weight, urgency, and demand for response. When everything looks equally important, importance itself becomes harder to perceive.
Feeds shape what counts as content. What fits into the format will survive, and what doesn’t will fade. And feeds encourage brevity, immediacy, and constant updates. So, ideas that require duration or deep thought struggle to compete with the fragments made to grab your attention instantly. Over time, the content begins to bend toward what the format is rewarding. Depth loses its place in the shallows of the feed.
We find truth in what relentlessly surfaces and relevance in what’s highly engaging. The feed doesn’t just surface information; over time, you learn to think in its shape.
Feeds wouldn’t exist without algorithms, but it’s not the code that’s the problem; it’s the business objectives programmed by humans. We’re at fault. These recommendation systems don’t decide what matters on their own. They amplify the objectives built into them: attention, growth, and engagement.
Algorithms learn by optimising towards programmed outcomes, adjusting themselves in ways that even their creators do not fully understand. Which means that, over time, the logic responsible for what we see slowly disappears from view. At first, feeds can feel unpredictable, even chaotic, because the algorithm is testing different content to see what holds your attention. As patterns emerge, repetition replaces discovery, and the environment narrows around what works. That’s when the same themes keep appearing, and you start to build narratives around them.
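That explore-then-narrow dynamic can be sketched in a few lines of code. This is not any platform’s actual system; the topics, engagement rates, and epsilon-greedy strategy below are all illustrative assumptions. The point is only that a loop optimising for “what held attention” starts out sampling widely and ends up repeating whatever performs best:

```python
import random

random.seed(0)

# Hypothetical topics with hidden "holds attention" rates.
# The recommender never sees these directly; it only observes clicks.
ENGAGEMENT = {"memes": 0.6, "news": 0.3, "essays": 0.1}

counts = {t: 0 for t in ENGAGEMENT}  # how often each topic was shown
clicks = {t: 0 for t in ENGAGEMENT}  # how often it held attention

def recommend(epsilon=0.1):
    """Mostly exploit the best-performing topic; occasionally explore."""
    if random.random() < epsilon or not any(counts.values()):
        return random.choice(list(ENGAGEMENT))
    return max(counts, key=lambda t: clicks[t] / counts[t] if counts[t] else 0)

for _ in range(5000):
    topic = recommend()
    counts[topic] += 1
    if random.random() < ENGAGEMENT[topic]:
        clicks[topic] += 1

# Early exploration gives way to repetition: the topic with the highest
# engagement rate dominates, regardless of how important it is.
print(counts)
```

Nothing in the loop asks whether a topic matters, only whether it performs. The narrowing is a by-product of the objective, not a judgment anyone made.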
And this is where the problem shifts from a technical one to a psychological one. The environment doesn’t just present you with more information, it begins to shape your assumptions and world views.
The algorithms decide what we see, and because we’re bombarded with so much information, we default to skimming just to keep up. This is how information loses its weight. We don’t go deep enough, or stay long enough, for it to move us toward action or resolution; we simply pass through it.
This isn’t an entirely new problem; it began with the telegraph. Before it, news arrived within a community where decisions could be made to address it. The telegraph introduced a flood of distant information disconnected from action or resolution. It created a sense of urgency but changed nothing about how you lived. The feed didn’t invent these conditions, but it scaled them dramatically.
Today, fragments of information arrive from everywhere, and all at once. Distant crises, trending debates, and personal updates from strangers. We might see what is happening in the world, but we don’t always understand why it matters, or what it asks of us. Everything begins to feel urgent, but nothing feels complete.
When our minds are flooded with information that doesn’t lead to action, that information begins to lose meaning, and we disengage.
In “Life & Fate”, Vasily Grossman wrote that the freedom of human life is inseparable from its uniqueness, the unrepeatable way each of us experiences the world. Meaning comes from what cannot be repeated: particularity.
But algorithms don’t work like that. They learn through millions of our interactions and amplify what scales, what holds our attention, what repeats. This begins to flatten human difference. The subtle personal perspectives struggle to surface, while familiar patterns rise again and again.
Algorithmic feeds endanger the singular human perspective. When an environment prioritizes scalable engagement, individuality begins to be treated as noise because it doesn’t fit easily into patterns. The feed becomes more than a filter; it shapes what feels real because of what becomes visible.
There is too much information to encounter the world without some form of mediation. So we rely on editors, teachers, friends, writers, institutions, and machines to help decide what information reaches us. Filters are a necessity, but which filters should we trust, and why?
One response has been to place a personal algorithm over existing ones, so that it can summarize feeds or further customize them for you. This sounds promising, but adding another opaque system over an already opaque one doesn’t restore meaning. It risks compounding the same problems, resulting in more abstraction, more mediation, more distance between the person and the world they are trying to understand.
Another response has been to give users some control over what is suggested to them via platform settings: I don’t want to see anything related to sport or crypto, for example. This sounds promising, but the underlying incentives don’t change. These platforms are designed to keep you engaged for as long as possible; your attention is the currency. The feed may feel more personalized, but it’s still optimizing for outcomes that aren’t aligned with your goals.
With the scale of information today, some form of curation is essential, and abandoning technology altogether isn’t an option. Too much of our life flows through these systems, and they can be incredibly useful.
What if, instead of relying on opaque systems, we placed our trust back in people? Take the time to understand the people sharing content: the writers, artists, and researchers leading their respective fields. That seems more sensible than letting opaque systems surface content simply because it holds attention at scale.
Algoasis doesn’t claim to solve the problem of meaning, and it doesn’t try to outsmart the algorithm with another algorithm. It gives you the option to step outside the recommendation loop entirely, to turn off algorithmic suggestions and let information arrive through the people you’ve chosen deliberately.
It’s a small but meaningful change, because depth rarely comes from seeing more. It comes from staying longer with fewer signals that matter.
Technology will keep evolving, and new tools will continue to promise clearer answers. But what I believe is that we need to be more thoughtful about how we shape the environments that shape us. We need to be more willing to question the incentives behind what appears in front of us. And we need to be more intentional about where we place our trust.
If the modern feed encourages endless expansion, maybe the counter-move isn’t rejection but instead choosing particularity over pattern, and discovering what kind of world becomes visible when we do.
References
- Neil Postman — Amusing Ourselves to Death; Technopoly
- Nassim Nicholas Taleb — Fooled by Randomness
- Jenny Odell — How to Do Nothing
- Hannah Fry — Hello World: How to Be Human in the Age of the Machine