When everything is urgent, nothing is
It’s no secret that we’re struggling with information overload today. Everything online is competing for our attention. You check your messages and 10 minutes have disappeared. You’ve skimmed a war headline, watched a stranger’s morning routine, seen a highly targeted ad, and have now forgotten why you even picked up your phone.
It’s tempting to blame quantity as the sole culprit, but that’s too simplistic.
Yes, there’s more information now than at any other time in history, and with the assistance of AI, it’s being produced at an unprecedented speed.
However, volume alone doesn’t explain what many people experience: feeling constantly connected yet disconnected, and the fatigue that follows scrolling through endless feeds. In a culture of endless entertainment, why are so many of us bored?
Over the last two hundred or so years, media often arrived in contextual containers. Books unfolded in sequence, newspapers implied hierarchy through layout and timing, and programs carried a rhythm, with beginnings, pauses, and endings. Even when advertising, gossip, or trivial content was present, the medium signaled how that information should be understood.
Today, the feed has become the dominant interface through which information is filtered: a personalized reflection of the world, summarized for your consumption. You see a news story beside a meme, a personal experience next to an irrelevant fact. Major events share the same space with something trivial. A thoughtful essay and a viral clip compete for the same few seconds of your attention. You’re learning about an overseas tragedy when an advertisement for a new car suddenly appears.
This isn’t only abundance at play; it’s compression. All of this different information arrives at your screen with the same weight, urgency, and demand for response. When everything looks equally important, importance itself becomes harder to perceive.
There used to be a clearer transition between experiences: when one kind finished, another began. The context that helped us interpret the information we see is fading, and we’re left with fragments of unrelated narratives competing for our attention.
This can be a disenchanting environment. We experience the world at large through bite-sized bits of information, and try to make sense of it. But the information quickly loses meaning because nothing remains long enough to transform us.
The medium shapes what counts as content. What fits the format survives, and what doesn’t fades. Feeds encourage brevity, immediacy, and constant updates, so ideas that require duration or deep thought struggle to compete with fragments made to grab your attention instantly. Over time, content begins to bend toward what the format rewards. Depth loses its place in the shallows of the feed.
We find truth in what relentlessly surfaces and relevance in what’s highly engaging. The feed doesn’t just surface information; over time, you learn to think in its shape.
People tend to blame algorithms for these problems, but algorithms are just code; blaming them shifts responsibility away from the humans who build them. Recommendation systems don’t decide what matters on their own. They amplify the business objectives built into them: attention, growth, and engagement.
Algorithms learn by optimizing toward programmed outcomes, adjusting themselves in ways that even their creators do not fully understand. Over time, the logic responsible for what we see slowly disappears from view. Whatever content surfaces again and again starts to feel important, not because it is important, but because it satisfies the goals the system was designed to pursue. Algorithms do not create meaning, but they shape what stays visible long enough to feel like meaning.
At first, feeds can feel unpredictable, even chaotic, because the algorithm is testing different content to see what holds your attention. As patterns emerge, repetition replaces discovery, and the environment narrows around what works. The same themes keep appearing, and you start to build narratives around them. They begin to feel important, even if they are simply what held your attention in that moment. Coincidence starts to resemble coherence.
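To make that dynamic concrete, here is a toy sketch of an engagement-optimizing feed, written as a simple epsilon-greedy bandit. The topics and per-topic engagement rates are invented for the demonstration, and real recommendation systems are vastly more complex, but the shape of the behavior is the same: early exploration decays into repetition of whatever performs best.

```python
# A toy illustration (not any platform's real code) of how a feed that
# optimizes a single engagement signal narrows over time.
import random

random.seed(42)

TOPICS = ["news", "memes", "fitness", "cooking", "travel"]

# Hypothetical chance, per topic, that this particular user keeps watching.
ENGAGEMENT_RATE = {"news": 0.30, "memes": 0.70, "fitness": 0.40,
                   "cooking": 0.35, "travel": 0.25}

shown = {t: 0 for t in TOPICS}    # how often each topic was served
engaged = {t: 0 for t in TOPICS}  # how often the user engaged with it

def pick_topic(step: int, total_steps: int) -> str:
    """Epsilon-greedy: explore early, then exploit the best-performing topic."""
    epsilon = max(0.05, 1.0 - step / (total_steps / 2))  # decaying exploration
    if random.random() < epsilon:
        return random.choice(TOPICS)  # the chaotic, unpredictable early feed
    # Exploit: serve whatever has the highest observed engagement rate.
    return max(TOPICS, key=lambda t: engaged[t] / shown[t] if shown[t] else 0)

TOTAL = 2000
for step in range(TOTAL):
    topic = pick_topic(step, TOTAL)
    shown[topic] += 1
    if random.random() < ENGAGEMENT_RATE[topic]:
        engaged[topic] += 1

for t in sorted(TOPICS, key=shown.get, reverse=True):
    print(f"{t:8s} served {shown[t]:4d} times ({100 * shown[t] / TOTAL:.0f}%)")
```

Run it and one topic ends up filling most of the feed. Nothing about that topic made it important; it simply satisfied the metric the system was told to pursue.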
And this is where the information overload shifts from a technical problem to a psychological one. The environment does not just present you with more information; it begins shaping your assumptions.
The concern with modern media shouldn’t focus only on the amount of information we’re consuming, but on its depth, and on whether it connects to action.
Before the telegraph, information usually arrived with the potential for action: news reached a community where decisions could be made to address it. The telegraph introduced the first flood of distant information disconnected from any possibility of action or resolution. It created a sense of urgency, but changed nothing about how you live. The feed didn’t invent this condition, but it scaled it dramatically.
Today, fragments of information arrive from everywhere, all at once: distant crises, trending debates, personal updates from strangers, insights detached from context. The question of what to do with all this information often has no answer.
If we’re not careful, information risks becoming something we only ever experience. We skim headlines, reactions, and ideas, but rarely stay long enough to find deeper meaning. We might see what is happening in the world, but we don’t always understand why it matters, or what it asks of us. Our awareness expands while our agency shrinks. And everything begins to feel urgent, but nothing feels complete. When our minds are flooded with signals that never transform into action, the content we’re consuming begins to lose meaning; we disengage.
In “Life & Fate”, Vasily Grossman wrote that the freedom of human life is inseparable from its uniqueness, the unrepeatable way our consciousness experiences the universe. Human experience is rooted in particularity, the unique way each person experiences the world. Human meaning emerges from what cannot be repeated.
Algorithms learn from millions of our interactions and amplify what scales, what holds attention in aggregate. In the process, they begin to flatten human difference, because in environments shaped by scale, particularity becomes harder to see. Subtle perceptions that don’t align with the dominant patterns struggle to surface. What is rare or deeply personal risks being buried beneath what performs well at scale.
The risk of algorithmic environments isn’t only distraction; it’s the disappearance of the singular human perspective inside systems optimizing for repetition. When an environment prioritizes scalable engagement, individuality gets treated as noise because it doesn’t fit easily into patterns. The feed becomes more than a filter; it shapes what feels real by deciding what becomes visible.
How do we see less but with more intention?
There is too much information to encounter the world without mediation. So, we rely on editors, teachers, friends, writers, institutions, and machines to help decide what information reaches us. Filters are a necessity, but they raise an important question: which filters should we trust, and why?
One response has been to place a personal algorithm over existing ones, so that it can summarize feeds or further customize them for you. This sounds promising, but adding another opaque system over an already opaque one doesn’t restore meaning. It risks compounding the same problems, resulting in more abstraction, more mediation, more distance between the person and the world they are trying to understand.
With the scale of information today, some form of curation is essential, and abandoning technology altogether isn’t an option. Too much of our lives flows through these systems, and they can be incredibly useful.
What if, instead of relying on opaque systems, we placed our trust back in people? Take the time to understand who is sharing the content: writers, artists, and researchers, the leading thinkers in their fields. This seems more sensible than letting opaque systems surface content only because it holds attention at scale.
Algoasis doesn’t claim to solve the problem of meaning, and it doesn’t try to outsmart the algorithm with another algorithm. It gives you the option to step outside the recommendation loop entirely, to turn off algorithmic suggestions and let information arrive through the people you’ve chosen deliberately.
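As a sketch of what that looks like in principle (an illustration of the general idea, not Algoasis’s actual implementation), a follow-only feed needs no ranking model at all: it is just the posts from the people you chose, merged newest-first. The data and names below are invented for the demo.

```python
# A minimal sketch of a follow-only feed: no ranking model, just the posts
# from people you chose, merged in reverse-chronological order.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    title: str
    published: datetime

def build_feed(followed: dict[str, list[Post]]) -> list[Post]:
    """Flatten every followed author's posts and sort newest-first.
    No engagement signal is consulted anywhere."""
    posts = [p for author_posts in followed.values() for p in author_posts]
    return sorted(posts, key=lambda p: p.published, reverse=True)

# Invented example data.
followed = {
    "essayist": [Post("essayist", "On attention", datetime(2024, 5, 2))],
    "researcher": [
        Post("researcher", "Field notes", datetime(2024, 5, 4)),
        Post("researcher", "A correction", datetime(2024, 4, 28)),
    ],
}

for post in build_feed(followed):
    print(f"{post.published:%Y-%m-%d}  {post.author}: {post.title}")
```

The design choice here is an absence: because no engagement signal is consulted, nothing can quietly narrow the feed around what performs.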
It’s a small but meaningful change, because depth rarely comes from seeing more. It comes from staying longer with fewer signals that matter.
Technology will keep evolving, and new tools will continue to promise clearer answers. But I believe we need to be more thoughtful about how we shape the environments that shape us. We need to be more willing to question the incentives behind what appears in front of us. And we need to be more intentional about where we place our trust.
If the modern feed encourages endless expansion, maybe the counter-move isn’t rejection but choosing particularity over pattern, and discovering what kind of world becomes visible when we do.