Power Reads

The Quiet Erasure of the Human Monopoly: Why Mo Gawdat’s Predictions Are Already Our Daily Reality

Mo Gawdat’s 2020 AI predictions are no longer forecasts—they are our lived reality. A sociological analysis of the shift toward an automated world.

The light from a smartphone screen catches the edge of a commuter’s glasses, a tiny, flickering reflection of a world being rewritten in real time. On a crowded morning train, the silence is heavy, punctuated only by the rhythmic, Pavlovian swipe of thumbs against glass. Each passenger is ensnared in a personalized loop, a digital feed that feels like a choice but is, in fact, a calculation. This mundane choreography—the tilt of the head, the glazed eyes, the reflexive scroll—is the visceral starting point for understanding a much larger, more systemic transformation.

Zooming out from this microscopic scene of urban alienation, we find ourselves at the center of a prophecy fulfilled. In 2020, Mo Gawdat, the former Chief Business Officer of Google’s moonshot factory, Google X, stepped away from the corporate engine to issue a series of warnings. At the time, his assertions felt like the stuff of speculative fiction, the kind of discourse reserved for late-night philosophy salons or high-concept tech summits. Today, as we navigate the landscape of 2026, Gawdat’s foresight has transitioned from provocative theory to our pervasive, everyday reality. He recently noted that three of his boldest predictions have already come to pass, shaping a world where the boundary between human agency and algorithmic governance has become increasingly opaque.

The Inevitable Current: Beyond the Point of No Return

Gawdat’s first prediction was centered on a singular, chilling word: inevitability. He argued that AI was not a trend one could opt out of, but a fundamental shift in the fabric of civilization. Historically, humanity has always viewed technology as a tool—a hammer, a steam engine, a computer—something that remains dormant until a human hand reaches for it. Paradoxically, AI has inverted this relationship. It is no longer a tool we use; it is an environment we inhabit.

In everyday terms, this inevitability is visible in the way we consume information. If you arrived at this article, or at a video queued up beside it, through a recommendation feed, you are already participating in a loop where the AI has predicted your curiosity. Gawdat describes this as an "arms race," a term that carries the weight of Cold War geopolitics but is now applied to the technical infrastructure of our lives. Corporations and nations are locked in a structural struggle where slowing down is equivalent to surrender. Consequently, we have reached a stage where the systems scale faster than our ability to manage them. We are no longer the smartest entities on the planet; we are the architects who have built a cathedral so complex we can no longer find the exit.

The Expertise Paradox: When Machines Reason

His second prediction focused on the threshold of intelligence. For decades, we comforted ourselves with the idea that AI was merely a sophisticated calculator, capable of pattern recognition but devoid of true reasoning. Gawdat pointed to AlphaGo Zero as the turning point—a system that learned entirely from self-play, without any human game data, and surpassed thousands of years of accumulated human strategic wisdom in a matter of weeks.

Linguistically speaking, the way we describe "intelligence" is undergoing a profound shift. We used to define expertise through the accumulation of technical knowledge and the ability to execute complex tasks. However, as AI models—loosely inspired by the neural architecture of the human brain—have begun to "reason" in ways that are increasingly indistinguishable from human logic, that definition is collapsing. They can compress years of research into moments, identifying medical breakthroughs or coding solutions that would take a human lifetime to conceive.

Through this lens, the human "habitus"—our ingrained skills and dispositions—is being marginalized. If a machine can outperform a lawyer in discovery, a doctor in diagnostics, or a programmer in syntax, what remains of our professional identity? In practice, the remaining advantage for humans is shifting toward the ephemeral qualities of judgment, ethics, and visceral connection. We are moving from a society of "knowers" to a society of "discerners," where the value lies not in the output itself, but in the wisdom to know what that output means for our collective future.

The Hall of Mirrors: The Erosion of Shared Reality

Perhaps the most unsettling of Gawdat’s predictions is the third: that things would go wrong, specifically regarding our grip on reality. We are currently witnessing an erasure of truth that feels both systemic and deeply personal. As AI-generated content becomes ubiquitous, our social media feeds have transformed into a hall of mirrors, reflecting and amplifying our biases until we can no longer recognize a shared objective world.

Culturally speaking, this has led to a state of what sociologist Zygmunt Bauman called "liquid modernity," where nothing is fixed and everything is subject to manipulation. When we can no longer trust the evidence of our eyes and ears—when a video of a world leader or a voice note from a loved one can be synthesized in seconds—the social contract begins to fray. This is not just a technical glitch; it is a sociological crisis. Without a shared reality, trust in institutions, media, and even personal relationships becomes fragmented.

Feature             | Human-Centric Reality (Pre-2020)     | Algorithmic Reality (Post-2024)
--------------------|--------------------------------------|---------------------------------------
Information Source  | Curated by editors and experts       | Generated by predictive models
Trust Mechanism     | Reputation and institutional backing | Engagement metrics and viral velocity
Social Structure    | Broad communities (the third place)  | Atomized echo chambers (the feed)
Truth Definition    | Verifiable, objective facts          | Resonant, personalized narratives

The Atomized Archipelago: Living in the Aftermath

Behind the scenes of this trend lies a deeper sociological phenomenon: the atomization of the individual. As AI takes over the mundane tasks of our lives, from scheduling our days to choosing our partners, we risk becoming an archipelago of isolated souls—living densely packed in modern cities but completely disconnected from a common narrative. Our everyday routines, once an anchor of stability, are now mediated by algorithms that prioritize efficiency over human serendipity.

Ultimately, the disruption Gawdat describes is not a failure of the technology, but a reflection of the context in which it is deployed. The danger is not the "smartness" of the machine, but the human behavior that drives its development: the greed for attention, the pursuit of surveillance, and the weaponization of misinformation. We are using a god-like technology to serve our most primitive impulses.

Reclaiming the Human Anchor

As we look toward the horizon, the challenge is not to stop the inevitable, but to consciously navigate the instability it creates. Gawdat’s reflections suggest that the ultimate outcome of this era will depend less on the code and more on the decisions we make as the code evolves. We are at a crossroads where we must rethink how we define work, value, and truth.

On an individual level, this requires a radical shift in perspective. We must learn to value the things that AI cannot replicate: the nuances of a shared silence, the messy complexity of human empathy, and the ability to act against our own data-driven interests for the sake of a higher principle. We need to reclaim our "third places"—those physical spaces of community that exist outside the digital feed—to ground ourselves in a reality that is visceral rather than virtual.

To put it another way, in a world where machines can generate infinite output, the most valuable thing we possess is our attention. Where we choose to place it, and how we choose to connect with one another in the gaps between the algorithms, will determine whether this new era is one of human obsolescence or a profound reset of what it means to be alive.

As you step away from this screen and back into the mundane flow of your day, take a moment to observe the world without the mediation of a lens. Notice the grain of the wood on a table, the specific pitch of a stranger's laugh, or the weight of your own breath. In these small, unquantifiable moments, we find the resilient core of our humanity—a territory the machines have yet to map.

Sources:

  • Gawdat, M. (2021). Scary Smart: The Future of Artificial Intelligence and How You Can Save Our World.
  • Business Insider, interview: "Ex-Google X exec Mo Gawdat on the 3 AI predictions that came true."
  • DeepMind, research blog: "AlphaGo Zero: Starting from scratch."
  • Bauman, Z. (2000). Liquid Modernity.
  • Oxford Internet Institute, "The Sociology of AI and Algorithmic Governance."

