Guest Author | June 24, 2025

The Cognitive Offloading Edition

On search engine AI, and the fear that it's affecting how we think.

Books explaining why books no longer matter come in many flavors.
via the New Yorker

Noah here. Researchers from MIT Media Lab just published "Your Brain on ChatGPT: Accumulation of Cognitive Debt." The key finding:

The LLM undeniably reduced the friction involved in answering participants' questions compared to the Search Engine. However, this convenience came at a cognitive cost, diminishing users' inclination to critically evaluate the LLM's output or "opinions" (probabilistic answers based on the training datasets). This highlights a concerning evolution of the 'echo chamber' effect: rather than disappearing, it has adapted to shape user exposure through algorithmically curated content.

The researchers called this "cognitive offloading." They monitored participants with EEG while they wrote essays and found that LLM users showed the weakest neural connectivity. Those participants also struggled to accurately quote their own work afterward.

Why is this interesting?

We've been here before. In Plato's Phaedrus, Socrates warns that writing will destroy memory: "This discovery of yours will create forgetfulness in the learners' souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves." Fast forward 2,400 years, and here we are again, wringing our hands about a new technology making us intellectually lazy.

This isn't to say Plato or the MIT researchers are wrong. Writing probably did reduce—or at least shift—our memory capacity. Calculators weakened mental math skills. GPS diminished our spatial reasoning. Google has changed how we store information. The evidence for cognitive offloading is real.

Marshall McLuhan, probably my favorite thinker on all things media and technological change, understood this—he argued that all media create "extensions" of ourselves while simultaneously causing "amputations" of the functions they replace.

But these are trade-offs. Writing may have weakened memory, but it has enabled the advancement of science, history, and literature. The printing press may have encouraged "shallow learning," but it sparked the Renaissance. Every tool that offloads cognition also extends our capabilities in new directions.

Adam Gopnik captured this perfectly in his 2011 New Yorker piece—the article I've probably shared with more people than any other over the past 15 years. He divided technology critics into Never-Betters (digital utopians), Better-Nevers (wishing it never happened), and Ever-Wasers, who "insist that at any moment in modernity something like this is going on, and that a new way of organizing data and connecting users is always thrilling to some and chilling to others—that something like this is going on is exactly what makes it a modern moment."

I'm with the Ever-Wasers. Every generation thinks its technology is uniquely destructive to human cognition. Yet here we are, still thinking.

I'll leave you with something Adam's sister, Alison Gopnik, wrote in the Wall Street Journal in 2022. A study in Psychological Science by Adam Smiley and Matthew Fisher found that our assessment of technologies reflects a basic piece of human psychology: status quo bias. The researchers asked over 2,000 people to rate the benefits or harms of different technologies. The crucial factor was whether a technology appeared before or after the participant turned two, around the age when we form our first lasting memories. People judged the technologies that appeared later in their lives to be more harmful.

People tend to prefer the familiar to the new and different. In other words, the day before you were born is Eden, and the day after your children are born is Mad Max. (NB)

© WITI Industries, LLC.