Attention to Intention Economies

#economics
#pontification

Where is the wisdom we have lost in knowledge?
Where is the knowledge we have lost in information?
T.S. Eliot, The Rock

The computer scientist, economist, and cognitive psychologist Herbert Simon said in a 1971 speech:

…in an information-rich world, the wealth of information means a dearth of something else: a scarcity of whatever it is that information consumes…hence a wealth of information creates a poverty of attention

If a wealth of information creates a poverty of attention, then what happens when we have a wealth of intelligence? Indulge me in a reductive thought: is intention the substrate of intelligence?

In the attention economy, attention is monetized. In a world where intention becomes the core resource, the ability to predict, infer, and make decisions on the consumer’s behalf will become the main source of value. TikTok launched the paradigm of pushing content to users rather than surfacing it in response to search queries - the flip side of an algorithm that lets newcomers be discovered and sidestep the follower-count barrier to entry à la Instagram. Similarly, other ways in which we devour content will also be chosen for us. AI-powered “search” like Perplexity, SearchGPT, and Google’s Gemini summaries is an example of this in action. Rather than presenting us with the raw results to choose from, these systems now decide on our behalf what information to keep and summarize, and what to throw away. This is the “information processing system” that Simon proposes:

An information-processing subsystem … will reduce the net demand on the rest of the organization’s attention only if it absorbs more information previously received by others than it produces - that is, if it listens and thinks more than it speaks.

The vast swaths of data swallowed by today’s LLMs, together with the recent push towards more reasoning (or the mimicry of it) in models like o1 and DeepSeek-R1, do the “listening” and “thinking”. If the output is smaller than the input, at least in volume, then the LLMs are acting as a compression algorithm for us. It’s probably a lossy one too, which raises the question of how much information is lost in the process.
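As a toy illustration (my own sketch, not anything Simon or the model vendors describe), consider a crude extractive “summarizer” that keeps only the first sentence. It behaves like a lossy compressor: the output is a fraction of the input, and the discarded sentences cannot be reconstructed from what remains.

```python
# Toy sketch: a one-sentence extractive "summary" as a lossy compressor.
# Hypothetical example only; real LLM summarization is far more sophisticated.

def summarize(text: str) -> str:
    """Keep only the first sentence; everything after it is thrown away."""
    return text.split(". ")[0] + "."

article = (
    "A wealth of information creates a poverty of attention. "
    "Simon proposed information-processing subsystems that listen more than they speak. "
    "AI-powered search now filters and compresses the web on our behalf."
)

summary = summarize(article)
ratio = len(summary) / len(article)

print(summary)
print(f"compression ratio: {ratio:.2f}")
# The dropped sentences are unrecoverable from the summary alone: the
# compression is lossy, and the reader never learns what was thrown away.
```

The point of the sketch is only the asymmetry Simon asks for: the subsystem absorbs more than it emits, and the cost of that is information you never get back.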

What does this imply? I guesstimate that the companies, technologies, and products that can infer the user’s underlying intention through behavior telemetry and other implicit signals - bypassing the need for explicit expression and thereby making decisions on the user’s behalf - are going to dominate. This extends beyond choices predicated on historical actions to predicting future interests and possibly pre-empting them. And in a world where AI is making decisions for us and acting on them, maybe that spawns AI-only micro-economies trading, buying, and selling on behalf of their humans, which in turn engender a market of infrastructure that facilitates this.
