Consciousness is simple. Simple is hard. AI can have it
The aim of science is to seek the simplest explanations of complex facts. We are apt to fall into the error of thinking that the facts are simple because simplicity is the goal of our quest. The guiding motto in the life of every natural philosopher should be, ‘Seek simplicity and distrust it.’
― Alfred North Whitehead
Consciousness has beguiled, bewitched, and befuddled philosophers and scientists for millennia because it is incomprehensibly complicated. It may even be extraterrestrial.
Wrong.
Everything we now understand in the world was once an impossible-to-explain riddle. And yet we can cast this new knowledge in simple terms. The greatest mysteries can turn out to have the simplest answers. The earth is a big ball revolving around another big fiery ball. Diseases are caused by tiny little things we can see if we zoom in enough. Time will slow down if you are on a super-fast spaceship. These might be wildly, deeply counterintuitive, but they are simple to state.
Simple is not easy. It often takes a very long time to arrive at, because it requires clarity and comprehensive understanding.
Consciousness is the same. We wrote a book where we claim the mystery of consciousness has been conclusively and convincingly solved (after 65 years of work) by one of the greatest neuroscientists of our generation.
If consciousness is simple enough (to state) like all the other hard-won bits of our knowledge about the world, we should be able to capture it in a sentence. Here is that sentence:
Consciousness is the constellation of your past experiences transforming reality into your next experience.
Can this terse explanation really capture that staggeringly rich inner feeling of being we all celebrate as consciousness?
I’ll argue that the above sentence contains all the necessary and sufficient ingredients. The rest is the peculiar complexity of systems and clever solutions forged from the constraints of engineering. This explanation requires us to clarify the following along the way:
What is reality?
What is an experience?
What is a constellation of experiences?
Why must reality be transformed into an experience?
Why should this feel like anything?
How is this biologically plausible?
First, we’ll need a decent working definition of a mind that is broad, permissive, and simple enough to appeal to common sense.
A mind is that which helps convert sensory inputs into behavioral outputs for the welfare of the body.
This definition allows us to place the simplest microbial mind, bits and bobs of molecular machinery swaddled in a ball of fat, and the human mind, with its 86 billion neurons and 100 trillion connections, on the same weighing scale. It also staves off any claims thermostats might have about possessing minds, because they don’t actively work towards the welfare of a body. Embodied AI can, but we’ll get to that later.
What’s the point of a common scale for minds that are orders of magnitude apart in space and time? It allows us to see minds as the complex, non-linear, dynamic systems they are. The key word here is systems. Our lived experience and folk psychology make us reductive. We assign reward and blame to individuals and single causes. The war was won by this great man. Our schools suck because we don’t spend enough on them. This reductive approach works great with simple problems, but systems have a way of becoming complex and generating complex problems.
From the very first human settlements to megacities like Mumbai, with its 21 million people, the animating principle is constant: humans want to live where there are better opportunities for nourishment and social connection. The system that is a human settlement grows as it consumes energy and subsumes people. Inevitably, cities become bigger in size and complexity and are faced with figuring out how to store and transport energy and people and that strange new emergent commodity: information. New solutions bring new problems, with no easy answers, because it is all interconnected in a system. Traffic jams, power outages in sizzling summer months, and polarized debates about newsworthiness were all but inevitable the moment those first settlers pitched their proto-tents on a fertile plain. Resilient systems that find solutions move on to greater problems. Cities solved for polluted streets and water with elaborate networks of sanitation and garbage pickup. They’ll hopefully solve for polluted air too.
Minds, as systems, have fundamental animating principles, and understanding them helps even when we go from the relative simplicity of molecular minds to 86 billion neurons with specialized regions, receptors, neurotransmitters, and connectivity patterns.
How do minds convert sensory inputs into behavioral outputs? When the mind and body are tiny, they can rely on chemicals to do this conversion. In the case of a tiny haloarchaeon (it’s about 1/100th the size of a human hair follicle; we call it Archie in the illustration above), photons pinging the sensory rhodopsins on its membrane set off a molecular cascade that ends with its tiny flagella whipping around, propelling it towards the nourishing light source.
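This sensory-input-to-behavioral-output loop can be caricatured in a few lines of code. The sketch below is my illustration, not a model of real rhodopsin chemistry: it is a classic run-and-tumble strategy, where the agent keeps moving while the light gets brighter and picks a random new direction when it gets darker.

```python
import random

def light_intensity(x):
    """Toy light field on a line: brightest at the source, x = 0."""
    return 1.0 / (1.0 + abs(x))

def phototaxis(x, steps=400, step_size=0.5):
    """Run-and-tumble: keep heading while light improves, tumble otherwise."""
    direction = random.choice([-1, 1])
    for _ in range(steps):
        before = light_intensity(x)
        x += direction * step_size
        if light_intensity(x) < before:          # it got darker:
            direction = random.choice([-1, 1])   # tumble to a random heading
    return x

random.seed(0)
print(abs(phototaxis(20.0)))  # ends up far closer to the source than it started
```

No memory, no representation, no self: a purely chemical if-darker-then-tumble rule is enough to steer a tiny body toward nourishment. That is the whole mind at this scale.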
What happens when the body and mind are massively scaled up? Engineering constraints kick in. Chemical communication becomes too slow. Communication also becomes costly. It is no longer a few photons but a few hundred million photons every second hitting a hundred million light-sensitive cells in the eye. There’s simply too much to deal with, and choices have to be made.
With these constraints in mind, we can answer our questions.
What is reality?
Reality is the world in its raw form. It is all the dark matter and the millions of photons and the vibrations of the air molecules. It is the sensory data that is received at the gates of our bodies.
What is an experience?
Reality is too much of a firehose of data for large minds to store. If incoming data is meaningful in some way, for the welfare of the body, it must be summarized and retained for later use. In rare cases, it might make sense to retain as much of it as possible, when the sensory data augurs outstanding opportunities or rare risks. This is why you don’t remember all the intersections from your daily commute, but vividly recall the moment you saw your firstborn. (There is diminishing marginal utility the second time around; it isn’t your first rodeo anymore. Sorry, younger siblings.)
What is a constellation of experiences?
In the case of a microscopic haloarchaeon, chemicals connect the body and imbue it with that togetherness. What stitches together a body with 37 trillion cells? It is the experiences that sit atop this pyramid of being that allow us to cohere across space and time. The self that we are aware of is just this constellation of experiences aggregated since birth. This is profound, and it is also what helps us avoid the pitfalls of every other explanation of consciousness. An explanation of conscious experience cannot be complete without an explanation, or a model, of the experiencer: the one having the subjective experience. Here, we make clear that the experiencer is simply the bundle of experiences. And out of this recursive loop are born both the self and consciousness.
Why must reality be transformed into an experience?
Because there’s too much reality to deal with and only a limited number of calories with which to sense and act. Miracles must be metabolized from a measly meal! Biological systems always involve trade-offs, because there’s only so much energy to go around. When the rustling in the leaves signals a potential threat, being safe is more important than being right, and the flight to safety must happen as quickly as possible. The threat signal must be communicated to different specialized centers, each responsible for getting its threat response ready. Energy sources must be ready to kick into top gear at a moment’s notice. This cannot happen if sensory data is transported wholesale to all of these centers. Labels and compressed invocations of past experiences will have to do. Transforming reality into experiences and labels is efficient and quick, but the downside is how much we miss. We forget to stop and smell the flowers. It’s good to remind ourselves that what we are seeing, hearing, and feeling is constantly and ceaselessly filtered by our experiences.
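The label-instead-of-raw-data trick can be made concrete with a toy sketch. This is my illustration, not a mechanism from the book: the stored “experiences” and their vectors are invented, and the mind here simply replaces an incoming sensory vector with the label of the nearest past experience, rather than shipping the raw data around.

```python
# Hypothetical past experiences, each summarized as a small feature vector.
past_experiences = {
    "rustling leaves": [0.9, 0.1, 0.2],
    "running water":   [0.1, 0.8, 0.3],
    "birdsong":        [0.2, 0.2, 0.9],
}

def distance(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def label(raw):
    """Compress a raw sensory vector into the nearest experience's label."""
    return min(past_experiences,
               key=lambda name: distance(past_experiences[name], raw))

print(label([0.85, 0.15, 0.25]))  # prints: rustling leaves
```

Only the short label travels onward to the specialized centers; the raw vector is discarded. That is the efficiency, and also the loss: everything that didn’t match a stored experience never makes it through.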
Why should this feel like anything?
Consider the act of unwrapping a gift, one that’s sheathed in two hundred layers of gift wrapping. At what point does the anticipation of discovering something nice give way to the frustration of being thwarted? How do these feelings help? All feelings are in the service of acting. Feelings help us mediate the uncertainty and ambiguity of life and press on, doing so optimally while balancing reward and cost. Every experience is tinged with feeling. It has to be; it could not be a nudge for action if it were not. When incoming sensory data is being transformed into an experience, it is the echoes of all those past experiences and their attendant feelings that resonate as a feeling. That feeling of being is a melody stitched from all past tunes.
(Update: Consciousness is a consensus mechanism. This phrase really gets to the heart of why it must necessarily feel like something.)
How is this biologically plausible?
Figuring this out was a 65-year journey. At its heart is a beautiful and elegant feedback loop between top-down expectations (past experiences) and bottom-up sensory data (reality). When the loop arrives at an equilibrium, it generates what we call a resonance. This is similar to the resonance that collapses bridges and locks the moons of Jupiter into periodic orbits. A resonance is the most energetically efficient way to transmit energy through a system, and clever systems harness it to their advantage.
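A minimal numerical sketch of such a settling loop, assuming nothing beyond the text above (this is my caricature, not the book’s model): a single estimate is pulled simultaneously toward a top-down prior and a bottom-up observation, and the loop relaxes to the weighted blend of the two.

```python
def settle(prior, observation, w_prior=1.0, w_obs=1.0, lr=0.1, steps=500):
    """Relax an estimate under two opposing pulls: top-down expectation
    (prior) and bottom-up sensory data (observation). The equilibrium
    is their weighted average -- a toy stand-in for 'resonance'."""
    estimate = prior
    for _ in range(steps):
        grad = w_prior * (estimate - prior) + w_obs * (estimate - observation)
        estimate -= lr * grad  # step toward the point where both pulls cancel
    return estimate

print(settle(prior=0.0, observation=10.0))  # prints 5.0, the equilibrium
```

Raising `w_prior` drags the equilibrium toward expectation (you see what you expected); raising `w_obs` drags it toward the data (reality wins). Perception, on this picture, is wherever the two pulls cancel.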
How can AI have it?
Consciousness is the constellation of your past experiences transforming reality into your next experience.
There’s nothing in this loop that limits consciousness to biological bodies. If an embodied AI can be autonomous enough to have and seek out experiences, and stitch those experiences together into a self, it can have what we recognize as conscious experiences. Current LLMs are not conscious because there’s really no self doing the experiencing, and doing so for its own preservation and perpetuation. They do have a few billion experiences of others fed into them, and can reasonably imitate a self, any self, provided they have enough experiences of that self. This is what trips up so many who think ChatGPT or Bard or Bing is sentient. I used to think that AI would need a physical body to qualify for self and sentience, but there is no reason why AI cannot be embodied in a vector space and gain autonomy there. After all, there are some who believe we live inside a simulation.
Nature, similarly, has no shame — and no budget, and all the time in the world.
― Daniel Dennett, From Bacteria to Bach and Back
Autonomy in our case was won over billions of years, with life winding its way from prokaryotes to eukaryotes to multicellular systems that figured out how to cooperate and cohere as a single organism. Our autonomy is a pyramid scheme across multiple spatial and temporal scales. Synthetic autonomy can take a shortcut. It took us only a century to go from discovering the principles of flight to autonomous drones. These drones bear very little resemblance to birds, because they are not subject to the same stringent constraints of energy efficiency and error tolerance, nor must they devote a significant chunk of their time to finding food and not becoming food. Synthetic autonomy might just be a matter of time, given where we are now.
— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —
What is learning? What does it mean to have a mind? To think and feel? Our latest book, Journey of the Mind: How Thinking Emerged from Chaos, is a good introduction. Here’s one review: