What Is It Like to Be a Thermostat?

An Excerpt From The Conscious Mind, By David Chalmers

Let us consider an information-processing system that is almost maximally simple: a thermostat. Considered as an information-processing device, a thermostat has just three information states (one state leads to cooling, another to heating, and another to no action). So the claim is that to each of these information states, there corresponds a phenomenal state. These three phenomenal states will all be different, and changing the information state will change the phenomenal state. We might ask: What is the character of these phenomenal states? That is, what is it like to be a thermostat?

Certainly it will not be very interesting to be a thermostat. The information processing is so simple that we should expect the corresponding phenomenal states to be equally simple. There will be three primitively different phenomenal states, with no further structure. Perhaps we can think of these states by analogy to our experiences of black, white, and gray: a thermostat can have an all-black phenomenal field, an all-white field, or an all-gray field. But even this is to impute far too much structure to the thermostat’s experiences, by suggesting the dimensionality of a visual field, and the relatively rich natures of black, white, and gray. We should really expect something much simpler, for which there is no analog in our experience. We will likely be unable to sympathetically imagine these experiences any better than a blind person can imagine sight, or than a human can imagine what it is like to be a bat; but we can at least intellectually know something about their basic structure.

To make the view seem less crazy, we can think about what might happen to experience as we move down the scale of complexity. We start with the familiar cases of humans, in which very complex information-processing gives rise to our familiar complex experiences. Moving to less complex systems, there does not seem much reason to doubt that dogs are conscious, or even that mice are. Some people have questioned this, but I think this is often due to a conflation of phenomenal consciousness and self-consciousness. Mice may not have much of a sense of self, and may not be given to introspection, but it seems entirely plausible that there is something it is like to be a mouse. Mice perceive their environment via patterns of information flow not unlike those in our own brains, though considerably less complex. The natural hypothesis is that corresponding to the mouse’s “perceptual manifold,” which we know it has, there is a “phenomenal manifold.” The mouse’s perceptual manifold is quite rich – a mouse can make many perceptual distinctions – so its phenomenal manifold might also be quite rich. For example, it is plausible that for each distinction that the mouse’s visual system can make and use in perceiving the environment, there corresponds a phenomenal distinction. One cannot prove that this is the case, but it seems to be the most natural way to think about the phenomenology of a mouse.

Moving down the scale through lizards and fish to slugs, similar considerations apply. There does not seem to be much reason to suppose that phenomenology should wink out while a reasonably complex perceptual psychology persists… As we move along the scale from fish and slugs through simple neural networks all the way to thermostats, where should consciousness wink out? The phenomenology of fish and slugs will likely not be primitive but relatively complex, reflecting the various distinctions they can make. Before phenomenology winks out altogether, we presumably will get to some sort of maximally simple phenomenology. It seems to me that the most natural place for this to occur is in a system with a correspondingly simple “perceptual psychology,” such as a thermostat. The thermostat seems to realize the sort of information processing in a fish or a slug stripped down to its simplest form, so perhaps it might also have the corresponding sort of phenomenology in its most stripped-down form. It makes one or two relevant distinctions on which action depends; to me, at least, it does not seem unreasonable that there might be associated distinctions in experience…

… Some intuitive resistance may come from the fact that there does not seem to be room in a thermostat for someone or something to have the experiences: where in the thermostat can a subject fit? But we should not be looking for a homunculus in physical systems to serve as a subject. The subject is the whole system, or better, is associated with the system in the way that a subject is associated with a brain. The right way to speak about this is tricky. We would not say that my brain has these experiences, strictly speaking, but that I have experiences. However we make sense of this relation, the same will apply to thermostats: strictly speaking it is probably best not to say that the thermostat has the experiences (although I will continue to say this when talking loosely), but that the experiences are associated with the thermostat. We will not find a subject “inside” the thermostat any more than we will find a subject inside a brain…

… A final consideration in favor of simple systems having experience: if experience is truly a fundamental property, it seems natural for it to be widespread. Certainly all the other fundamental properties that we know about occur even in simple systems, and throughout the universe. It would be odd for a fundamental property to be instantiated for the first time only relatively late in the history of the universe, and even then only in occasional complex systems. There is no contradiction in the idea that a fundamental property should be instantiated only occasionally; but the alternative seems more plausible, if other things are equal. If there is experience associated with thermostats, there is probably experience everywhere: wherever there is a causal interaction, there is information, and wherever there is information, there is experience…

David Chalmers is an Australian philosopher specializing in the area of philosophy of mind and philosophy of language. He is Professor of Philosophy and Director of the Centre for Consciousness at the Australian National University. He is also Professor of Philosophy at New York University. In 2013, he was elected a Fellow of the American Academy of Arts & Sciences.

Click here to see Chalmers’ TED Talk

Click here to purchase The Conscious Mind

*All of the excerpts on my blog are from books that have stayed with me for some reason: because the concept was awe-inspiring, changed how I view the world, was beautifully expressed, or all three. I personally curate all of the book excerpts, and I always obtain the author’s final approval before posting their work on my blog.