What is consciousness? Is the mind the brain? Could a machine think? Five centuries of arguments, with new urgency in the age of large neural networks.
Consider what is true of you in the next ten seconds. There are facts about your body — neurons firing, blood flowing, oxygen exchanging. There are also facts about your mind — what you are thinking, hoping, looking at, what red looks like to you right now. How do these two sets of facts relate?
Mental states just are physical states of the brain. The mind is the brain in operation. Materialism, monism, identity theory.
Mind and body are different substances or properties. Descartes; modern property dualism. The trouble comes at the joint where they interact.
Functionalism, eliminativism, neutral monism, panpsychism. Each says the dichotomy is wrong, in a different direction.
This deck walks the main positions in roughly chronological order — Descartes, behaviourism, identity theory, functionalism — and ends at the two great thought experiments that haunt the field: Mary's Room and the Chinese Room.
René Descartes (1596–1650), in the Meditations on First Philosophy (1641), pushes doubt as far as it will go. The senses can be deceived; dreams are indistinguishable from waking; perhaps an "evil demon" deceives me about everything. But:
"I am, I exist — that is certain. But how often? Just when I think; for it might possibly be the case that if I entirely ceased to think, I should at the same time altogether cease to be." (Meditation II)
From this cogito, Descartes argues that he can clearly conceive of mind without body: therefore mind and body are different substances. Mind is res cogitans (thinking thing); body is res extensa (extended thing). They differ in essence: extension belongs to body, thought to mind.
If mind and body are different substances, how do they interact? My decision to raise my arm seems to cause my arm to rise. Princess Elisabeth of Bohemia, in correspondence with Descartes (1643), pressed this objection. He gestured at the pineal gland; the answer is unsatisfactory.
Early twentieth century. The dualist's "inner theatre" cannot be observed; psychologists (Watson, Skinner) and philosophers (Ryle, Carnap) propose to dispense with it. Mental terms refer to behaviour, or to dispositions to behave.
Gilbert Ryle's The Concept of Mind (1949) diagnoses the "ghost in the machine" — Descartes's two-substance picture — as a category mistake. To say "Smith is intelligent" is not to refer to a hidden inner state; it is to say that Smith is disposed to behave intelligently in the relevant circumstances.
Behaviourism is mostly dead as a philosophical view. But Ryle's diagnosis stuck. Many later philosophers of mind agree we should be deflationary about the "inner theatre" while still admitting more than dispositions. The challenge is how.
1950s Australia. U. T. Place's "Is Consciousness a Brain Process?" (1956) and J. J. C. Smart's "Sensations and Brain Processes" (1959) propose: mental states just are brain states. Pain is C-fibres firing. The "is" is the same "is" as in "lightning is electrical discharge" or "water is H₂O."
Consider pain. It seems possible that a human, an octopus, and a hypothetical silicon Martian could all be in pain — but their underlying physical realisations differ entirely. If pain = C-fibre firing, then the octopus, lacking C-fibres, can't be in pain. That's wrong.
This is Hilary Putnam's multiple-realisability objection: mental kinds are realised by many different physical kinds, so mental kinds are not identical to any specific physical kind. We need a different theory.
Putnam's reply to his own objection. What makes a mental state the kind it is is not its physical composition but its functional role — its causal connections to inputs, outputs, and other mental states.
Pain is whatever is caused by tissue damage, causes wincing-behaviour and the desire that it stop, and is integrated with my other beliefs and desires in the right way. C-fibres play that role in humans; some other mechanism plays it in the octopus; in principle, software could play it.
Mind is to brain as software is to hardware. The same program can run on different hardware. Mental states are software states. This metaphor became cognitive science.
Pain as a node in a causal graph.
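The functionalist picture above can be sketched in code. This is a toy illustration, not anything from the literature; the class and method names (`PainRealiser`, `register_damage`, and so on) are invented for the example. One causal role, specified as an interface; two entirely different "substrates" that realise it.

```python
from abc import ABC, abstractmethod

class PainRealiser(ABC):
    """A toy functional role for pain: caused by tissue damage,
    causing avoidance behaviour."""

    @abstractmethod
    def register_damage(self, intensity: float) -> None: ...

    @abstractmethod
    def output_behaviour(self) -> str: ...

class HumanCFibres(PainRealiser):
    """One physical realisation: a firing rate."""
    def __init__(self) -> None:
        self.firing_rate = 0.0

    def register_damage(self, intensity: float) -> None:
        self.firing_rate = intensity           # C-fibres fire

    def output_behaviour(self) -> str:
        return "wince" if self.firing_rate > 0.5 else "carry on"

class OctopusNociceptors(PainRealiser):
    """A different realisation entirely: accumulated signals."""
    def __init__(self) -> None:
        self.signals: list[float] = []

    def register_damage(self, intensity: float) -> None:
        self.signals.append(intensity)         # no C-fibres involved

    def output_behaviour(self) -> str:
        return "wince" if sum(self.signals) > 0.5 else "carry on"

# Same functional role, different physical realisation:
for creature in (HumanCFibres(), OctopusNociceptors()):
    creature.register_damage(0.9)
    print(creature.output_behaviour())  # both print "wince"
```

For the functionalist, being in pain just is occupying the role the interface specifies; which class implements it is, on this view, irrelevant to the psychology.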
Functionalism captures structure. Does it capture feeling? "Qualia" is the technical term for the felt qualities of experience — the redness of red, the painfulness of pain, the salty taste of seawater. Two famous arguments say no.
Thomas Nagel's "What Is It Like to Be a Bat?" (1974): however completely we describe the bat's neurology and echolocation behaviour, we will not have captured what it is like for the bat to perceive by sonar. Subjective character — the "what it's like" — eludes objective description. There is something it is like to be a bat; there is something it is like to be you; physical theory leaves this out.
Frank Jackson's knowledge argument ("Epiphenomenal Qualia", 1982): Mary is a brilliant neuroscientist confined since birth to a black-and-white room. She learns, through black-and-white textbooks and screens, every physical fact about colour vision. One day she leaves the room and sees a ripe tomato.
Question: does Mary learn something new on leaving the room?
Most people answer yes — she learns what red looks like. If she knew every physical fact, but learns something new, then there are non-physical facts. Therefore physicalism is false.
John Searle, "Minds, Brains, and Programs," Behavioral and Brain Sciences, 1980 — the most-discussed argument in philosophy of mind of the last fifty years. Searle imagines himself locked in a room with an English rule book for manipulating Chinese symbols. Questions in Chinese are passed in; by following the rules, he passes well-formed Chinese answers out. To those outside, the room understands Chinese; Searle, inside, understands nothing. Syntax, he argues, is not sufficient for semantics.
The systems reply: maybe Searle doesn't understand Chinese — but the whole system (Searle + rule book + papers + room) does. Searle's counter-reply: imagine he memorises the rule book and works outdoors. Now he is the system, and still doesn't understand Chinese. The argument continues.
David Chalmers, in "Facing Up to the Problem of Consciousness" (1995) and The Conscious Mind (1996), draws a distinction.
The easy problems: discrimination, integration of information, reportability, attention, the difference between waking and sleep. "Easy" not because they are easy in fact — they are extremely hard — but because we know what kind of explanation would settle them: a story in terms of information processing in the brain.
The hard problem: why does any of this information processing feel like anything from the inside? You could in principle have all the cognitive function without any subjective experience — a "philosophical zombie." That seems conceivable. So function does not entail experience. So function does not explain it.
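The zombie argument is often set out as explicit premises. A standard reconstruction (following Chalmers's presentation, with $P$ the complete physical truth about the world and $Q$ a truth about consciousness):

```latex
\begin{enumerate}
  \item $P \wedge \neg Q$ is conceivable (a zombie world is coherently imaginable).
  \item If $P \wedge \neg Q$ is conceivable, then $P \wedge \neg Q$ is metaphysically possible.
  \item If $P \wedge \neg Q$ is possible, then consciousness does not supervene on the physical.
  \item Physicalism requires that consciousness supervene on the physical.
  \item Therefore, physicalism is false.
\end{enumerate}
```

Most physicalist responses target premise 1 (zombies are not genuinely conceivable) or premise 2 (conceivability does not entail possibility).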
| Position | Held by | What it says |
|---|---|---|
| Reductive physicalism | Dennett | The "hard problem" is illusory. Consciousness is the easy problems all the way down. |
| Property dualism | Chalmers | Phenomenal properties are fundamental, alongside physical ones, with bridging laws. |
| Panpsychism | Goff, Strawson | Some primitive form of experience is everywhere; complex experience emerges from combinations. |
| Higher-order theories | Rosenthal | A mental state is conscious when there is a higher-order representation of it. |
| Global workspace | Baars, Dehaene | Consciousness is information made widely available across the brain's processing modules. |
| Integrated Information Theory | Tononi | Consciousness is integrated information (Φ); has both panpsychist and quantitative flavours. |
The human brain has roughly 86 billion neurons, each with thousands of synaptic connections. Whatever the right philosophical theory of mind, it has to make sense, eventually, of this. This is the scale at which philosophical commitments meet empirical detail, and the easy problems start being hard.
| Author | Work | Year | Position |
|---|---|---|---|
| Descartes | Meditations on First Philosophy | 1641 | substance dualism |
| La Mettrie | L'Homme Machine | 1747 | materialism |
| Ryle | The Concept of Mind | 1949 | logical behaviourism |
| Place | "Is Consciousness a Brain Process?" | 1956 | identity theory |
| Smart | "Sensations and Brain Processes" | 1959 | identity theory |
| Putnam | "Psychological Predicates" | 1967 | functionalism, multiple realisability |
| Nagel | "What Is It Like to Be a Bat?" | 1974 | against reduction |
| Searle | "Minds, Brains, and Programs" | 1980 | Chinese Room |
| Jackson | "Epiphenomenal Qualia" | 1982 | Mary's Room |
| Dennett | Consciousness Explained | 1991 | multiple drafts |
| Chalmers | The Conscious Mind | 1996 | property dualism, hard problem |
| Clark & Chalmers | "The Extended Mind" | 1998 | cognition beyond the skull |
| Tononi | "An Information Integration Theory of Consciousness" | 2004 | IIT |
| Goff | Galileo's Error | 2019 | panpsychism, popular |
Large language models can hold a conversation, write a sonnet, pass a bar exam, and discuss the Chinese Room. Are they conscious?
Functionalism, taken seriously, says any system with the right causal-functional organisation has the corresponding mental states. If a model has a representation of itself, of its task, of the user, integrated into ongoing reasoning — why exclude it from the mental?
Neural networks are inspired by brains; they form distributed representations; they generalise. The differences from biological brains are differences of substrate, training regime, and embodiment, not (necessarily) differences of kind.
Searle's argument generalises. A transformer manipulates tokens by gradient-trained statistics; nothing in that process intrinsically grounds meaning, intention, or experience. The behavioural appearance of understanding is not understanding.
Humans are embodied, evolved, tied to the world by sensorimotor loops millions of years deep. Whatever consciousness depends on, it may depend on these features that current models lack.
Most working philosophers of mind treat the question as genuinely open and likely to be resolved (if at all) by progress in the science of consciousness rather than by introspection from the armchair. The questions Chalmers asked in 1996 are now also engineering questions.
Closer to Truth (Robert Lawrence Kuhn) has interviewed every major contemporary philosopher of mind. Wireless Philosophy has clear introductory videos on Mary's Room, the Chinese Room, and functionalism.
Watch · Wireless Philosophy · Chinese Room
Watch · David Chalmers · Hard Problem TED
Watch · BBC In Our Time · Consciousness
Qualia: the intrinsic, felt qualities of experience — the redness of red, the painfulness of pain.
Intentionality: the "aboutness" of mental states — beliefs are about things, desires are for things.
Supervenience: a relation in which the mental cannot differ without the physical also differing.
Type-identity: every kind of mental state = some kind of physical state. Token-identity: every individual mental event = some individual physical event.
Property dualism: one substance, two kinds of properties — physical and phenomenal.
Eliminativism: folk-psychological terms (belief, desire) refer to nothing real; they will be replaced as neuroscience matures.
Philosophical zombie: a thought-experimental being physically identical to a conscious being but with no inner experience.
The hard problem: why is there subjective experience at all? Why isn't all this processing "in the dark"?
The extended mind: the thesis (Clark & Chalmers, 1998) that cognitive processes can extend beyond the skull — your notebook, your phone — under the right conditions.
An organism has conscious mental states if and only if there is something that it is like to be that organism — something it is like for the organism.
— Thomas Nagel, "What Is It Like to Be a Bat?" (1974)