07.01 / 15
DECK 07 · COGNITIVE ARCHITECTURES

Philosophy
of Mind

What is consciousness? Is the mind the brain? Could a machine think? Four centuries of arguments, with new urgency in the age of large neural networks.

[Diagram: BRAIN STATES ↔ MENTAL STATES (beliefs · desires · qualia); which causes which?]
07.02 / 15

// 02 The Mind–Body Problem

Consider what is true of you in the next ten seconds. There are facts about your body — neurons firing, blood flowing, oxygen exchanging. There are also facts about your mind — what you are thinking, hoping, looking at, what red looks like to you right now. How do these two sets of facts relate?

OPTION A

Same thing

Mental states just are physical states of the brain. The mind is the brain in operation. Materialism, monism, identity theory.

OPTION B

Two things

Mind and body are different substances or properties. Descartes; modern property dualism. Trouble comes at the joint where they interact.

OPTION C

Neither, exactly

Functionalism, eliminativism, neutral monism, panpsychism. Each says the dichotomy is wrong, in a different direction.

This deck walks the main positions in roughly chronological order — Descartes, behaviourism, identity theory, functionalism — and ends at the two great thought experiments that haunt the field: Mary's Room and the Chinese Room.

07.03 / 15

// 03 Cartesian Dualism

René Descartes (1596–1650), in the Meditations on First Philosophy (1641), pushes doubt as far as it will go. The senses can be deceived; dreams are indistinguishable from waking; perhaps an "evil demon" deceives me about everything. But:

"I am, I exist — that is certain. But how often? Just when I think; for it might possibly be the case that if I entirely ceased to think, I should at the same time altogether cease to be." (Meditation II)

From this cogito, Descartes argues that he can clearly conceive of mind without body: therefore mind and body are different substances. Mind is res cogitans (thinking thing); body is res extensa (extended thing). They differ in essence: extension belongs to body, thought to mind.

The interaction problem

If mind and body are different substances, how do they interact? My decision to raise my arm seems to cause my arm to rise. Princess Elisabeth of Bohemia, in correspondence with Descartes (1643), pressed this objection. He gestured at the pineal gland; the answer is unsatisfactory.

Cartesian schema

[Diagram: RES COGITANS (thinking · willing · feeling; unextended) ↔ RES EXTENSA (size · shape · motion; extended), joined by a "?" at the PINEAL GLAND, supposed locus of interaction]
07.04 / 15

// 04 Behaviourism

Early twentieth century. The dualist's "inner theatre" cannot be observed; psychologists (Watson, Skinner) and philosophers (Ryle, Carnap) propose to dispense with it. Mental terms refer to behaviour, or to dispositions to behave.

Logical behaviourism — Ryle, 1949

Gilbert Ryle's The Concept of Mind diagnoses the "ghost in the machine" — Descartes's two-substance picture — as a category mistake. To say "Smith is intelligent" is not to refer to a hidden inner state; it is to say that Smith is disposed to behave intelligently in the relevant circumstances.

The standard objection

  1. If pain just is pain-behaviour, then beings that don't behave can't be in pain.
  2. But Spartans are trained not to wince when in pain.
  3. And paralysed people in pain can't move.
  4. So pain cannot just be pain-behaviour. There is something it is like to feel pain over and above the disposition to wince.

What survives

Behaviourism is mostly dead as a philosophical view. But Ryle's diagnosis stuck. Many later philosophers of mind agree we should be deflationary about the "inner theatre" while still admitting more than dispositions. The challenge is how.

[Diagram: INPUT stimulus (pinprick) → OUTPUT behaviour (wincing); no inner state]
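The stimulus–response picture can be put in a few lines of code: a stateless mapping from input to behavioural disposition, with nothing stored "inside". This is an illustrative toy; the function name and table entries are invented here, not part of any behaviourist text.

```python
def behaviourist_pain(stimulus: str) -> str:
    # Logical behaviourism: a mental term names a disposition to
    # behave, not an inner state; hence a pure stateless function.
    dispositions = {"pinprick": "wince", "burn": "withdraw hand"}
    return dispositions.get(stimulus, "no response")
```

The standard objection above is precisely that this mapping can return "no response" (the Spartan, the paralysed patient) while the pain is nonetheless real.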
07.05 / 15

// 05 The Identity Theory

1950s Australia. J. J. C. Smart's "Sensations and Brain Processes" (1959) and U. T. Place's "Is Consciousness a Brain Process?" (1956) propose: mental states just are brain states. Pain is C-fibres firing. The "is" is the same "is" as in "lightning is electrical discharge" or "water is H₂O."

The argument

  1. Every mental event is correlated, lawfully, with a brain event.
  2. The simplest explanation of perfect correlation is identity.
  3. Nothing in our concept of mental states rules out their being brain states.
  4. Mental states are brain states. Type-type identity.

Multiple realisability — Putnam, 1967

Consider pain. It seems possible that a human, an octopus, and a hypothetical silicon Martian could all be in pain — but their underlying physical realisations differ entirely. If pain = C-fibre firing, then the octopus, lacking C-fibres, can't be in pain. That's wrong.

The objection is that mental kinds are realised by many different physical kinds. So mental kinds are not identical to any specific physical kind. We need a different theory.

07.06 / 15

// 06 Functionalism

Putnam's reply to his own objection. What makes a mental state the kind it is is not its physical composition but its functional role — its causal connections to inputs, outputs, and other mental states.

Pain is whatever is caused by tissue damage, causes wincing-behaviour and the desire that it stop, and is integrated with my other beliefs and desires in the right way. C-fibres play that role in humans; some other mechanism plays it in the octopus; in principle, software could play it.

Computer analogy

Mind is to brain as software is to hardware. The same program can run on different hardware. Mental states are software states. This metaphor became cognitive science.

Argument for

  1. Mental states are individuated by their causal-functional role.
  2. The same role can be realised in many physical media.
  3. So mental states are multiply realisable.
  4. Whatever realises the right functional organisation has the corresponding mental state.
[Diagram: tissue damage → PAIN (functional state) → wincing, desire that it stop, other beliefs and desires]

Pain as a node in a causal graph.
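The computer analogy can be sketched in code: one functional role, many realisations. A minimal illustration only, with invented names (PainRole, HumanCFibres, OctopusNociceptors); it shows multiple realisability, not a theory of pain.

```python
from abc import ABC, abstractmethod

class PainRole(ABC):
    """The functional role: caused by tissue damage, causes avoidance.
    Functionalism individuates pain by this role, not by composition."""
    @abstractmethod
    def register_damage(self) -> str: ...

class HumanCFibres(PainRole):
    def register_damage(self) -> str:
        return "wince"      # the human realisation of the role

class OctopusNociceptors(PainRole):
    def register_damage(self) -> str:
        return "jet away"   # a different substrate playing the same role

def respond_to_damage(system: PainRole) -> str:
    # Only the interface matters: any realiser of the role will do.
    return system.register_damage()
```

On this picture, whether the octopus is in pain is settled by the role it instantiates, which is exactly what a type-identity theorist denies.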

07.07 / 15

// 07 Qualia & Mary's Room

Functionalism captures structure. Does it capture feeling? "Qualia" is the technical term for the felt qualities of experience — the redness of red, the painfulness of pain, the salty taste of seawater. Two famous arguments say no.

Nagel — "What Is It Like to Be a Bat?" (1974)

However completely we describe the bat's neurology and echolocation behaviour, we will not have captured what it is like for the bat to perceive by sonar. Subjective character — the "what it's like" — eludes objective description. There is something it is like to be a bat; there is something it is like to be you; physical theory leaves this out.

Jackson — "Epiphenomenal Qualia" (1982)

Mary is a brilliant neuroscientist confined since birth to a black-and-white room. She learns, through black-and-white textbooks and screens, every physical fact about colour vision. One day she leaves the room and sees a ripe tomato.

Question: does Mary learn something new on leaving the room?

Most people answer yes — she learns what red looks like. If she knew every physical fact, but learns something new, then there are non-physical facts. Therefore physicalism is false.

Replies

  1. Ability hypothesis (Lewis, Nemirow): Mary gains an ability — to recognise red, imagine it — not a new fact.
  2. Old fact, new mode of presentation: she already knew the fact; she now knows it in a different way.
  3. Phenomenal concepts strategy: there is a special class of concepts (phenomenal) that necessarily involve undergoing experience. New concept, same physical fact.
  4. Bite the bullet (Chalmers): Mary does learn something new; physicalism is false; there is a "hard problem" of consciousness.
07.08 / 15

// 08 The Chinese Room

John Searle, "Minds, Brains, and Programs," Behavioral and Brain Sciences, 1980. The most-discussed argument in philosophy of mind of the last fifty years.

[Diagram: THE ROOM. Searle, with a rule book, takes Chinese input ("What's your favourite colour?") through one slot and passes Chinese output ("红色 (red)") out the other. Searle inside knows no Chinese, yet the room appears, from outside, to understand.]

The argument

  1. Searle is locked in a room with rule books in English. Through one slot he receives Chinese characters; he applies the rules; he passes Chinese characters out the other slot.
  2. To outside Chinese speakers, the responses are indistinguishable from those of a native speaker.
  3. Searle does not understand a word of Chinese. He is just shuffling symbols according to syntactic rules.
  4. The room is doing exactly what a digital computer running a program does: manipulating syntax.
  5. Running a program is not sufficient for genuine understanding. Strong AI is false.
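Steps 1–4 can be caricatured in a few lines: a purely syntactic lookup table mapping input strings to output strings. The table entries are invented for this sketch; the point is that fluent-looking output requires no access to meaning.

```python
# A toy "rule book": match the shape of the input, emit the paired output.
RULE_BOOK = {
    "你好吗？": "我很好，谢谢。",      # "How are you?" -> "Fine, thanks."
    "你最喜欢什么颜色？": "红色。",    # "Favourite colour?" -> "Red."
}
FALLBACK = "对不起，我不明白。"        # "Sorry, I don't understand."

def chinese_room(symbols: str) -> str:
    # Pure symbol manipulation: string equality, nothing semantic.
    return RULE_BOOK.get(symbols, FALLBACK)
```

Searle's claim is that scaling this table up, or replacing it with a trained network, changes the quantity of syntax, not its kind.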

Replies — most famously, the systems reply

Maybe Searle doesn't understand Chinese — but the whole system (Searle + rule book + papers + room) does. Searle's counter-reply: imagine he memorises the rule book and works outdoors. Now he is the system, and still doesn't understand Chinese. The argument continues.

07.09 / 15

// 09 The Hard Problem

David Chalmers, in "Facing Up to the Problem of Consciousness" (1995) and The Conscious Mind (1996), draws a distinction.

EASY PROBLEMS

Cognitive function

Discrimination, integration of information, reportability, attention, the difference between waking and sleep. "Easy" not because they are easy in fact — they are extremely hard — but because we know what kind of explanation would settle them: a story in terms of information processing in the brain.

HARD PROBLEM

Why is there experience at all?

Why does any of this information processing feel like anything from the inside? You could in principle have all the cognitive function without any subjective experience — a "philosophical zombie." That seems conceivable. So function does not entail experience. So function does not explain it.

Six live positions

Position | Held by | What it says
Reductive physicalism | Dennett | The "hard problem" is illusory. Consciousness is the easy problems all the way down.
Property dualism | Chalmers | Phenomenal properties are fundamental, alongside physical ones, with bridging laws.
Panpsychism | Goff, Strawson | Some primitive form of experience is everywhere; complex experience emerges from combinations.
Higher-order theories | Rosenthal | A mental state is conscious when there is a higher-order representation of it.
Global workspace | Baars, Dehaene | Consciousness is information made widely available across the brain's processing modules.
Integrated Information Theory | Tononi | Consciousness is integrated information (Φ); has both panpsychist and quantitative flavours.
07.10 / 15

// 10 The Substrate

Neuron under microscope

The human brain has roughly 86 billion neurons, each with thousands of synaptic connections. Whatever the right philosophical theory of mind, it has to make sense, eventually, of this. The picture is taken at the scale where philosophical commitments meet empirical detail, and the easy problems start being hard.

07.11 / 15

// 11 Key Works

Author | Work | Year | Position
Descartes | Meditations on First Philosophy | 1641 | substance dualism
La Mettrie | L'Homme Machine | 1747 | materialism
Ryle | The Concept of Mind | 1949 | logical behaviourism
Place | "Is Consciousness a Brain Process?" | 1956 | identity theory
Smart | "Sensations and Brain Processes" | 1959 | identity theory
Putnam | "Psychological Predicates" | 1967 | functionalism, multiple realisability
Nagel | "What Is It Like to Be a Bat?" | 1974 | against reduction
Searle | "Minds, Brains, and Programs" | 1980 | Chinese Room
Jackson | "Epiphenomenal Qualia" | 1982 | Mary's Room
Dennett | Consciousness Explained | 1991 | multiple drafts
Chalmers | The Conscious Mind | 1996 | property dualism, hard problem
Clark & Chalmers | "The Extended Mind" | 1998 | cognition beyond the skull
Tononi | "An Information Integration Theory of Consciousness" | 2004 | IIT
Goff | Galileo's Error | 2019 | panpsychism, popular
07.12 / 15

// 12 The AI Question

Large language models can hold a conversation, write a sonnet, pass a bar exam, and discuss the Chinese Room. Are they conscious?

The arguments for

Functionalism, taken seriously, says any system with the right causal-functional organisation has the corresponding mental states. If a model has a representation of itself, of its task, of the user, integrated into ongoing reasoning — why exclude it from the mental?

Neural networks are inspired by brains; they form distributed representations; they generalise. The differences from biological brains are differences of substrate, training regime, and embodiment, not (necessarily) differences of kind.

The arguments against

Searle's argument generalises. A transformer manipulates tokens by gradient-trained statistics; nothing in that process intrinsically grounds meaning, intention, or experience. The behavioural appearance of understanding is not understanding.

Humans are embodied, evolved, tied to the world by sensorimotor loops millions of years deep. Whatever consciousness depends on, it may depend on these features that current models lack.

Most working philosophers of mind treat the question as genuinely open and likely to be resolved (if at all) by progress in the science of consciousness rather than by introspection from the armchair. The questions Chalmers asked in 1996 are now also engineering questions.

07.13 / 15

// 13 Go Deeper

Closer to Truth (Robert Lawrence Kuhn) has interviewed every major contemporary philosopher of mind. Wireless Philosophy has clear introductory videos on Mary's Room, the Chinese Room, and functionalism.

Watch · Wireless Philosophy · Chinese Room

Watch · David Chalmers · Hard Problem TED

Watch · BBC In Our Time · Consciousness


07.14 / 15

// 14 Glossary

Qualia

The intrinsic, felt qualities of experience — the redness of red, the painfulness of pain.

Intentionality

The "aboutness" of mental states — beliefs are about things, desires are for things.

Supervenience

A relation in which the mental cannot differ without the physical also differing.

Type/token

Type-identity: every kind of mental state = some kind of physical state. Token-identity: every individual mental event = some individual physical event.

Property dualism

One substance, two kinds of properties — physical and phenomenal.

Eliminativism

Folk-psychological terms (belief, desire) refer to nothing real; they will be replaced as neuroscience matures.

Zombie

A philosophical thought experiment: a being physically identical to a conscious being but with no inner experience.

Hard problem

Why is there subjective experience at all? Why isn't all this processing "in the dark"?

Extended mind

The thesis (Clark & Chalmers, 1998) that cognitive processes can extend beyond the skull — your notebook, your phone — under the right conditions.

07.15 / 15

// 15 Colophon

An organism has conscious mental states if and only if there is something that it is like to be that organism — something it is like for the organism.

— Thomas Nagel, "What Is It Like to Be a Bat?" (1974)

// END · DECK 07