
PROCEEDINGS · STRATEGIC INTERACTION · VOL. VII · NO. 6 · MAY 2026

Game Theory
The Mathematics of Strategic Choice

A primer · with payoff matrices and equilibria

ABSTRACT Game theory is the formal study of decisions where outcomes depend on the choices of others. Formalized by von Neumann & Morgenstern (1944) and refined by Nash (1950), it has become the lingua franca of microeconomics, evolutionary biology, computer science, and military strategy. We survey the key solution concepts — dominant strategies, Nash equilibrium, subgame perfection, evolutionary stability — alongside the canonical games (Prisoner's Dilemma, Stag Hunt, Battle of the Sexes, Chicken), with applications to oligopoly, deterrence, and auction design.

Contents

  1. What is a game?
  2. Dominant strategies & elimination
  3. Nash equilibrium
  4. The Prisoner's Dilemma
  5. Coordination · Stag Hunt, Battle of the Sexes
  6. Mixed strategies
  7. Sequential games · subgame perfection
  8. Repeated games · cooperation in the shadow of the future
  9. Zero-sum & minimax
  10. Evolutionary game theory
  11. Mechanism design & auctions
  12. Case · Cuban Missile Crisis
  13. Pitfalls & limits
  14. Reading & viewing

1 · What is a game?

A game has three primitives: players (decision-makers), strategies (the actions each can take), and payoffs (the value each player assigns to every outcome).[1]
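
The three primitives translate directly into code. A minimal sketch, with illustrative player names, strategies, and payoff numbers of our own choosing:

```python
# A finite two-player game: strategies per player plus a payoff table
# mapping each strategy profile to a tuple of payoffs (one per player).
players = ["A", "B"]
strategies = {"A": ["Up", "Down"], "B": ["Left", "Right"]}
payoffs = {  # (A's move, B's move) -> (A's payoff, B's payoff)
    ("Up", "Left"): (3, 1),
    ("Up", "Right"): (0, 0),
    ("Down", "Left"): (1, 2),
    ("Down", "Right"): (2, 3),
}

def payoff(player, profile):
    """Payoff to `player` at a strategy profile (one action per player)."""
    return payoffs[profile][players.index(player)]
```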

Two assumptions usually accompany the model: players are rational (they maximize expected payoff given beliefs) and the structure of the game is common knowledge (everyone knows the rules; everyone knows that everyone knows; ad infinitum).

Games come in flavors: simultaneous vs. sequential; complete vs. incomplete information; one-shot vs. repeated; cooperative vs. non-cooperative. The taxonomy explodes; the toolkit handles each.

2 · Dominant strategies

A strategy is strictly dominant if it yields a higher payoff than any alternative, regardless of what others do. If one exists, the rational player picks it. If all players have one, the equilibrium is found by simple inspection.

Iterated elimination of dominated strategies (IEDS) extends this logic: delete strictly dominated strategies, then repeat on the reduced game. Often a unique outcome remains.
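
A sketch of IEDS for finite games (the function interface and helper names are our own). Run on the Prisoner's Dilemma payoffs of §4, it leaves only mutual defection:

```python
from itertools import product

def profile_with(i, own, others):
    """Rebuild a full strategy profile with `own` at position i."""
    others = list(others)
    return tuple(others[:i] + [own] + others[i:])

def iterated_elimination(strategies, payoff):
    """Iteratively delete strictly dominated pure strategies.
    strategies: dict player -> list of actions (player order = dict order).
    payoff(player, profile): profile is a tuple of actions in player order."""
    players = list(strategies)
    strategies = {p: list(a) for p, a in strategies.items()}
    changed = True
    while changed:
        changed = False
        for i, p in enumerate(players):
            others = [strategies[q] for q in players if q != p]
            for s in list(strategies[p]):
                # s is strictly dominated if some t beats it against every
                # combination of the opponents' remaining strategies.
                if any(all(payoff(p, profile_with(i, t, o)) >
                           payoff(p, profile_with(i, s, o))
                           for o in product(*others))
                       for t in strategies[p] if t != s):
                    strategies[p].remove(s)
                    changed = True
    return strategies

# Prisoner's Dilemma (Table 1): only (Defect, Defect) survives.
pd = {("C", "C"): (-1, -1), ("C", "D"): (-10, 0),
      ("D", "C"): (0, -10), ("D", "D"): (-5, -5)}
survivors = iterated_elimination(
    {"A": ["C", "D"], "B": ["C", "D"]},
    lambda p, prof: pd[prof][0 if p == "A" else 1])
# survivors == {"A": ["D"], "B": ["D"]}
```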

Beauty contest game · Keynes/Moulin Pick a number 0–100. Winner is closest to 2/3 of the average. Iterated reasoning leads to 0. In experimental settings, average guesses cluster around 20–30 — suggesting players reason 1–2 levels deep, not infinitely.
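
The iterated reasoning is easy to simulate. Assuming the conventional level-0 anchor of 50 (the midpoint), each level of reasoning best-responds to the level below:

```python
# Level-k reasoning in the 2/3-average beauty contest.
# Level-0 guesses 50; level-k best-responds to level-(k-1).
def level_k_guess(k, level0=50.0, factor=2/3):
    guess = level0
    for _ in range(k):
        guess *= factor
    return guess

guesses = [round(level_k_guess(k), 1) for k in range(6)]
# k = 0..5: [50.0, 33.3, 22.2, 14.8, 9.9, 6.6]
# The experimental 20-30 range sits between levels 1 and 2.
```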

3 · Nash equilibrium

John Nash (1950) proved that every finite game has at least one equilibrium in mixed strategies — a profile of strategies, one per player, such that no player can do better by unilaterally deviating.[2]

σ* is NE iff ∀i, ∀σᵢ : uᵢ(σ*) ≥ uᵢ(σᵢ, σ*₋ᵢ)

Nash equilibrium is a consistency condition. It need not be efficient (Prisoner's Dilemma), unique (coordination games), or even reachable (some equilibria are off-path).
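
For small finite games, pure-strategy equilibria can be found by brute-force deviation checks; a sketch, using illustrative Battle of the Sexes payoffs (see §5.2):

```python
from itertools import product

def pure_nash_equilibria(rows, cols, u):
    """All pure-strategy NE of a two-player game.
    u(r, c) -> (row player's payoff, column player's payoff)."""
    eq = []
    for r, c in product(rows, cols):
        ur, uc = u(r, c)
        if (all(u(r2, c)[0] <= ur for r2 in rows) and
                all(u(r, c2)[1] <= uc for c2 in cols)):
            eq.append((r, c))   # no profitable unilateral deviation
    return eq

# Battle of the Sexes: two pure NE, no unique prediction.
bos = {("O", "O"): (2, 1), ("O", "F"): (0, 0),
       ("F", "O"): (0, 0), ("F", "F"): (1, 2)}
ne = pure_nash_equilibria(["O", "F"], ["O", "F"], lambda r, c: bos[(r, c)])
# ne == [("O", "O"), ("F", "F")]
```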

4 · The Prisoner's Dilemma

Two suspects are interrogated separately. If both stay silent, each gets 1 year. If one defects and the other stays silent, defector goes free, silent partner gets 10. If both defect, each gets 5.[3]

                  B: Cooperate               B: Defect
A: Cooperate      −1, −1 (both silent)       −10, 0 (sucker / free)
A: Defect         0, −10 (free / sucker)     −5, −5 (both defect)

Table 1 · Payoffs in years of jail (negative is bad). Defect is strictly dominant for both; (D,D) is the unique NE — yet (C,C) Pareto-dominates it.

The PD captures the canonical tension between individual rationality and collective good. It models arms races, climate cooperation, doping in sports, and over-fishing.

5 · Coordination

5.1 Stag Hunt (Rousseau)

Two hunters can jointly bring down a stag (high payoff, requires cooperation) or each chase a hare alone (lower but safe). Two pure-strategy NE: (Stag, Stag) and (Hare, Hare). The first is payoff-dominant; the second risk-dominant.
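
With illustrative payoffs (stag together: 4 each; hare: a guaranteed 3; chasing the stag alone: 0), both dominance notions can be checked via Harsanyi and Selten's product-of-deviation-losses test:

```python
# Stag Hunt, row player's payoff (the game is symmetric). Numbers illustrative.
#            opponent: Stag  Hare
u = {("S", "S"): 4, ("S", "H"): 0,
     ("H", "S"): 3, ("H", "H"): 3}

# (S,S) is payoff-dominant: 4 > 3.
payoff_dominant = u[("S", "S")] > u[("H", "H")]

# Harsanyi-Selten: (S,S) risk-dominates (H,H) iff the product of unilateral
# deviation losses from (S,S) exceeds that from (H,H). By symmetry each
# product is a single loss squared.
loss_from_ss = (u[("S", "S")] - u[("H", "S")]) ** 2   # (4-3)^2 = 1
loss_from_hh = (u[("H", "H")] - u[("S", "H")]) ** 2   # (3-0)^2 = 9
risk_dominant_ss = loss_from_ss > loss_from_hh        # False: (H,H) risk-dominates
```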

5.2 Battle of the Sexes

Couple disagree on opera vs. football but prefer being together over being apart. Two NE in pure strategies (both opera, both football) plus a mixed-strategy NE. Coordination is the problem.

5.3 Chicken

Two cars hurtle toward each other; first to swerve loses face. Two pure NE: (Swerve, Straight) and (Straight, Swerve). The credible commitment to NOT swerve — say, throwing your steering wheel out the window — wins. Schelling's The Strategy of Conflict built much from this.

Figure 1 · Chicken game, best-response diagram (player A's best response against player B's probability of swerving). Pure-strategy NE at the corners (Swerve, Straight) and (Straight, Swerve); mixed NE where the best-response curves intersect.

6 · Mixed strategies

When there is no pure NE — Matching Pennies, Rock–Paper–Scissors — players must randomize. The mixed NE is the probability distribution over actions that makes the opponent indifferent between their pure strategies.
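
For a 2×2 game, the indifference condition pins down the mix in closed form: if the row player's payoffs are a, b, c, d (rows r1, r2 against columns c1, c2), the column player's probability q on c1 must solve qa + (1−q)b = qc + (1−q)d:

```python
def indifference_prob(a, b, c, d):
    """Opponent's probability on their FIRST action that makes a player
    indifferent between that player's two actions, given the player's payoffs
    a = u(r1,c1), b = u(r1,c2), c = u(r2,c1), d = u(r2,c2).
    Solves q*a + (1-q)*b == q*c + (1-q)*d for q."""
    return (d - b) / ((a - c) + (d - b))

# Matching Pennies (matcher's payoffs: +1 on a match, -1 otherwise):
q_pennies = indifference_prob(1, -1, -1, 1)   # 0.5: mix 50/50
# Battle of the Sexes (row payoffs 2, 0, 0, 1): column player
# must put probability 1/3 on Opera to make the row player indifferent.
q_bos = indifference_prob(2, 0, 0, 1)
```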

In tennis, Walker & Wooders (2001) found professional players' serve directions in top-level matches to be statistically indistinguishable from minimax play.

7 · Sequential games

When players move in sequence, we draw the extensive form as a game tree and solve by backward induction.

Subgame Perfect Equilibrium (SPE), due to Selten (1965), refines NE: a strategy profile is SPE if it forms a NE in every subgame, including off the equilibrium path. SPE rules out non-credible threats.

Note · The Centipede Rosenthal's Centipede game: two players alternately decide to take or pass; pile grows each turn. Backward induction predicts player 1 takes immediately. Experimentally, players cooperate for several rounds — strict rationality fails again.
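
Backward induction can be run mechanically on the centipede. A sketch with illustrative take/pass payoffs (the mover at node t is player t % 2):

```python
def solve_centipede(take, end):
    """Backward induction on a take/pass game.
    take[t]: payoffs (player 1, player 2) if the mover at node t Takes.
    end: payoffs if every node is Passed.
    Returns (node where play stops, resulting payoffs);
    stop == len(take) means everyone passes."""
    value = end                        # continuation value past the last node
    stop = len(take)
    for t in reversed(range(len(take))):
        mover = t % 2
        if take[t][mover] > value[mover]:
            value, stop = take[t], t   # mover prefers Take here
    return stop, value

# Rosenthal-style growing pot (illustrative numbers): at every node the
# mover does slightly better taking now than letting the opponent take next.
take = [(1, 0), (0, 2), (3, 1), (2, 4)]
stop, value = solve_centipede(take, end=(4, 3))
# stop == 0, value == (1, 0): player 1 takes at the very first node.
```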

8 · Repeated games

When the same game is played many times, cooperation can emerge. The Folk Theorem (Friedman 1971) states that any feasible, individually rational payoff can be sustained as a NE of the infinitely repeated game, provided players are patient enough (discount factor close to 1).

Robert Axelrod's 1984 tournament asked academics to submit strategies for an iterated PD. The winner: Tit-for-Tat, submitted by Anatol Rapoport. Cooperate first; thereafter copy the opponent's last move. Four properties of winning strategies: be nice (never defect first), retaliatory (punish a defection promptly), forgiving (return to cooperation once the opponent does), and clear (play a pattern the opponent can recognize).
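
A minimal Axelrod-style match using Table 1's payoffs, Tit-for-Tat against Always-Defect:

```python
# Iterated Prisoner's Dilemma with Table 1 payoffs (years of jail, negative).
PAYOFF = {("C", "C"): (-1, -1), ("C", "D"): (-10, 0),
          ("D", "C"): (0, -10), ("D", "D"): (-5, -5)}

def tit_for_tat(opp_history):          # cooperate first, then mirror
    return "C" if not opp_history else opp_history[-1]

def always_defect(opp_history):
    return "D"

def play(strat1, strat2, rounds=10):
    """Run an iterated match; each strategy sees the opponent's history."""
    h1, h2, s1, s2 = [], [], 0, 0
    for _ in range(rounds):
        m1, m2 = strat1(h2), strat2(h1)
        p1, p2 = PAYOFF[(m1, m2)]
        s1, s2 = s1 + p1, s2 + p2
        h1.append(m1)
        h2.append(m2)
    return s1, s2

tft_vs_tft = play(tit_for_tat, tit_for_tat)      # (-10, -10): mutual cooperation
tft_vs_alld = play(tit_for_tat, always_defect)   # (-55, -45): one sucker payoff,
                                                 # then mutual defection
```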

9 · Zero-sum & minimax

Von Neumann (1928) proved the minimax theorem: in any two-player zero-sum game with finite strategies, there exists a value v such that player 1 can guarantee at least v, and player 2 can guarantee paying at most v. Poker, chess, and naval pursuit are zero-sum.

v = max_σ₁ min_σ₂ u(σ₁, σ₂) = min_σ₂ max_σ₁ u(σ₁, σ₂)
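
On a payoff matrix for the row player, maximin and minimax are one-liners; when they coincide, the game has a pure-strategy saddle point (when they differ, as in Matching Pennies, the value is attained only in mixed strategies). The matrix below is illustrative:

```python
# Zero-sum game given as the ROW player's payoff matrix.
def maximin(M):
    """Best payoff the row player can guarantee with a pure strategy."""
    return max(min(row) for row in M)

def minimax(M):
    """Least the column player can be forced to concede with a pure strategy."""
    return min(max(M[i][j] for i in range(len(M))) for j in range(len(M[0])))

M = [[4, 2, 5],
     [3, 1, 6],
     [2, 0, 3]]
# maximin(M) == minimax(M) == 2: saddle point at row 0, column 1.
```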

10 · Evolutionary game theory

Maynard Smith (1973) extended game theory to populations. A strategy is an Evolutionarily Stable Strategy (ESS) if a population of incumbents playing it cannot be invaded by a small group of mutants; ESSs are stable rest points of the replicator dynamic.

Hawk–Dove games, the founder example, model conflict over resources. The ESS is a mix of hawkish and dovish behavior, parameterized by the cost of fighting and the value of the prize. Many real-world equilibria — animal contests, gender ratios, plant resource allocation — match the predictions.
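
Using the standard Hawk–Dove payoffs (prize V, fight cost C > V: H vs H gives (V−C)/2, H vs D gives V, D vs H gives 0, D vs D gives V/2), the replicator dynamic can be simulated directly; the hawk frequency should settle at the ESS mix V/C:

```python
# Discrete-time replicator dynamic for Hawk-Dove. Predicted ESS: hawks at V/C.
def hawk_dove_replicator(V=2.0, C=4.0, p=0.1, steps=2000, dt=0.01):
    """p = fraction of hawks in the population."""
    for _ in range(steps):
        f_hawk = p * (V - C) / 2 + (1 - p) * V   # hawk's expected payoff
        f_dove = (1 - p) * V / 2                 # dove's expected payoff
        f_bar = p * f_hawk + (1 - p) * f_dove    # population average
        p += dt * p * (f_hawk - f_bar)           # replicator update
    return p

p_star = hawk_dove_replicator()   # converges toward V/C = 0.5
```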

11 · Mechanism design

Reverse game theory (Hurwicz, Maskin, Myerson — Nobel 2007). Instead of solving a fixed game, we design the rules to elicit desired behavior from rational agents.

Vickrey auction (1961): second-price sealed-bid. Truth-telling is a dominant strategy: the winner pays the second-highest bid, a price they did not set, so bidding one's true value can never hurt, and shading can only forfeit profitable wins.
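
A sketch of the dominance argument in code (the names and the random-rival setup are ours): against any rival bids, bidding the true value does at least as well as any deviation.

```python
import random

# Second-price (Vickrey) auction: highest bid wins, winner pays the
# SECOND-highest bid.
def vickrey(bids):
    """bids: {bidder: bid}. Returns (winner, price paid)."""
    ranked = sorted(bids, key=bids.get, reverse=True)
    return ranked[0], bids[ranked[1]]

def utility(value, my_bid, rival_bids):
    """A bidder's payoff: value minus price if they win, else 0."""
    bids = dict(rival_bids, me=my_bid)
    winner, price = vickrey(bids)
    return value - price if winner == "me" else 0.0

# Truthful bidding weakly dominates random deviations against random rivals.
random.seed(0)
value = 0.7
for _ in range(1000):
    rivals = {"r1": random.random(), "r2": random.random()}
    truthful = utility(value, value, rivals)
    deviant = utility(value, random.random(), rivals)
    assert truthful >= deviant - 1e-12
```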

Used in: eBay (effectively), Google AdWords (modified second-price → generalized second-price), spectrum auctions in many countries. Match-making (Gale–Shapley): NRMP, school-choice algorithms in NYC and Boston.

Auction                          Truth-telling?      Revenue
First-price sealed               No (shade bid)      Same in expectation*
Second-price sealed (Vickrey)    Yes                 Same*
English (open ascending)         Effectively yes     Same*
Dutch (open descending)          No (shade)          Same*

*Revenue Equivalence Theorem · Myerson 1981 · with risk-neutral, IPV bidders.
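
The starred claim can be checked by Monte Carlo under the theorem's own assumptions (risk-neutral bidders, independent uniform[0,1] values; the symmetric first-price equilibrium bid is b(v) = (n−1)/n · v):

```python
import random

# Monte Carlo check of revenue equivalence with uniform[0,1] IPV values.
# First price: winner pays own shaded bid (n-1)/n * v.
# Second price: winner pays the second-highest value (truthful bidding).
def expected_revenues(n=3, trials=100_000, seed=1):
    rng = random.Random(seed)
    fp = sp = 0.0
    for _ in range(trials):
        values = sorted(rng.random() for _ in range(n))
        fp += (n - 1) / n * values[-1]   # winner's shaded bid
        sp += values[-2]                 # second-highest value
    return fp / trials, sp / trials

fp_rev, sp_rev = expected_revenues()
# Both approach (n-1)/(n+1) = 0.5 for n = 3.
```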

12 · Case · Cuban Missile Crisis


October 1962. Soviet missiles in Cuba. Kennedy chooses naval blockade rather than airstrike. The game-theoretic reading (Allison, 1971): a sequential game where each player has incomplete information about the other's resolve.

Schelling's later analysis: the credible commitment mattered more than raw forces. By going public, Kennedy made backing down domestically costly, raising the credibility of escalation. Khrushchev faced a similar bind. The settlement — Soviet withdrawal in exchange for U.S. removing Jupiter missiles in Turkey (kept secret) — let both leaders save face.

13 · Pitfalls & limits

Common knowledge of rationality is rarely complete. Behavioral game theory (Camerer 2003) models bounded-rationality players (level-k thinkers, quantal response equilibrium).

Multiple equilibria haunt coordination games. Schelling's focal points (1960) — meet in NYC, no time, no place — solve some via shared culture (Grand Central, noon).

Off-equilibrium play is empirically common. Real organizations make threats they wouldn't carry out. Studying disequilibrium is harder than studying equilibrium.

14 · Reading & viewing

Books

von Neumann & Morgenstern — Theory of Games and Economic Behavior (1944).
Schelling — The Strategy of Conflict (1960).
Axelrod — The Evolution of Cooperation (1984).
Camerer — Behavioral Game Theory (2003).
Dixit & Nalebuff — Thinking Strategically (1991).
Binmore — Playing for Real (2007).

YouTube

Yale Open Courses — Ben Polak's Game Theory (ECON 159)
Stanford GSB — strategic decisions seminars
Khan Academy — microeconomics game theory
Y Combinator — Peter Thiel "Last Mover Advantage" lecture

References

  1. Von Neumann, J. & Morgenstern, O. (1944). Theory of Games and Economic Behavior. Princeton.
  2. Nash, J. F. (1950). "Equilibrium Points in n-Person Games." PNAS 36: 48–49.
  3. Tucker, A. W. (1950). "A Two-Person Dilemma." Stanford lecture notes.
  4. Selten, R. (1965). "Spieltheoretische Behandlung eines Oligopolmodells mit Nachfrageträgheit." Zeitschrift für die gesamte Staatswissenschaft.
  5. Schelling, T. C. (1960). The Strategy of Conflict. Harvard.
  6. Axelrod, R. (1984). The Evolution of Cooperation. Basic.
  7. Maynard Smith, J. & Price, G. R. (1973). "The Logic of Animal Conflict." Nature 246: 15–18.
  8. Vickrey, W. (1961). "Counterspeculation, Auctions, and Competitive Sealed Tenders." Journal of Finance 16: 8–37.
  9. Myerson, R. B. (1981). "Optimal Auction Design." Mathematics of Operations Research 6: 58–73.
  10. Camerer, C. F. (2003). Behavioral Game Theory. Princeton.