Vol. III · Deck 11 · The Deck Catalog

Cryptography.

Nearly four thousand years of the practice of writing in secret. Caesar's shift cipher; Vigenère's polyalphabetic; the Enigma machine and Bletchley Park; Claude Shannon's 1949 founding paper; public-key cryptography and the rise of RSA, Diffie-Hellman, and elliptic curves; zero-knowledge proofs; the post-quantum transition.


Founded~1900 BC
Modern1949
Pages32
Lede0x01

// Opening$ what is cryptography

Nearly four thousand years of attempts to make a message readable to its intended recipient and unreadable to anyone else. Until 1976, all of it ran on the same fundamental architecture. Then everything changed.

The discipline divides cleanly into two eras. Classical cryptography (c. 1900 BC to 1949) — substitution and transposition ciphers, mechanical and electromechanical machines, breakable in principle and often in practice. Modern cryptography (1949–) — Shannon's information-theoretic foundations, computational-hardness assumptions, public-key cryptography, formal definitions of security, and the precarious infrastructure of trust on which the contemporary internet sits.

This deck moves chronologically. Caesar's shift, the polyalphabetic ciphers, Enigma and Bletchley, Shannon, Diffie-Hellman, RSA, elliptic curves, zero-knowledge proofs, and the post-quantum migration that is happening now.

Crypto · Lede02 / 32
Caesar0x02

Chapter I$ the shift cipher.

Suetonius records that Julius Caesar substituted each letter of his messages with the letter three positions later in the alphabet. The shift is the simplest possible substitution: c = (m + k) mod 26. With key 3, A becomes D, B becomes E, and so on. Augustus is recorded using shift 1.
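
The arithmetic is small enough to run. A minimal Python sketch (illustrative, not from the deck):

    ALPHA = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

    def caesar(text: str, k: int) -> str:
        """Shift each letter k positions: c = (m + k) mod 26."""
        return "".join(
            ALPHA[(ALPHA.index(ch) + k) % 26] if ch in ALPHA else ch
            for ch in text.upper()
        )

    ct = caesar("ATTACK AT DAWN", 3)        # 'DWWDFN DW GDZQ'
    assert caesar(ct, -3) == "ATTACK AT DAWN"

    # The key space is 25, so exhaustive search is immediate:
    for k in range(1, 26):
        print(k, caesar(ct, -k))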

The Caesar cipher has a key space of 25 — trivially small. Modern children break it by inspection. But the cipher exposes the structure that all classical cryptography would build on: a key shared between sender and receiver, an algorithm that uses the key to transform plaintext into ciphertext, and a security claim that depends on the adversary not having the key.

The general form of Caesar — any monoalphabetic substitution cipher, where each plaintext letter maps to a single ciphertext letter — has key space 26! ≈ 4 × 10^26. Vast by any standard of hand computation. Trivially broken by frequency analysis once that technique was discovered in the 9th century.

Crypto · Caesar03 / 32
Al-Kindī0x03

Chapter II$ frequency analysis.

The 9th-century Arab polymath Abū Yūsuf al-Kindī, in his Risāla fī Istikhrāj al-Mu'ammā (On Decrypting Encrypted Correspondence), gave the first known systematic description of frequency analysis. The technique exploited a fact about natural languages — that letters appear with characteristic frequencies — to reduce the apparently vast 26! key space of monoalphabetic substitution to something searchable in an afternoon.
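
The counting step itself is a few lines of Python; the English ranking below is approximate (published tables vary):

    from collections import Counter

    # Approximate English letter ranking, most to least common:
    ENGLISH = "ETAOINSHRDLCUMWFGYPBVKJXQZ"

    def frequency_profile(ciphertext: str):
        """Rank ciphertext letters by frequency of occurrence."""
        letters = [c for c in ciphertext.upper() if c.isalpha()]
        return Counter(letters).most_common()

    # Against a monoalphabetic substitution, the most frequent ciphertext
    # letters likely stand for E, T, A, ...; a few anchors plus trial and
    # error recover the whole key.
    print(frequency_profile("WKH TXLFN EURZQ IRA MXPSV RYHU WKH ODCB GRJ"))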

The discovery effectively ended monoalphabetic substitution as a viable cipher for serious correspondence. The Renaissance Italian and French chanceries that continued to use them were repeatedly read by their adversaries. The famous Babington Plot (1586), which sent Mary Queen of Scots to the executioner, was undone by Thomas Phelippes's frequency analysis of the Babington-Mary correspondence.

The cryptographer's response was the polyalphabetic cipher — multiple substitution alphabets, switching by some rule, designed to flatten the frequency distribution and defeat al-Kindī's technique.

Crypto · Al-Kindī04 / 32
Vigenère0x04

Chapter III$ the polyalphabetic ciphers.

Leon Battista Alberti's 1467 De Cifris introduced the first true polyalphabetic system, using a cipher disk that rotated periodically. Johannes Trithemius (1518) and Giovan Battista Bellaso (1553) extended the idea. Bellaso's system was misattributed in posterity to Blaise de Vigenère — it is known to history as the Vigenère cipher — and uses a keyword to drive a sequence of Caesar shifts.
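
A Python sketch of the keyword mechanism (letters only, for brevity):

    ALPHA = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

    def vigenere(text: str, key: str, decrypt: bool = False) -> str:
        """Each keyword letter selects the Caesar shift for one position."""
        sign = -1 if decrypt else 1
        out = []
        for i, ch in enumerate(text.upper()):
            k = ALPHA.index(key.upper()[i % len(key)])
            out.append(ALPHA[(ALPHA.index(ch) + sign * k) % 26])
        return "".join(out)

    ct = vigenere("ATTACKATDAWN", "LEMON")          # 'LXFOPVEFRNHR'
    assert vigenere(ct, "LEMON", decrypt=True) == "ATTACKATDAWN"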

For three centuries Vigenère was considered le chiffre indéchiffrable — the indecipherable cipher. The reputation was overstated even at the time, but the security was a genuine advance: frequency analysis applied directly to Vigenère ciphertext yields a roughly flat distribution.

The cipher fell to Friedrich Kasiski's 1863 publication, which exploited repeated ciphertext segments to recover the key length, after which the cipher reduced to multiple parallel Caesar ciphers (one per key position) that frequency analysis could break individually. Charles Babbage had broken Vigenère before Kasiski (around 1854) but did not publish.

Crypto · Vigenère05 / 32
One-time pad0x05

Chapter IV$ the one-time pad.

Frank Miller (1882), then Gilbert Vernam at AT&T (1917), and finally Joseph Mauborgne of the US Army Signal Corps (1918) converged on the cipher with this property: if the key is as long as the message, uniformly random, used only once, and never disclosed, then the ciphertext reveals nothing about the plaintext beyond its length.
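
In modern terms the pad is an XOR over bytes; a Python sketch:

    import secrets

    def otp(data: bytes, key: bytes) -> bytes:
        """XOR with a key of equal length; XOR is its own inverse."""
        assert len(key) == len(data)
        return bytes(d ^ k for d, k in zip(data, key))

    msg = b"MEET AT THE BRIDGE"
    key = secrets.token_bytes(len(msg))   # truly random, used exactly once
    ct = otp(msg, key)
    assert otp(ct, key) == msg
    # Reusing `key` for a second message voids the security argument entirely.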

Claude Shannon proved this rigorously in 1949 in his founding paper of mathematical cryptography. The proof is simple and total: every plaintext is equally likely given any ciphertext, so no amount of computation can do better than guessing at random.

The catch is operational. The key must be as long as everything you ever want to encrypt; it must be truly random, not pseudo-random; it must be securely distributed in advance; it must be used exactly once. The VENONA Soviet diplomatic traffic that US Army signals intelligence (and later the NSA) began decrypting in the 1940s was a one-time pad misused — Soviet operators reused key material under wartime pressure, and US analysts spent forty years exploiting that misuse.

Crypto · OTP06 / 32
Enigma0x06

Chapter V$ the enigma machine.

Arthur Scherbius's commercial Enigma (1923) and the German military's adaptation (from 1928) were the most sophisticated cipher machines of their era. Three rotors stepped through 26 positions each, changing the substitution permutation on every keystroke; a plugboard added another layer of substitution, and a reflector returned the signal through the rotors a second time. The total key space was roughly 1.6 × 10^20.

The Polish Cipher Bureau under Marian Rejewski achieved the first systematic break in 1932, exploiting a German operating procedure (the doubled message-key indicator) and the mathematics of permutation cycles. Polish work was transferred to the British and French intelligence services in July 1939 in the Pyry Forest meeting.

From January 1940 the work moved to Bletchley Park. Alan Turing, Gordon Welchman, and the team built electromechanical "bombes" that mechanised the search through Enigma keys. The breaks of German Army, Air Force, and Naval traffic (Naval keys were read from mid-1941, lost for ten months when the four-rotor M4 arrived in February 1942, and regained that December) gave the Allies a sustained intelligence advantage in the Battle of the Atlantic and the Mediterranean campaign.

Crypto · Enigma07 / 32
Enigma_machine
The machine that Bletchley Park learned to read for most of the Second World War.
Bletchley0x07

Chapter VI$ bletchley park.

Bletchley Park, the Government Code & Cypher School's wartime site in Buckinghamshire, was the largest and most consequential signals-intelligence operation in history. The site read German Army, Luftwaffe, and Naval Enigma; the Italian Hagelin C-38; the Japanese diplomatic Purple (broken by the US Army's Signal Intelligence Service); and — most dramatically — the German Lorenz cipher used for high-command teleprinter communications.

The Lorenz break by Bill Tutte (who reverse-engineered the entire machine from intercepted ciphertext, never having seen one) and the construction of Colossus by Tommy Flowers (1943) — the first programmable electronic digital computer — were Bletchley's most spectacular achievements. The intelligence product, codenamed ULTRA, has been credited by historians with shortening the war by two years and saving fourteen million lives. The estimate is contested; the impact was unquestionably large.

Bletchley's work was kept classified for thirty years after the war ended. Turing's role was substantially unknown until F. W. Winterbotham's The Ultra Secret (1974). Turing himself died in 1954, after a 1952 conviction for homosexual acts and court-ordered chemical castration; the British government issued a posthumous apology in 2009 and a royal pardon in 2013.

Crypto · Bletchley08 / 32
Shannon 19490x08

Chapter VII$ the shannon paper.

Claude Shannon's Communication Theory of Secrecy Systems (Bell System Technical Journal, October 1949) is the founding text of mathematical cryptography. Three contributions reshaped the field.

First, the rigorous definition of perfect secrecy: a cipher is perfectly secret if and only if the ciphertext is statistically independent of the plaintext, equivalently if the conditional distribution of plaintext given ciphertext equals the prior. Shannon proved that perfect secrecy requires the key entropy to be at least the message entropy — the one-time pad bound.

Second, the analysis of practical secrecy via the concept of unicity distance — the minimum length of ciphertext needed to uniquely determine the key, given the redundancy of the language. For English encrypted with a typical substitution cipher, unicity distance is around 30 characters; for AES with a 128-bit key, the unicity distance is about 17 characters — at which length the key is, in principle, determined, but exhausting the key space is computationally infeasible.
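
Both figures come from the same formula, U = H(K)/D, under different encoding assumptions; a Python check (the redundancy values are rough estimates):

    from math import log2, factorial

    # Monoalphabetic substitution over the 26-letter alphabet:
    H_key = log2(factorial(26))     # ~88.4 bits of key entropy
    D = log2(26) - 1.5              # ~3.2 bits/letter redundancy
                                    # (English entropy ~1.5 bits/letter)
    print(H_key / D)                # ~28: the "around 30 characters" figure

    # AES-128 on 8-bit ASCII plaintext (English entropy ~1 bit/char):
    print(128 / (8 - 1))            # ~18: the "about 17 characters" figure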

Third, the confusion-and-diffusion design principle: a strong cipher should make the relationship between key and ciphertext as complex as possible (confusion) and spread the influence of each plaintext bit over as much of the ciphertext as possible (diffusion). The principle has guided every block-cipher design since.

Crypto · Shannon09 / 32
DES0x09

Chapter VIII$ the symmetric standards.

DES (Data Encryption Standard), adopted by NBS in 1976, was the first openly published encryption standard. Designed by IBM (Horst Feistel's Lucifer team) with substantial NSA modifications — including the controversial reduction of the key to 56 bits (Lucifer's had been 128) and the redesign of the S-boxes. The S-box redesign turned out, decades later, to be a strengthening rather than a weakening: IBM and the NSA had known about differential cryptanalysis since 1974 (rediscovered publicly by Biham and Shamir in 1990) and modified the S-boxes to resist it.

The 56-bit key was the deliberate vulnerability. Diffie and Hellman estimated in 1977 that a $20M machine could break DES in a day; the EFF's Deep Crack, built in 1998 for about $250,000, did it in 56 hours, and in under a day the following January working with distributed.net. Triple-DES extended the cipher's life through the 2000s.

AES emerged from the 1997–2001 NIST competition. The Rijndael submission (Joan Daemen and Vincent Rijmen, two Belgian cryptographers) won. AES uses 128-bit blocks, 128/192/256-bit keys, and a substitution-permutation network rather than DES's Feistel structure. It is the dominant symmetric cipher of the present era and has resisted serious cryptanalysis for over two decades.

Crypto · DES/AES10 / 32
The 1976 paper0x0A

Chapter IX$ public-key cryptography arrives.

Until 1976, every known cipher required the sender and receiver to share a secret key in advance. The key-distribution problem was the fundamental scaling obstacle: if two parties have never met, how do they establish a shared secret without a courier?

Whitfield Diffie and Martin Hellman's New Directions in Cryptography (1976) proposed two ideas that solved the problem. The first: public-key cryptography, in which each party has a public key (anyone can encrypt to it) and a secret private key (only the holder can decrypt). The second: a key-exchange protocol by which two parties communicating over an open channel can derive a shared secret without ever transmitting it.

The protocol — Diffie-Hellman key exchange — works in any group where the discrete logarithm is hard. Alice picks secret a and sends g^a; Bob picks secret b and sends g^b; both compute g^(ab). An eavesdropper sees g, g^a, and g^b but has no known feasible way to recover g^(ab) short of computing a discrete log.
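
A toy run with classroom-sized numbers (p = 23, g = 5; real deployments use 2048-bit groups or elliptic curves):

    import secrets

    p, g = 23, 5

    a = secrets.randbelow(p - 2) + 1       # Alice's secret exponent
    b = secrets.randbelow(p - 2) + 1       # Bob's secret exponent

    A = pow(g, a, p)                       # Alice transmits g^a mod p
    B = pow(g, b, p)                       # Bob transmits g^b mod p

    # Each side combines its own secret with the other's public value:
    assert pow(B, a, p) == pow(A, b, p)    # both now hold g^(ab) mod p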

It later emerged that Ralph Merkle's undergraduate "Project for Computer Security" (1974) had independently outlined the public-key idea, and that James Ellis, Clifford Cocks, and Malcolm Williamson at GCHQ had discovered both Diffie-Hellman-equivalent key exchange (Williamson, 1974) and an RSA-equivalent encryption scheme (Cocks, 1973) — but their work was classified until 1997.

Crypto · DH11 / 32
RSA0x0B

Chapter X$ rsa.

Six months after Diffie-Hellman, three MIT researchers — Ron Rivest, Adi Shamir, Leonard Adleman — produced the first published full public-key encryption and digital-signature scheme. The trapdoor: it is computationally easy to multiply two large primes p and q; computationally hard, given only the product n = pq, to factor n back into p and q.

RSA encryption: choose primes p, q; compute n = pq and Euler's totient φ(n) = (p−1)(q−1); pick e coprime to φ(n); compute d = e^(−1) mod φ(n). Public key (n, e). Private key d. Encryption c = m^e mod n; decryption m = c^d mod n.
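
The recipe in executable form, with the standard textbook toy primes (real RSA needs 2048-bit moduli, OAEP padding, and constant-time code; Python 3.8+ for the modular inverse):

    from math import gcd

    p, q = 61, 53
    n = p * q                        # 3233
    phi = (p - 1) * (q - 1)          # 3120
    e = 17
    assert gcd(e, phi) == 1          # e must be coprime to phi(n)
    d = pow(e, -1, phi)              # 2753, the private exponent

    m = 65                           # plaintext as an integer < n
    c = pow(m, e, n)                 # encryption: c = m^e mod n
    assert pow(c, d, n) == m         # decryption: m = c^d mod n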

The first widely-used RSA implementation was PGP (Phil Zimmermann, 1991), which gave individual users access to military-grade public-key cryptography over email. The Clinton administration spent 1993–1996 investigating Zimmermann for unlicensed "munitions export"; the case was dropped in January 1996 without charges.

Modern RSA uses 2048- or 3072-bit moduli (with the 1024-bit moduli of the 1990s now considered insufficient). Factoring RSA-2048 via the best classical algorithm, the general number field sieve, is currently estimated at roughly 2^112 operations (on the order of 10^34), far beyond reach; the largest factored RSA challenge modulus is RSA-250 (829 bits, 2020). The shadow of Shor's algorithm looms — see Chapter XX.

Crypto · RSA12 / 32
Hash functions0x0C

Chapter XI$ hash functions and digital signatures.

Cryptographic hash functions — fixed-length fingerprints of arbitrary-length data — are the silent infrastructure of digital signatures, password storage, blockchains, and integrity checks.

The historical lineage: MD5 (Rivest, 1992) — fully broken (Wang et al. 2004; practical chosen-prefix collisions by 2008). SHA-1 (NSA, 1995) — collision found by Marc Stevens et al. 2017 (the SHAttered attack), now deprecated. SHA-2 family (SHA-256, SHA-512; 2001) — currently secure. SHA-3 (Keccak; Bertoni, Daemen, Peeters, Van Assche; 2015) — adopted as a structurally different alternative.

Digital signatures combine a hash function with a public-key primitive: hash the message, then apply the signer's private-key operation to the hash. Recipients verify with the public key against a fresh hash of the message. (The folk description "encrypt the hash with the private key" is literally accurate only for RSA.) The major schemes: RSA-PSS, DSA, ECDSA, EdDSA (Ed25519 — Bernstein et al., 2011, now the default in many modern systems).
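
In code this is a few library calls; a sketch assuming the pyca/cryptography package (any maintained library exposes the same shape):

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    private_key = Ed25519PrivateKey.generate()
    public_key = private_key.public_key()

    message = b"release artifact, version and digest here"
    signature = private_key.sign(message)      # 64-byte Ed25519 signature

    try:
        public_key.verify(signature, message)  # raises on any mismatch
        print("signature valid")
    except InvalidSignature:
        print("signature INVALID")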

Crypto · Hash13 / 32
ECC0x0D

Chapter XII$ elliptic-curve cryptography.

Neal Koblitz and Victor Miller independently proposed in 1985 that the group of points on an elliptic curve over a finite field could replace the multiplicative group of integers modulo a prime in Diffie-Hellman and similar protocols. The discrete logarithm problem on a well-chosen elliptic curve is, by current best algorithms, much harder relative to key size than the integer-factorisation or finite-field discrete-log problems.

The practical consequence: equivalent security at a fraction of the cost. ECC-256 provides roughly the same classical security as RSA-3072 — with far smaller keys, smaller signatures, and less computation per operation. The advantage compounds in resource-constrained settings (smart cards, IoT devices, mobile network handshakes).

The standard curves in current deployment: NIST P-256/P-384/P-521 (the curves Suite B specified for US government use); Curve25519 (2005) and Ed25519 (2011), Daniel Bernstein's designs intended to avoid trust issues with the NIST curves; secp256k1 (the curve Bitcoin uses). Curve choice is a contested design surface; the NIST curves were under particular suspicion after the 2013 Snowden disclosures, though the Dual_EC_DRBG backdoor that prompted the suspicion was a different primitive.
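
An X25519 agreement, again assuming the pyca/cryptography package; the 32-byte public keys carry the RSA-3072-equivalent security described above:

    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

    alice_priv = X25519PrivateKey.generate()
    bob_priv = X25519PrivateKey.generate()

    # Each side sends its 32-byte public key over the open channel.
    alice_shared = alice_priv.exchange(bob_priv.public_key())
    bob_shared = bob_priv.exchange(alice_priv.public_key())

    assert alice_shared == bob_shared   # 32-byte shared secret
    # In practice the raw secret is passed through a KDF (e.g. HKDF) first.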

Crypto · ECC14 / 32
TLS0x0E

Chapter XIII$ tls — the internet's plumbing.

The protocol now called TLS, originally Netscape's SSL (Secure Sockets Layer, 1994), is the cryptographic substrate of HTTPS, modern email transport, secure DNS, and most of the internet's confidential channel use. The protocol negotiates cipher suites, exchanges keys, authenticates servers (and optionally clients) via X.509 certificates issued by Certificate Authorities, and establishes an authenticated symmetric session.

The protocol's history is a catalogue of practical cryptanalysis. BEAST (2011), CRIME (2012), BREACH (2013), Heartbleed (2014, an OpenSSL implementation bug), POODLE (2014, killed SSL 3.0), FREAK and Logjam (2015, exploited downgrade to export-grade crypto), DROWN (2016). Each round drove protocol cleanup; TLS 1.3 (RFC 8446, 2018) substantially simplified the protocol, removed legacy primitives, encrypted more of the handshake, and made forward secrecy mandatory.

The Certificate Authority infrastructure remains TLS's principal weakness. CA compromises (DigiNotar, 2011; Comodo, 2011) and government-issued mis-certificates have repeatedly demonstrated the structural problem: any one of hundreds of trusted CAs can issue a valid certificate for any domain. Certificate Transparency (Google, RFC 6962, 2013) provides a public audit log; HSTS and HPKP (deprecated) provided some defences in depth.
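
From the application side the machinery reduces to a few standard-library calls; a minimal Python client (example.com as a stand-in host):

    import socket, ssl

    ctx = ssl.create_default_context()   # verification on, sane defaults

    with socket.create_connection(("example.com", 443)) as sock:
        with ctx.wrap_socket(sock, server_hostname="example.com") as tls:
            print(tls.version())                     # e.g. 'TLSv1.3'
            print(tls.getpeercert()["subject"])      # the CA-signed identity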

Crypto · TLS15 / 32
Side channels0x0F

Chapter XIV$ attacks the algorithm doesn't see.

The mathematical security of a cipher is a necessary but not sufficient condition for the security of an implementation. Side-channel attacks exploit information that leaks from the physical execution of the algorithm: timing variations, power consumption, electromagnetic emanations, cache-access patterns, even acoustic emissions.

The timing attack on RSA (Paul Kocher, 1996) demonstrated that the time to perform a modular exponentiation depends in measurable ways on the secret exponent. Differential power analysis (Kocher, Jaffe, Jun, 1999) extracts AES keys from smart-card power traces. Cache-timing attacks on AES (Bernstein, 2005; Osvik, Shamir, Tromer, 2006) extract keys from shared-cache timing on commodity CPUs.

The defences are implementation-level: constant-time code (no data-dependent branches, no data-dependent memory accesses), masking (split secrets across multiple shares processed in parallel), blinding (randomise inputs to the secret operation). The cost is typically a 2–10× performance penalty; the gain is that the algorithm's mathematical security can actually be relied on in practice.
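
The smallest instance of the discipline, comparing a secret MAC tag, in Python:

    import hmac

    def verify_tag_leaky(expected: bytes, provided: bytes) -> bool:
        # == exits at the first mismatching byte, leaking (via timing)
        # how many leading bytes the attacker has guessed correctly.
        return expected == provided

    def verify_tag(expected: bytes, provided: bytes) -> bool:
        return hmac.compare_digest(expected, provided)   # constant-time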

Crypto · Side16 / 32
Elliptic-curve_cryptography
The mathematics behind ECC: hard problems on the points of a curve over a finite field.
Provable security0x10

Chapter XV$ what "secure" means.

Modern cryptographic schemes are not "proven secure" in an absolute sense. They are accompanied by reductionist security proofs: a proof that, if there exists an efficient algorithm A breaking the scheme, then A can be transformed into an efficient algorithm B solving an underlying mathematical problem widely believed to be hard. The reduction transfers your confidence in the underlying problem (factoring is hard, discrete log on Curve25519 is hard, the Learning With Errors problem is hard) into confidence in the scheme.

The standard threat models: IND-CPA (indistinguishability under chosen-plaintext attack — semantically secure against an adversary who can encrypt arbitrary messages); IND-CCA1 and IND-CCA2 (chosen-ciphertext, non-adaptive and adaptive); EUF-CMA for signatures (existential unforgeability under chosen-message attack).

The 1980s-onward foundational programme — Goldwasser, Micali, Rivest, Yao, Blum, Goldreich — turned cryptography into a discipline of formal definitions and reductionist proofs. The 2012 Turing Award to Goldwasser and Micali recognised this transformation. Cryptography before this programme was a craft; afterward, an engineering discipline with theorems.

Crypto · Provable17 / 32
ZK proofs0x11

Chapter XVI$ zero-knowledge proofs.

Zero-knowledge proofs let one party (the prover) convince another (the verifier) that a statement is true without revealing anything beyond the truth of the statement. The classic example: a prover demonstrates knowledge of a graph 3-colouring without revealing which vertex has which colour.

The 1985 Goldwasser-Micali-Rackoff paper established the formal definitions; the 1986 Goldreich-Micali-Wigderson paper showed that every NP statement has a zero-knowledge proof. For decades the result was theoretically beautiful and practically useless — the proofs were too large and slow to deploy.
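
A compact runnable illustration — not the graph-colouring example above but Schnorr's sigma protocol, a standard zero-knowledge proof of knowledge of a discrete log, with toy parameters:

    import secrets

    p, q, g = 23, 11, 2          # g generates the order-11 subgroup mod 23

    x = 7                        # prover's secret
    y = pow(g, x, p)             # public statement: y = g^x mod p

    r = secrets.randbelow(q)     # prover's random nonce
    t = pow(g, r, p)             # 1. commitment
    c = secrets.randbelow(q)     # 2. verifier's random challenge
    s = (r + c * x) % q          # 3. response

    # Verifier accepts iff g^s == t * y^c (mod p). The transcript (t, c, s)
    # can be simulated without knowing x, which is why nothing about x leaks.
    assert pow(g, s, p) == (t * pow(y, c, p)) % p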

The 2010s changed that. zk-SNARKs (zero-knowledge Succinct Non-interactive ARguments of Knowledge — Bitansky, Canetti, Chiesa, Tromer 2012; Pinocchio 2013; Groth16, 2016) gave succinct proofs that could be verified in milliseconds. Bulletproofs (Bünz, Bootle, Boneh et al. 2018) eliminated the trusted-setup requirement. zk-STARKs (Ben-Sasson et al. 2018) added quantum resistance.

Deployment: the Zcash cryptocurrency (2016, the first widespread practical zk-SNARK use); StarkWare's Ethereum scaling work; numerous "ZK rollup" L2 protocols. Zero-knowledge has moved from theory to production engineering in roughly a decade.

Crypto · ZK18 / 32
MPC and HE0x12

Chapter XVII$ compute without revealing.

Two adjacent enterprises extend cryptography from communication to computation. Multi-party computation (MPC): n parties jointly compute a function f(x_1, …, x_n) such that no party learns more than the output and their own input. Andrew Yao's garbled circuits (1982–86) gave the first general two-party construction. BGW (Ben-Or, Goldwasser, Wigderson, 1988) extended secure computation to any number of parties with information-theoretic security. SPDZ (Damgård, Pastro, Smart, Zakarias, 2012) made MPC efficient at scale.

Fully homomorphic encryption (FHE): compute on ciphertext to produce ciphertext that decrypts to the result of computing on the plaintext. Craig Gentry's 2009 PhD thesis gave the first plausible scheme; the early constructions were astronomically expensive (multi-billion-fold slowdown over plaintext). The 2010s rounds of improvement (BGV, BFV, GSW, CKKS, TFHE) reduced the slowdown to manageable factors for narrow workloads.
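
FHE itself needs heavyweight library support, but the compute-on-ciphertext idea is already visible in Paillier (1999), an additively homomorphic scheme (not FHE); a toy Python sketch:

    import secrets
    from math import gcd, lcm

    p, q = 293, 433                       # toy primes; real ones are ~1024 bits
    n, n2 = p * q, (p * q) ** 2
    g = n + 1                             # standard generator choice
    lam = lcm(p - 1, q - 1)
    mu = pow((pow(g, lam, n2) - 1) // n, -1, n)

    def encrypt(m: int) -> int:
        r = secrets.randbelow(n - 1) + 1
        while gcd(r, n) != 1:
            r = secrets.randbelow(n - 1) + 1
        return (pow(g, m, n2) * pow(r, n, n2)) % n2

    def decrypt(c: int) -> int:
        return ((pow(c, lam, n2) - 1) // n * mu) % n

    c1, c2 = encrypt(41), encrypt(59)
    assert decrypt((c1 * c2) % n2) == 100   # multiplying ciphertexts adds plaintexts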

Production deployment is concentrated in privacy-preserving machine learning, healthcare data analysis, and select financial applications. The performance cost remains substantial; FHE has not yet entered routine general-purpose use.

Crypto · MPC/FHE19 / 32
Random0x13

Chapter XVIII$ randomness and the entropy problem.

Cryptography is unforgiving of bad randomness. A "secret key" generated from a predictable PRNG is not secret. The most spectacular failures of deployed cryptography in the last twenty years have been not algorithmic breaks but RNG failures.

The Debian OpenSSL bug (2006–2008) — a Debian maintainer commented out the code that seeded the PRNG, to silence a Valgrind warning — left the process ID as OpenSSL's only entropy source, producing only 32,767 possible keys. SSH keys, OpenVPN keys, and X.509 certificates generated on Debian-derived systems for two years were drawn from a tiny set. Dual_EC_DRBG (standardised in NIST SP 800-90, 2006) — a deterministic random bit generator that almost certainly contained an NSA-knowable backdoor; deployed in RSA's BSAFE library at NSA's $10M urging, and finally removed from the standard in 2014 after the Snowden disclosures.

The defences: hardware random number generators (Intel RDRAND, ARM TrustZone TRNG); kernel entropy pools that mix multiple physical sources; the Linux /dev/urandom design (and its long-running argument with /dev/random); HKDF-based key derivation that extracts uniform secrets from non-uniform entropy. Get the randomness right or nothing else matters.
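
In application code the rule compresses to one habit, shown here in Python:

    import secrets
    import random

    key = secrets.token_bytes(32)     # OS CSPRNG: correct for keys
    salt = secrets.token_hex(16)      # correct for salts, nonces, tokens

    bad = random.getrandbits(256)     # Mersenne Twister: fully predictable
                                      # from ~624 outputs; never use for keys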

Crypto · Random20 / 32
Crypto wars0x14

Chapter XIX$ the politics of strong cryptography.

The cryptographic primitives this deck describes have been continuously contested by governments. The 1990s "Crypto Wars" pitted the US government (under both Bush and Clinton) against the cryptography research community over export controls, the proposed Clipper chip with mandatory key escrow, and the criminal investigation of PGP author Phil Zimmermann for unlicensed munitions export. The crypto community won most of the battles; the export controls were substantially relaxed between 1996 and 2000, and Clipper was abandoned.

The argument has returned every decade. The 2016 Apple-FBI dispute over the San Bernardino iPhone (Apple refused to write special firmware that would bypass the device's passcode protections; the FBI eventually paid an outside vendor to do it). The 2018-onward "Going Dark" rhetoric. The proposed EARN IT Act. The UK Online Safety Bill's client-side scanning provisions.

The cryptography community's core argument has been consistent: there is no technical mechanism for "exceptional access for law enforcement" that does not also weaken security against everyone else. The 2015 Abelson et al. paper Keys Under Doormats made the argument formally; subsequent proposals have not surmounted it. The political pressure is unlikely to abate.

Crypto · Politics21 / 32
Quantum threat0x15

Chapter XX$ the quantum threat.

Peter Shor's 1994 algorithm reshapes the long-term security calculus. A sufficiently large quantum computer could, in polynomial time, factor RSA moduli and compute discrete logs — breaking RSA, Diffie-Hellman, and elliptic-curve cryptography. The current state of quantum hardware (on the order of a thousand noisy physical qubits) is many orders of magnitude short of what's needed (thousands of error-corrected logical qubits, implying millions of physical qubits, for RSA-2048). But the trajectory is forward.

The defensive response is twofold. Post-quantum cryptography: replace RSA/DH/ECC with primitives based on problems believed hard for both classical and quantum computers — lattice problems (Learning With Errors, NTRU), code-based problems (McEliece), hash-based signatures (XMSS, SPHINCS+), isogeny-based schemes (the SIKE submission was broken in 2022, a cautionary episode). Quantum key distribution: information-theoretically secure key exchange via quantum channels (BB84, E91). QKD has serious deployment limitations (channel distance, infrastructure cost) and is not the focus of most current standardisation work.

The hardest practical concern is store-now-decrypt-later: an adversary can capture today's encrypted traffic and retain it against the day a quantum computer becomes available. For data with a multi-decade confidentiality requirement (state secrets, certain medical records), this is already a present-tense threat.

Crypto · Quantum22 / 32
NIST PQC0x16

Chapter XXI$ post-quantum standardisation.

The NIST Post-Quantum Cryptography standardisation process began in 2016 with 69 submissions and concluded its first round of selections in 2022. The lattice-based CRYSTALS-Kyber (now ML-KEM) was selected for general-purpose key encapsulation; CRYSTALS-Dilithium (ML-DSA), Falcon (FN-DSA), and the hash-based SPHINCS+ (SLH-DSA) for signatures. The first FIPS standards (203, 204, 205) were published in August 2024.

The lattice-based schemes are the workhorses. Their security rests on the Learning With Errors (LWE) problem and its structured variants (Module-LWE for Kyber and Dilithium). Regev's 2005 worst-case-to-average-case reduction (an efficient LWE solver would solve worst-case lattice problems) gives the schemes their theoretical foundation. The keys and ciphertexts are larger than ECC equivalents (a Kyber-768 public key is 1184 bytes versus 32 for X25519) but acceptably so for most deployments.

The deployment is happening now. Cloudflare and Google have deployed hybrid X25519+Kyber for TLS handshakes. Apple's iMessage uses post-quantum primitives in its PQ3 protocol. The migration is expected to span the late 2020s for general internet traffic and significantly longer for legacy embedded systems.

Crypto · NIST23 / 32
Lattices0x17

Chapter XXII$ the lattice that is everywhere.

A lattice in R^n is the set of all integer linear combinations of n linearly independent basis vectors. The fundamental algorithmic problems on lattices — finding the shortest non-zero vector (SVP), finding the closest lattice point to a target (CVP), approximating these to within polynomial factors — are NP-hard in their exact forms and believed hard for quantum algorithms in their approximate forms.

The connection to cryptography came through the Ajtai-Dwork cryptosystem (1997) and was made practical by Oded Regev's 2005 introduction of the Learning With Errors problem, together with its worst-case-to-average-case reduction from lattice problems. LWE became the foundation for an enormous subsequent literature: lattice-based encryption, signatures, identity-based encryption (Gentry-Peikert-Vaikuntanathan, 2008), the first plausible fully homomorphic encryption (Gentry, 2009), and most of the post-quantum standardisation candidates.
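
A toy of Regev's scheme with illustratively tiny parameters (real dimensions run to the hundreds, with carefully chosen moduli and noise):

    import secrets

    n, m, q = 8, 20, 97                   # dimension, samples, modulus

    def rand_vec(k, bound):
        return [secrets.randbelow(bound) for _ in range(k)]

    s = rand_vec(n, q)                                    # secret key
    A = [rand_vec(n, q) for _ in range(m)]                # public matrix
    e = [secrets.randbelow(3) - 1 for _ in range(m)]      # noise in {-1, 0, 1}
    b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(m)]

    def encrypt(bit):
        r = [secrets.randbelow(2) for _ in range(m)]      # random subset of rows
        u = [sum(r[i] * A[i][j] for i in range(m)) % q for j in range(n)]
        v = (sum(r[i] * b[i] for i in range(m)) + bit * (q // 2)) % q
        return u, v

    def decrypt(u, v):
        d = (v - sum(u[j] * s[j] for j in range(n))) % q
        return 1 if q // 4 < d < 3 * q // 4 else 0        # nearer q/2 means 1

    for bit in (0, 1):
        assert decrypt(*encrypt(bit)) == bit              # noise stays below q/4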

The structured variants — Ring-LWE, Module-LWE — give better efficiency at modest theoretical cost. The schemes deployed in 2025 are descendants of Regev's 2005 paper. The next two decades of cryptographic standards will largely be lattice-based.

Crypto · Lattices24 / 32
Signal0x18

Chapter XXIII$ end-to-end messaging.

The most-deployed cryptographic protocol of the last decade is the Signal Protocol, designed by Trevor Perrin and Moxie Marlinspike in 2013 and now used by Signal, WhatsApp (since 2016), Google Messages RCS (rolled out from 2021), Facebook Messenger (since 2023, in default-on rollout), and a long tail of smaller messaging applications.

The protocol's contribution is the Double Ratchet: a key-agreement structure that produces a fresh symmetric key for each message, with both forward secrecy — past messages cannot be decrypted after a compromise of current keys — and post-compromise security — once a compromise ends, security is restored. Combined with X3DH (Extended Triple Diffie-Hellman) for initial key agreement, the protocol gives strong asynchronous-messaging security.
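
The symmetric half of the ratchet is two HMAC calls per message (the KDF recommended in the published Double Ratchet specification); a Python sketch:

    import hmac, hashlib

    def ratchet_step(chain_key: bytes):
        """Derive one message key, then irreversibly advance the chain."""
        message_key = hmac.new(chain_key, b"\x01", hashlib.sha256).digest()
        next_chain_key = hmac.new(chain_key, b"\x02", hashlib.sha256).digest()
        return next_chain_key, message_key

    ck = hashlib.sha256(b"illustrative root from X3DH").digest()
    for _ in range(3):
        ck, mk = ratchet_step(ck)    # mk encrypts exactly one message; old
                                     # message keys can't be recomputed from ck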

The 2023 PQXDH upgrade added a Kyber-based key encapsulation to the X3DH handshake, providing hybrid post-quantum security. Apple's iMessage moved to a similar PQ3 design in 2024. The end-to-end-encrypted messaging market has, against initial expectation, become the application area where post-quantum cryptography is being deployed earliest.

Crypto · Signal25 / 32
Cryptocurrency0x19

Chapter XXIV$ cryptography in cryptocurrency.

Bitcoin's 2008 white paper combined existing primitives (SHA-256, ECDSA on secp256k1, hash-linked chains) with the proof-of-work consensus mechanism into a global decentralised ledger. The cryptographic content is largely conventional; the assembly is the contribution.

Subsequent cryptocurrencies have driven serious new cryptography. Zcash (2016) — first widely-deployed zk-SNARK. Monero — ring signatures and stealth addresses for transaction privacy. Ethereum's BLS12-381 curve and the BLS aggregate signatures used in the beacon chain. The various ZK rollups (StarkNet, zkSync, Scroll, Polygon zkEVM) — production-scale recursive zk-SNARK and STARK systems.

The cryptography around digital signatures has improved as a result. Bitcoin's Schnorr signatures (BIP-340, activated 2021) and Taproot (BIP-341) are direct descendants of the threshold-signature and signature-aggregation literature. The blockchain world has been a substantial driver of practical cryptographic engineering in the 2010s and 2020s, despite the field's other problems.

Crypto · Crypto$26 / 32
Cryptography
The visual cliché meeting the actual practice: cryptography happens behind every secure connection on the internet.
Reading0x1A

Chapter XXV$ twenty-five works.

Crypto · Reading27 / 32
Watch & read0x1B

Chapter XXVI$ watch & read.

↑ The RSA encryption algorithm — how it works, with worked example

More on YouTube

Watch · Zero-knowledge proofs explained at five levels (WIRED)
Watch · Post-Quantum Cryptography (Computerphile)

Read

Simon Singh's The Code Book (1999) is the best general-audience history. David Kahn's The Codebreakers (1967, revised 1996) is the encyclopaedic version. Bruce Schneier's Applied Cryptography (1996) is dated but remains the best engineering introduction. For modern theory: Katz and Lindell's Introduction to Modern Cryptography (now in its 3rd edition). For practitioners: Jean-Philippe Aumasson's Serious Cryptography (2017) and David Wong's Real-World Cryptography (2021). Andrew Hodges's biography Alan Turing: The Enigma (1983) for the founding era.

Crypto · W&R28 / 32
Implementation0x1C

Chapter XXVII$ never roll your own.

The discipline's most-repeated practical advice: never implement cryptographic primitives yourself for production use. The reasons accumulate. Side-channel resistance requires constant-time implementation that compilers may inadvertently break. Padding oracles turn correct primitive choices into broken protocols. Misuse-resistance is its own engineering discipline; APIs that allow IV reuse or unauthenticated encryption produce vulnerabilities even when the underlying primitive is fine.

The current generation of misuse-resistant libraries — libsodium, RustCrypto, Tink, the NaCl family — gives high-level interfaces (secretbox, box, auth, pwhash) that make it hard to misuse and hide the choice of primitive. AEAD ciphers (AES-GCM, ChaCha20-Poly1305) authenticate as well as encrypt, eliminating an entire class of historical mistakes.
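
What the high-level interface looks like in practice, assuming the PyNaCl binding to libsodium:

    import nacl.secret
    import nacl.utils

    key = nacl.utils.random(nacl.secret.SecretBox.KEY_SIZE)
    box = nacl.secret.SecretBox(key)

    token = box.encrypt(b"attack at dawn")   # nonce generated and prepended
    assert box.decrypt(token) == b"attack at dawn"
    # Any tampering with `token` raises CryptoError: authentication is built in.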

For TLS specifically: use a current OpenSSL, BoringSSL, or rustls; do not implement the protocol yourself. For password storage: Argon2id, scrypt, or bcrypt, never plain SHA-2. For random numbers: the OS RNG, never your own PRNG. For elliptic curves: Curve25519 / Ed25519 unless you have a specific reason for a different curve.
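
For password storage specifically, the Python standard library already suffices; a scrypt sketch (parameters are a common baseline, to be tuned upward):

    import hashlib, hmac, os

    def hash_password(password: str):
        salt = os.urandom(16)
        digest = hashlib.scrypt(password.encode(), salt=salt,
                                n=2**14, r=8, p=1, dklen=32)
        return salt, digest

    def check_password(password: str, salt: bytes, digest: bytes) -> bool:
        candidate = hashlib.scrypt(password.encode(), salt=salt,
                                   n=2**14, r=8, p=1, dklen=32)
        return hmac.compare_digest(candidate, digest)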

Crypto · Impl29 / 32
Open problems0x1D

Chapter XXVIII$ what is not yet known.

The discipline has open questions of varying urgency.

P vs NP remains open. If P = NP — widely believed false — most of public-key cryptography collapses. The empirical security of cryptographic schemes ultimately rests on a complexity-theoretic assumption that has not been proved.

Indistinguishability obfuscation (iO) — programs transformed so that their input-output behaviour is preserved but their internals are unintelligible. The 2013 candidate construction by Garg, Gentry, Halevi, Raykova, Sahai, and Waters was the field's most-watched theoretical advance of the decade. Practical iO remains far from production, but the recent Jain-Lin-Sahai construction (2021) from well-founded assumptions has reopened the path.

Quantum capability timeline. The crypto-relevant quantum computer (millions of physical qubits, robust error correction, sustained operation) is at least a decade away on most expert estimates and might be much further. The migration to PQC has to start now even though the threat is not yet present.

The CA infrastructure remains a structural single point of failure. Certificate Transparency mitigates rather than solves the underlying problem. The web of trust models that PGP attempted have not, so far, worked at scale.

Crypto · Open30 / 32
State0x1E

Chapter XXIX$ the field at present.

Cryptography in 2026 is, by historical standards, an extraordinarily successful applied discipline. The TLS handshake that secures most web traffic; the Signal protocol that secures most encrypted messaging; the elliptic-curve signatures that authenticate signed git commits, package-manager updates, and code-signed binaries; the post-quantum primitives now rolling into the same infrastructure. The mathematics behind these is hundreds of years old; the engineering is fifty years old; the deployment is mostly the work of the last fifteen years.

What works: the academic-industrial-standards pipeline. NIST competitions have produced AES, SHA-3, and the post-quantum standards over a quarter century, with public scrutiny at each step. The IETF process produces deployable protocols (TLS, MLS, OAuth) with serious cryptographic review.

What is fragile: the certificate authority infrastructure; the implementation surface; the politics around exceptional access; the long tail of legacy embedded systems running unmaintained crypto. The next decade's principal task is the post-quantum migration of the deployed base. The decade after that will be about iO, FHE moving to general use, and continuing pressure on the politics.

Crypto · State31 / 32
Colophon0x1F

End of stream.

Cryptography — Volume III, Deck 11 of The Deck Catalog. Set in JetBrains Mono, terminal-themed. Paper at #0e1116; rule and accent in green and amber.

Thirty-two leaves on nearly four thousand years of writing in secret. The discipline's greatest victories — Bletchley, Diffie-Hellman, Shannon's theorem, the post-quantum migration — were achieved by people who took the mathematics very seriously and the politics no less so.

EOF
