The Great Prime Number Panic: When Washington Declared War on Arithmetic
There’s a peculiar comfort in knowing that governments have always been bewildered by technology. Today they struggle with TikTok; in the 1990s they struggled with mathematics. Not complicated mathematics, either, just the sort involving large prime numbers and a few lines of code. But give a politician even a faint whiff of an abstract concept they don’t understand and they break out in the regulatory equivalent of hives.
This is the story of the Crypto Wars: that glorious, idiotic decade when the United States genuinely attempted to regulate cryptography by classifying it as a munition. That’s right, numbers were weapons. Equations were dangerous. Writing certain code was, for a brief moment, a form of international arms trafficking. It was an era of fear, farce, and the sort of lawmaking that only occurs when people who haven’t the faintest clue what a prime number is decide to legislate one.
And the irony is that while governments were attempting to wrap cryptography in red tape, the rest of the world was stumbling, accidentally and chaotically, into the digital age. The World Wide Web had just been born. Email was beginning to replace the fax machine. Banks were considering this newfangled idea of online services. Into that mix came a question that terrified policymakers: “What if the general public had access to strong encryption?” The notion that ordinary civilians might one day send messages that intelligence agencies couldn’t instantly read caused an existential crisis in Washington. (This exact crisis is still happening today, but now with better haircuts and worse metaphors.)
So the U.S. government panicked. And as governments often do in a crisis, it made a series of deeply stupid decisions.
The T-Shirt That Broke National Security
Let’s begin with the most British of protests: passive-aggressive clothing.
Phil Zimmermann, developer of PGP (Pretty Good Privacy), one of the first tools to bring strong encryption to the masses, was under federal investigation because his software had allegedly been “exported” without permission. In reality he had simply put it on the internet, but the government insisted this was tantamount to handing a missile launcher to a Belgian tourist.
Supporters of Zimmermann responded with a gesture of sublime sarcasm: they printed the RSA algorithm, functional code, mind you, on the front of a T-shirt. Wearing it abroad meant you were technically smuggling a munition.
Just imagine the customs scenario.
Officer: Anything to declare?
You: Just this cotton-poly blend containing a functionally complete public-key encryption routine.
Officer: Right. I'm going to need you to step into the interrogation pod.
It was a protest so dry, so perfectly nerdy, you could practically hear the sound of monocles popping across Washington. Nobody was arrested, because even the most overzealous federal agent would’ve struggled to justify detaining someone for the crime of wearing mathematics. But the point was made: if your export controls are so absurd that a T-shirt can violate them, perhaps your export controls are stupid.
The Smuggled Paperback
After the T-shirt came something even better: the paperback.
The law said exporting encryption software was illegal. The law also said exporting books was fine, because books were speech. And in America, free speech is sacred unless it involves nipples, swearing, or suggesting that capitalism might have a flaw.
So MIT Press published an entire book containing the complete source code of PGP. Page after page of monochrome characters, meticulously printed, ready to be scanned back into working software anywhere in the world. Under the logic of U.S. law, the digital form was a weapon; the dead-tree form was a thoughtful academic contribution.
If this sounds like the plot of an Alan Partridge sketch, it’s because it essentially was. A government that feared the internet was undone by a paperback and a photocopier. Cryptography escaped into the wild not through espionage, but through a postal system and the belief that books aren’t dangerous.
It’s hard to overstate how funny this was to cryptographers. They spent the better part of a decade politely (and sometimes impolitely) reminding Washington that ideas are not containable objects. They don’t respect borders. They don’t submit customs declarations. And you cannot classify a mathematical function the way you classify a rocket launcher. Yet bless them, they tried.
The Browser That Did Algebra Behind Your Back
While governments were busy panicking, something extraordinary was happening in living rooms: Netscape Navigator, the browser of the era, began using strong cryptography to secure web traffic. This caused chaos. Suddenly, your home computer, your beige, whirring, occasionally-smoking piece of post-industrial optimism, was performing operations in number theory that would give most maths students a tension headache.
Exponentiation with thousand-bit integers? Your browser did it on a Tuesday.
Randomised key exchange? It ran in the background while you checked an FAQ page. Elliptic curve computations? Not yet, but give it a decade.
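For the curious, the arithmetic in question is nothing more exotic than modular exponentiation with enormous integers. The sketch below is purely illustrative, not anything a real browser ever shipped: it uses Python's built-in pow() and made-up toy parameters simply to show the kind of operation involved.

# Illustrative only: the sort of big-integer arithmetic a 1990s browser
# performed for every secure connection. Parameters are toy values.
import secrets

n = secrets.randbits(1024) | (1 << 1023) | 1   # a ~1024-bit odd number standing in for a modulus
e = 65537                                      # a typical public exponent
m = secrets.randbits(512)                      # stand-in for a session secret

# Modular exponentiation, (m ** e) mod n, computed by square-and-multiply
# rather than by materialising the astronomically large power.
c = pow(m, e, n)
print(c.bit_length())                          # up to about a thousand bits, computed in microseconds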
For intelligence agencies, this was horrifying. The general public had somehow acquired a technology previously reserved for submarine captains and people with ominous job titles. But there was no stopping it. The maths had left the lab. The algorithms were out of the box. And soon, every secure connection on the internet relied on the exact thing Washington was trying, and spectacularly failing, to contain.
Today, your mobile phone performs more cryptographic operations in an afternoon than GCHQ did in the entire 1970s. The very thing governments insisted was too dangerous for civilians is now so ordinary it barely registers.
And yet we still perform a ceremonial gasp whenever Apple refuses to break iPhone encryption for a police department, apparently forgetting that we have seen this pantomime before, and it ended badly the first time.
The 40-Bit Disaster
Perhaps the most unintentionally damaging consequence of the export panic was the forced creation of “export-grade encryption.” Companies were required to ship deliberately weakened cryptography abroad, capped at 40-bit keys, because anything stronger was considered too militarily sensitive. This was intended as a compromise: foreigners would get security, but only the sort of security one might reasonably defeat by leaning on the keyboard a bit.
Foreign users, unaware of the regulations, simply assumed that HTTPS meant “secure.” It did not. It meant “secure unless the attacker owns a moderately fast computer or, frankly, a calculator with ambition.”
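To put a number on that ambition, here is a back-of-envelope comparison. The guess rate is an assumption chosen purely for illustration (a billion keys per second), not a benchmark of any real attack.

# Rough arithmetic: worst-case time for exhaustive key search.
# The attacker speed below is an assumed figure, for illustration only.
GUESSES_PER_SECOND = 10**9

def brute_force_seconds(key_bits):
    # Try every possible key of the given length, one guess per key.
    return (2 ** key_bits) / GUESSES_PER_SECOND

print(brute_force_seconds(40) / 60)                      # ~18 minutes for a 40-bit key
print(brute_force_seconds(128) / (3600 * 24 * 365.25))   # ~1e22 years for a 128-bit key

Eighteen minutes versus longer than the age of the universe: that is the gap the export rules mandated.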
But the real tragedy was that these weakened ciphers lingered. Even after the restrictions were lifted, the broken codepaths remained buried within libraries and servers, like cryptographic asbestos. Years later, attackers rediscovered them and forced modern systems into these ancient, pathetic modes. The FREAK attack in 2015, quite literally a vulnerability born out of export regulations from two decades earlier, is one of those stories that should be mandatory reading for anyone still arguing for backdoors today.
Governments weakened encryption to feel safe. The result was less safety. The moral is so obvious it could be written on a Post-it note: never break security on purpose. But somehow, every few years, we must learn it again.
RSA and the Sleepless Night
Beneath all the bureaucratic buffoonery of the Crypto Wars lies a story so charmingly improbable it feels like folklore: the invention of RSA. It is often retold as the tale of Ronald Rivest, fuelled by insomnia and a heroic surplus of caffeine, scribbling down the outline of what would become the world’s most influential encryption algorithm in the small hours of the morning.
Rivest was the insomniac, the restless experimenter who could not let a theoretical itch go unscratched. But the idea would never have matured without Adi Shamir, a man whose talent for slicing through mathematical problems is matched only by his ability to do it with frightening speed.
And then there was Leonard Adleman, the quieter force in the trio, the organiser, the patient builder. If Rivest was the spark and Shamir was the blade, Adleman was the engineer, the one who took the theory, shaped it, stabilised it and ensured that the whole thing didn’t fall over the moment you tried to use it in the real world. He is also the reason the algorithm’s name sounds like a law firm, rather than a rejected prog-rock band.
They were obsessed with a question that, at the time, sounded almost unreasonable: Could you invent a lock that anyone could close but only one person could open? A public key and private key. A one-way mathematical function. Something that would reverse thousands of years of cryptographic intuition.
That collaborative spark is easy to overlook because the world loves a lone-hero narrative. But that’s not how RSA happened. It was three people, in the same room, sharing the same intellectual restlessness, each contributing something indispensable.
And that is what makes Washington’s later panic so delightful. RSA, the algorithm the U.S. government treated with the caution normally reserved for biological weapons, wasn’t born in a fortified research lab or a top-secret military programme. It came out of a university corridor, a few stubborn academics, and a shared urge to turn a mathematical curiosity into something useful. While intelligence agencies fretted about keeping encryption technology under lock and key, three people in Massachusetts were inventing a tool that would ultimately secure the entire global internet.
The contrast is delicious: bureaucrats attempting to classify arithmetic like a missile system, while Rivest, Shamir and Adleman casually reinvented digital security in a room without so much as a security badge.
Three names. One sleepless night. And an algorithm that governments still haven’t quite forgiven them for.
Why This All Still Matters
The death of the first Crypto Wars was not the end. It was the beginning of an ongoing, tedious, occasionally comedic struggle over who controls encryption. Every few years, a new government proposes mandatory backdoors; every few years, cryptographers explain, again, that this is mathematically impossible without catastrophic consequences. Every few years, ministers insist that surely this time the tech companies can “just make an exception for the good guys,” as though mathematics is some sort of obedient pet waiting for orders.
The ghosts of the 1990s linger in every such debate. The weakened encryption that crawled out of the regulatory swamp still haunts modern systems. The fear and misunderstanding of cryptography still infects policymaking. And somewhere, deep in a bureaucratic archive, there is probably still a form listing prime numbers between cruise missiles and night-vision goggles.
But there is another legacy too: the triumph of knowledge over bureaucracy. The T-shirt, the paperback, the academic all-nighter, these were not just stunts. They were reminders that information doesn’t behave like hardware, and that trying to contain it with legislation is like trying to nail fog to a wall.
In the end, cryptography won. Not because governments conceded, but because they were dragged into the future by the sheer inevitability of mathematics. Secure communication became a necessity, not a subversion. The internet demanded it. Commerce demanded it. Reality demanded it.
And no amount of red tape can stop an idea whose time has arrived, especially if it fits neatly on a T-shirt.
Appendix: RSA in Mathematical Pseudocode
# ============================================================
# RSA Algorithm (Mathematical Pseudocode)
# This is NOT real code, it's a readable, structured outline
# of the RSA system for curious undergraduates.
# ============================================================
# ------------------------------------------------------------
# Key Generation
# ------------------------------------------------------------
def KEYGEN():
    # Step 1: Choose two large prime numbers (hundreds of digits long)
    p ← PRIME()    # e.g., random 1024-bit prime
    q ← PRIME()    # p and q must be distinct

    # Step 2: Compute modulus n
    n ← p × q      # used in both public and private keys

    # Step 3: Euler's totient
    φ ← (p - 1) × (q - 1)

    # Step 4: Choose public exponent e such that gcd(e, φ) = 1
    #         Common secure choice: 65537
    e ← SELECT_INTEGER(1 < e < φ AND gcd(e, φ) = 1)

    # Step 5: Compute private exponent d
    #         d is the modular inverse of e modulo φ:
    #         e × d ≡ 1 (mod φ)
    d ← MODULAR_INVERSE(e, φ)

    PUBLIC_KEY ← (n, e)
    PRIVATE_KEY ← (n, d)
    return PUBLIC_KEY, PRIVATE_KEY
# ------------------------------------------------------------
# Encryption
# ------------------------------------------------------------
def ENCRYPT(m, PUBLIC_KEY):
    (n, e) ← PUBLIC_KEY
    # m must be an integer such that 0 ≤ m < n
    # RSA encryption is modular exponentiation:
    c ← (m^e) mod n
    return c
# ------------------------------------------------------------
# Decryption
# ------------------------------------------------------------
def DECRYPT(c, PRIVATE_KEY):
    (n, d) ← PRIVATE_KEY
    # RSA decryption mirrors the encryption step:
    m ← (c^d) mod n
    return m
# ------------------------------------------------------------
# Tiny Worked Example (Toy values — NOT secure)
# ------------------------------------------------------------
# Imagine students running the math by hand.
p ← 61
q ← 53
n ← p × q # 3233
φ ← 3120 # (60 × 52)
e ← 17 # gcd(17, 3120) = 1
d ← 2753 # because (17 × 2753) mod 3120 = 1
PUBLIC_KEY ← (3233, 17)
PRIVATE_KEY ← (3233, 2753)
# Encrypt the message m = 42:
c ← ENCRYPT(42, PUBLIC_KEY) # → 2557
# Decrypt it back:
m_recovered ← DECRYPT(2557, PRIVATE_KEY) # → 42
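For readers who would rather run the toy example than trace it by hand, here is a minimal Python translation of the pseudocode above. It assumes Python 3.8 or later (for the modular-inverse form of pow) and keeps the same insecure toy values; it is a teaching sketch, not something to protect real secrets, which would also require proper padding and primes hundreds of digits long.

# Minimal runnable translation of the pseudocode above (Python 3.8+).
# Toy parameters only; real RSA needs padding (e.g. OAEP) and far larger primes.
from math import gcd

def keygen(p, q, e):
    n = p * q
    phi = (p - 1) * (q - 1)
    assert gcd(e, phi) == 1          # e must be invertible modulo phi
    d = pow(e, -1, phi)              # modular inverse: (e * d) % phi == 1
    return (n, e), (n, d)            # public key, private key

def encrypt(m, public_key):
    n, e = public_key
    return pow(m, e, n)              # c = (m ** e) mod n

def decrypt(c, private_key):
    n, d = private_key
    return pow(c, d, n)              # m = (c ** d) mod n

public_key, private_key = keygen(p=61, q=53, e=17)
c = encrypt(42, public_key)
print(c)                             # 2557
print(decrypt(c, private_key))       # 42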
References:
Rivest, R.L., Shamir, A. and Adleman, L. (1978) 'A Method for Obtaining Digital Signatures and Public-Key Cryptosystems', Communications of the ACM, 21(2), pp. 120–126. Available at: https://doi.org/10.1145/359340.359342 (Accessed: 24 November 2025).
Zimmermann, P.R. (1995) PGP Source Code and Internals. Cambridge, MA: MIT Press.
Zimmermann, P.R. (1995) The Official PGP User’s Guide. Cambridge, MA: MIT Press.
Zimmermann, P.R. (2013) ‘Zimmermann’s Law: PGP inventor and Silent Circle co-founder Phil Zimmermann on the surveillance society’, GigaOM, 11 August. Available at: https://om.co/gigaom/zimmermanns-law-pgp-inventor-and-silent-circle-co-founder-phil-zimmermann-on-the-surveillance-society/ (Accessed: 24 November 2025).
Xia, H., Pei, Q. and Xi, Y. (2016) 'The Analysis and Research of Freak Attack Based on OpenSSL', Proceedings of the 6th International Conference on Information Engineering for Mechanics and Materials. Atlantis Press, pp. 15–19. Available at: https://doi.org/10.2991/icimm-16.2016.4 (Accessed: 24 November 2025).
Woodfield, M. (2015) ‘FREAK Attack: What you need to know’, DigiCert Blog, 5 March. Available at: https://www.digicert.com/blog/freak-attack-need-know (Accessed: 24 November 2025).