The Paranoids Were Right
What The Net, Enemy of the State, and Hackers understood about the future, and why nobody listened.
There is a particular kind of humiliation reserved for people who laughed at something that turned out to be correct. It is subtler than being wrong; being wrong is at least honest. This is the humiliation of being right about the facts and catastrophically mistaken about their significance. Climatologists know this feeling. Epidemiologists know it. And now, if you were a reasonably sentient adult watching cinema in the mid-to-late 1990s and thinking, "well, this is all a bit far-fetched," you know it too.
Three films, released between 1995 and 1998, depicted a world in which your identity could be erased by bureaucratic malice, in which the state could track every movement, call, and transaction of a private citizen in real time, and in which a loose tribe of aesthetically chaotic young people on computers were the only ones who truly understood what was being built around everyone else. These films were The Net, Enemy of the State, and Hackers. They were received, broadly, as entertainment. Enjoyable, occasionally ridiculous entertainment. Sandra Bullock ordered pizza through her computer, which audiences found charming rather than prophetic. Will Smith was chased across Washington D.C. by Gene Hackman representing the National Security Agency, which audiences found thrilling rather than instructive. And a group of teenagers in New York rollerbladed between hacking sessions while wearing outfits that suggested the costume department had lost a bet, which audiences found amusing rather than, in retrospect, anthropologically significant.
We watched these films. We ate our popcorn. We went home. And then, over the following twenty-five years, we built everything in them.
The Net, or: You Are Your Data, and Your Data Is Not Yours
The Net (1995) is, on its surface, a fairly straightforward thriller about identity theft, a concept so exotic in 1995 that the film had to explain it to audiences as though describing an unusually creative form of witchcraft. Sandra Bullock plays Angela Bennett, a systems analyst who works from home, orders food online, and has, by choice or circumstance, reduced her physical social footprint to essentially nothing. Almost no one she deals with has ever seen her face. She is, the film implies, already halfway digital. When the conspiracy, involving a backdoored cybersecurity product called Gatekeeper that allows its creators to access any networked system, decides to neutralise her, they do not send anyone to her house. They simply change her records. New name. New criminal history. New identity in the databases. Angela Bennett ceases to exist because the infrastructure that said she existed has been edited.
What the film understood, with more clarity than it perhaps intended, is that the self in the modern bureaucratic state is not a metaphysical entity. It is a data object. It lives in systems, and systems can be administered. Your name, your credit history, your medical records, your right to board a plane or open a bank account, these are entries in tables, and tables have administrators. In 1995, this was a horror premise. In 2026, it is Terms and Conditions.
The film's villain is, nominally, a corrupt organisation. But the deeper villain, the one the film cannot quite bring itself to name directly, because naming it would make the film too bleak for a studio release, is the infrastructure itself. The problem is not that bad people have access to the databases. The problem is that the databases exist and that your life has been quietly uploaded into them without your meaningful consent. Angela Bennett's crisis is not that someone stole her identity. It is that her identity was never really hers to begin with. It was on loan, administered by systems she did not control and could not audit, and when those systems were turned against her, she discovered that the self she thought she possessed was actually a tenancy agreement that had just been revoked.
This is not a fringe concern in 2026. It is the operating condition of every person interacting with a digital state, a digital economy, or a digital social life, which is to say, everyone. The only difference between Angela Bennett's nightmare and the present reality is that in the film, someone had to actively choose to do it to her. Now the systems manage it passively, continuously, and largely without malice, which is somehow worse.
Enemy of the State, or: The Correct People Were Paranoid
Enemy of the State came out in 1998, one year after the United Kingdom acquired more CCTV cameras per capita than any other country on earth, and three years before the United States would use a catastrophic terrorist attack to pass surveillance legislation that would have made the film's antagonists blush. The film stars Will Smith as Robert Dean, a lawyer who accidentally receives a recording of an NSA official murdering a congressman who opposed a new surveillance bill. The official, played by Jon Voight with the specific energy of a man who has made peace with what he is, then deploys the full apparatus of American signals intelligence against one private citizen to recover the footage and silence him.
The NSA, in this film, can access satellite imagery, mobile phone networks, financial records, and security cameras in real time. In 1998, audiences watched this and felt a pleasurable chill at the fantasy of state omniscience. The NSA's actual capabilities, as revealed by Edward Snowden fifteen years later, were substantially more extensive. The film had been conservative.
What Enemy of the State captured, and what makes it anthropologically interesting rather than merely dramatically effective, is the specific cultural logic of American surveillance anxiety. The film's villain is not the state as an abstraction. It is a specific corrupt official within a state apparatus that is otherwise implied to be legitimate. This is a crucial distinction. The film is not arguing that the system is broken. It is arguing that a bad man has broken a good system, and that if you remove the bad man, the system is fine. This is a coherent narrative. It is also, arguably, the narrative that has allowed the system to keep expanding for the past three decades, because if the problem is always an aberrant individual rather than a structural condition, then the solution is always personnel rather than architecture. You fire the villain, you elect the reformer, and you leave the apparatus intact for the next villain to inherit.
The Snowden revelations were uncomfortable not because they revealed that bad people existed in the intelligence community, but because they revealed that the architecture itself, regardless of who operated it, was built to do things that most citizens would find troubling if asked directly. The film could not go there in 1998. A studio thriller requires a villain you can shoot. You cannot shoot a FISA court.
What has changed since 1998 is not the surveillance. The surveillance has expanded beyond what the film's writers could plausibly dramatise. What has changed is the geography of the threat. Enemy of the State worried about the government. And the government is indeed watching. But the more intimate surveillance, the continuous, granular, behavioural tracking that shapes what you see, what you buy, what you believe, and in several documented cases, who you vote for, is being conducted by companies whose terms of service you agreed to in exchange for the ability to argue with strangers about television programmes. Gene Hackman's NSA required subpoenas and satellites. Meta requires you to click "I agree."
Hackers, or: The Ones Who Saw It Coming Were Wearing Ridiculous Trousers
And then there is Hackers, which is a different kind of film entirely and must be approached with appropriate delicacy.
Hackers (1995) is not a film that is loved for its accuracy. This is understood and accepted by everyone, including, one assumes, the people who made it. It features a villain plot involving a computer virus called "Da Vinci" that will capsize oil tankers by causing them to dump their ballast, which is not how oil tankers or computer viruses work. Its protagonists access computer systems through an interface that resembles a city made of glass and neon, because the filmmakers correctly understood that "person staring at text on a screen" is not cinematic. The hacking sequences are operatic nonsense. The rollerblading is extensive.
And yet Hackers has a devoted audience among people who actually work in information security, which is both paradoxical and entirely explicable. The film is not loved despite being wrong about the technical details. It is loved because it is right about the culture, the specific subculture of people who were, in 1995, spending their nights and weekends learning how systems worked from the inside out, not because they had been employed to do so but because the alternative was boredom and the systems were there. Hackers understood that this was a tribe with its own aesthetics, its own ethics, its own internal hierarchies, and its own deeply ambivalent relationship to authority. All of this is accurate. The accuracy is simply buried under a significant quantity of visual noise and Jonny Lee Miller.
The anthropological function of Hackers is distinct from the other two films. Where The Net dramatises the vulnerability of the individual in a datafied world, and Enemy of the State dramatises the power of institutional surveillance, Hackers dramatises the possibility of a counter-culture that understands the infrastructure and refuses to be entirely contained by it. Its protagonists are not fighting to restore a previous state of privacy. They do not have a previous state of privacy to restore; they are teenagers who grew up with computers. What they are fighting for is something more specific and more interesting: the right to know. The hacker ethic, articulated in Steven Levy's 1984 book and alive, however stylised, in the film, holds that information should be free, that systems should be understood by the people they affect, and that the only meaningful response to a locked door is to learn how locks work.
This is a political position, though the film prefers to present it as attitude. And it is a political position that has become, in 2026, both more urgent and more marginalised. The systems that govern daily life, the algorithms that determine creditworthiness, the models that generate hiring decisions, the feeds that mediate social reality, are more complex, more consequential, and less legible to the people they affect than anything Hackers imagined. The right to know how they work is actively contested. The companies that build them call their methods proprietary. The governments that deploy them call their methods classified. The hacker instinct, the one that says a system affecting your life should be a system you can understand, is, in the current moment, being systematically suppressed. The Hackers kids would have a lot of work to do and considerably fewer rollerblading opportunities.
What the Films Got Wrong (Which Is Mostly Who to Be Afraid Of)
It would be dishonest to present these three films as simply prophetic. They were also, in ways that matter, wrong, not about the technologies or the capabilities, but about the actors. All three films locate their primary threat in institutional or criminal actors who make a deliberate choice to weaponise the systems against individuals. The conspiracy in The Net. The rogue NSA official in Enemy of the State. The corporate criminal in Hackers. In each case, the infrastructure is neutral. It is people, specific, identifiable, villainous people, who make it dangerous. Remove the people, and the systems are fine.
This is, to put it gently, not how it went.
The surveillance infrastructure that exists in 2026 was not primarily built by villains. It was built by engineers following product specifications, by venture capitalists following returns, by product managers following engagement metrics, and by legislators following campaign contributions from the people following the engagement metrics. The result is a system of near-total behavioural visibility that no individual villain designed and that no individual villain could dismantle, because it is not a conspiracy, it is a business model. The data economy that tracks your location, your purchases, your reading habits, your emotional state as inferred from your typing patterns, and your political leanings as inferred from your consumer behaviour is not the NSA with its subpoenas and satellites. It is the logical outcome of advertising-funded services being optimised for engagement at scale. No one had to be evil. The system produced the outcome regardless.
This is the thing the films could not tell us, because the thing films require is agency. Someone has to have done it. Someone has to be stopped. The data economy has no one to stop. It has shareholders.
The Vocabulary We Inherited and the World It Did Not Prepare Us For
Here is perhaps the most interesting thing these films did, and the thing least often discussed: they gave us the words.
When Edward Snowden revealed the scope of NSA surveillance in 2013, the public's ability to process that information, to form a mental model of what it meant, to feel the appropriate responses of alarm and outrage, was substantially shaped by Enemy of the State. When the Cambridge Analytica story broke in 2018, revealing that personal data had been harvested from millions of Facebook users to build psychographic profiles for political targeting, the conceptual framework people reached for, the sense of a self being digitally invaded and reconstructed without consent, had been rehearsed in The Net. The hacker as folk hero, as the person who understands the system and refuses to be captured by it, is a cultural archetype that Hackers helped solidify, however inaccurately it portrayed the practice.
These films were not predictions. They were myths. And myths do not work by being accurate. They work by being emotionally and culturally available when the reality arrives. The films gave late-twentieth-century Western culture a set of images, characters, and narratives for understanding digital vulnerability and institutional power long before most people had any direct experience of either. They were rehearsals for a play that was already being written.
The problem with the vocabulary they provided is that it is, in several crucial respects, too dramatic. It presupposes identifiable villains, dramatic confrontations, and the possibility of resolution. Real data extraction is quiet, continuous, and conducted under licence. Real algorithmic governance is dull and technical and produces outcomes that are very difficult to dramatise because there is no scene in which a corporate machine-learning model dramatically reveals that it has been denying people of colour mortgage applications at statistically improbable rates. There is no third-act confrontation. There is a dataset and a disparity and a very long argument about causation.
The films prepared us to be alarmed. They did not quite prepare us to be bored into compliance. And it turns out that boredom is a far more effective mechanism of consent than anything Gene Hackman's NSA could have devised.
Thirty years on, we live inside a version of all three films simultaneously. Our identities exist primarily as data objects in systems we do not control. Our movements, communications, and behaviours are continuously surveilled by a combination of state and corporate actors whose interests are frequently aligned and occasionally identical. And the people who most clearly understand the infrastructure, who know how the locks work, are either employed by the companies building it, subject to non-disclosure agreements, or explaining it on podcasts that not enough people listen to.
The films were right. We just thought they were entertainment.
In our defence, the rollerblading really was very good.
Buy me a coffee. It's the last transaction about you that won't be sold to someone else.
References:
Cheney-Lippold, J. (2011) 'A new algorithmic identity: soft biopolitics and the modulation of control', Theory, Culture & Society, 28(6), pp. 164–181. Available at: https://doi.org/10.1177/0263276411424420 (Accessed: 2 March 2026).
Couldry, N. and Mejias, U.A. (2018) 'Data colonialism: rethinking big data's relation to the contemporary subject', Television & New Media, 20(4), pp. 336–349. Available at: https://doi.org/10.1177/1527476418796632 (Accessed: 2 March 2026).
Zuboff, S. (2019) The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. New York: PublicAffairs.
Coleman, G. (2014) Hacker, Hoaxer, Whistleblower, Spy: The Many Faces of Anonymous. London: Verso Books.
Greenwald, G. (2014) No Place to Hide: Edward Snowden, the NSA, and the U.S. Surveillance State. New York: Metropolitan Books.
Levy, S. (2010) Hackers: Heroes of the Computer Revolution. 25th anniversary edn. Sebastopol, CA: O'Reilly Media.
Wark, M. (2004) A Hacker Manifesto. Cambridge, MA: Harvard University Press.
Lyon, D. (2007) Surveillance Studies: An Overview. Cambridge: Polity Press.
Solove, D.J. (2011) Nothing to Hide: The False Tradeoff Between Privacy and Security. New Haven, CT: Yale University Press.
Braidotti, R. (2013) The Posthuman. Cambridge: Polity Press.
Cadwalladr, C. and Graham-Harrison, E. (2018) 'Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach', The Guardian, 17 March. Available at: https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election (Accessed: 2 March 2026).