Weaponised Outrage: How Virality Became the Internet’s Favourite WMD

Photo by Max Muselmann / Unsplash

Virality used to mean chickenpox. Now it means your aunt’s Facebook post about Bill Gates putting 5G in your teabags. Progress, apparently. Somewhere between cat videos and Kremlin psy-ops, the word stopped being cute and started being terrifying. What we call “going viral” has become less about a quirky meme and more about a carefully engineered form of cultural sabotage. It’s propaganda on steroids, gossip with global reach, psychological warfare disguised as a trending hashtag.

And the worst part? It works because we’re wired to spread it for free. Humans are the unpaid delivery drivers of lies, outrage, and digital sewage. Uber Eats, but for disinformation.

Why We’re All Suckers for Outrage

You like to think you’re a sensible adult, calmly sifting through the day’s news like a philosopher at the breakfast table. You’re not. You’re basically a prehistoric ape with Wi-Fi. Our brains are hardwired to notice and share the dramatic, the disgusting, and the dangerous. Evolution stitched that bias into us because, once upon a time, paying attention to threats was the difference between surviving the sabre-tooth tiger and becoming lunch.

Psychologists call this the negativity bias, the fact that bad news sticks harder, spreads faster, and generally beats the living daylights out of good news in the battle for attention. Baumeister and colleagues summed it up bluntly: bad is stronger than good. And they weren’t being poetic. Brain-imaging studies show that threatening or arousing information lights up our amygdala like a slot machine, locking it into memory and making us itch to tell someone else.

This worked brilliantly in small tribes: if one of your mates said “that berry makes you vomit blood,” it was useful to pass it on quickly. But in today’s digital jungle, the same ancient wiring means you’re more likely to click on “Politician Eats Live Hamster” than “Local Council Improves Recycling.” Social media algorithms have simply turned your evolutionary survival reflex into a monetisation strategy. Your brain is being farmed for engagement.
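
To make that concrete, here is a deliberately toy sketch of engagement-optimised ranking. Every number and weight below is invented for illustration; this is not any platform’s actual scoring code, just the general shape of “rank by predicted clicks and shares”:

```python
# Toy illustration only: invented weights and scores, not any real platform's ranking logic.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    predicted_clicks: float   # how likely people are to click or react
    predicted_shares: float   # how likely people are to reshare
    arousal: float            # 0..1, how emotionally charged the content is

def engagement_score(post: Post) -> float:
    # A ranker that only cares about engagement rewards whatever drives clicks
    # and shares -- and high-arousal, outrage-heavy content tends to drive both.
    return 0.6 * post.predicted_clicks + 0.4 * post.predicted_shares

feed = [
    Post("Local council improves recycling", 0.05, 0.01, 0.1),
    Post("Politician eats live hamster",     0.40, 0.35, 0.9),
    Post("Dog in a bow tie",                 0.20, 0.10, 0.3),
]

for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):.2f}  {post.title}")
```

Nothing in that scoring function asks for outrage. It simply rewards whatever people engage with, and the negativity bias guarantees that outrage engages.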

The Paradox of Virality

Here’s the sick joke: we don’t even like the stuff we share. Surveys show most people say they’d prefer their feeds full of uplifting stories, accurate reporting, and maybe a nice picture of a dog in a bow tie. And yet, what dominates? Conspiracies, tribal rants, and that bloke from your uncle’s darts team who suddenly thinks he’s Clausewitz because he watched a YouTube video about NATO.

This is what academics call the paradox of virality: the most widely shared content is also the least widely liked. In other words, your feed looks like a sewer because other people’s feeds look like a sewer, and everyone is too busy retweeting the stench to do anything about it. Platforms, of course, claim they’re just showing us “what we want to see.” What we apparently want is the psychological equivalent of drinking bleach.

Enter the Superspreaders

If this sounds bad, wait until you meet the superspreaders. No, not sweaty blokes in nightclubs during COVID, though the metaphor isn’t far off. Superspreaders in the digital world are those tireless accounts (sometimes bots, sometimes actual humans with too much spare time) who do the lion’s share of disinformation-sharing.

But here’s the uncomfortable truth: it’s not just anonymous trolls with anime avatars. Superspreaders are very often people with power. Politicians, celebrities, tech moguls, the kind of people whose tweets are treated like scripture by armies of followers. Take Elon Musk. He’s built a whole persona around being the swashbuckling genius of our age, Tony Stark without the charm, and he uses Twitter/X like a personal weapon. With a single post, he can tank a company’s share price, send Dogecoin into orbit, or spread conspiracy theories about Paul Pelosi.

His carefully cultivated myth of the misunderstood genius, propped up by legions of tech-bros desperate to sit closer to the throne, is a case study in weaponised virality. Every snarky meme, every late-night shitpost, every “just asking questions” moment is amplified not just by algorithms, but by human desperation to signal loyalty to the big man. He doesn’t need a troll farm in St Petersburg; he’s got one in Palo Alto, doing it for free.

And Musk isn’t unique. Politicians from Trump to Modi, influencers from Andrew Tate to Kanye West, all know the game: feed your audience outrage and they’ll spread it for you, whether they love you or loathe you. The paradox is that hostility boosts engagement just as much as adoration. To the algorithm, hate and love are identical, they’re just numbers on a dashboard.

So while academics politely point out that 0.1% of users generate 80% of the misinformation, let’s not pretend it’s random nobodies. It’s often the very people with the biggest microphones, the fattest bank accounts, and the most to gain from keeping us angry. Superspreaders aren’t an unfortunate quirk of the system. They are the system.

Virality as a Weapon

This isn’t some accidental side-effect of bored people and clever algorithms. Virality is now a tactic: a deliberate, instrumented way to bend hearts, minds and votes. The playbook is simple and elegant in the worst possible way. Find the human weakness (outrage, fear, tribalism), wrap it in shareable pixels, and let networks do the rest. Nations with time on their hands discovered decades ago that you don’t always need missiles; sometimes you just need memes and a distribution strategy.

The U.S. Special Counsel’s investigation into 2016 spelled it out: the Russian state, via entities such as the Internet Research Agency, ran a sweeping social-media campaign designed to sow discord and influence the election. The IRA didn’t run random pranks; it ran targeted, clinical influence operations. Fake personas, tailored content and platform-specific formats were used to stoke racial tensions, suppress turnout in certain communities, and amplify the ugliest bits of politics. Senate and investigative reporting documented how meme-driven campaigns, Instagram baiting and covert pages moved from manufacture to mainstream with unnerving efficiency. And it worked at scale: millions of exposures, enough to plausibly shift attitudes and tilt the conversation.

And Russia isn’t alone. In 2024, Meta, OpenAI and others exposed multiple covert influence campaigns traced to various states and private firms. One example was a campaign linked to Israel, allegedly run by a commercial outfit called Stoic. Hundreds of fake accounts targeted US lawmakers and specific audiences, using generative AI to churn out polished propaganda at scale. Influence-as-a-service is here, and it blends geopolitical goals with commercial ambition. Today, if you want a disinformation campaign, you don’t need a Ministry of Truth, you just hire one on retainer.

The mechanics are depressingly familiar. Create plausible fake identities. Seed narratives in sympathetic communities. Amplify them with coordinated reposts, paid ads, and helpful influencers. Then step back and let ordinary human tribal instincts do the rest. Hate it, love it, mock it, the algorithm doesn’t care. Every click is fuel.

This is hybrid warfare with low cost, low risk and astonishing reach: a campaign that needs no boots on the ground to tilt public opinion or fracture alliances. The weapons are cheap, deniable, and horribly effective.

The Virus Metaphor Gets Uncomfortably Real

We talk about information “going viral” as if it were a cheeky metaphor, but the resemblance to actual pathogens is a bit too on-the-nose. Ideas spread from host to host, mutate into more contagious forms, and thrive in certain environments while dying in others. If the Cold War was obsessed with nuclear fallout, the twenty-first century is discovering a new kind of fallout: ideological contagion, ripping across borders at fibre-optic speed.
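
The analogy is not just rhetorical: researchers model rumour spread with the same compartmental models epidemiologists use for pathogens. Below is a minimal susceptible/infected/recovered sketch with made-up parameters, purely to show the shape of the curve, not to describe any real campaign:

```python
# Minimal SIR-style rumour model. beta and gamma are invented for illustration,
# not fitted to any real dataset.
def simulate_rumour(population=10_000, seeds=10, beta=0.5, gamma=0.2, days=60):
    susceptible, infected, recovered = population - seeds, float(seeds), 0.0
    history = []
    for _ in range(days):
        new_infections = beta * susceptible * infected / population  # exposure via active sharers
        new_recoveries = gamma * infected                            # lose interest / see a correction
        susceptible -= new_infections
        infected += new_infections - new_recoveries
        recovered += new_recoveries
        history.append((susceptible, infected, recovered))
    return history

beta, gamma = 0.5, 0.2
history = simulate_rumour(beta=beta, gamma=gamma)
peak_day, (_, peak_sharers, _) = max(enumerate(history), key=lambda day: day[1][1])
print(f"R0 ~ {beta / gamma:.1f} (each sharer initially 'infects' roughly two to three others)")
print(f"Sharing peaks on day {peak_day} with ~{peak_sharers:.0f} active spreaders")
print(f"~{history[-1][2]:.0f} of 10,000 people eventually exposed")
```

The instructive part is the timing: the “recovery” term, people losing interest or seeing a correction, kicks in far too slowly to stop the peak, which is exactly why fact-checks always arrive after the damage is done.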

Intelligence agencies, of course, adore this parallel. They’ve spent decades studying how biological and chemical weapons exploit vulnerabilities in the human body. Now, they’re looking at how disinformation exploits vulnerabilities in the human brain. A virus doesn’t politely knock on your immune system’s front door; it sneaks in, hijacks your cells, and uses them as a replication factory. Disinformation campaigns do exactly the same: they hijack our instincts for outrage, fear, or solidarity and turn us into unwitting amplification machines.

And like any good weapons designer, the engineers of virality love mutation. Biological viruses evolve into nastier strains; memes evolve into sharper, angrier, more culturally tailored versions of themselves. The “lab leak” theory, for example, wasn’t just a conspiracy, it was a virus that mutated, adapted to political contexts, and infected mainstream discourse. By the time anyone fact-checked it, it had already spread further than the correction ever would.

If you want a taste of the bio-warfare parallel, look at what intelligence planners call cognitive security. NATO openly describes disinformation as a “cognitive weapon”, something aimed not at infrastructure, but at the squishy grey matter between your ears. It’s elegant in its brutality: no bombs, no bullets, no expensive supply chains. Just viral content that lodges in your brain and spreads to your friends. Biological warfare was banned by treaty; cognitive warfare is still largely a free-for-all.

So yes, the virus metaphor works a little too well. If germs were the terror of the twentieth century, memes are the terror of the twenty-first. Except this time, we’re the petri dish.

Gossip with Nukes Attached

Spies have always loved gossip. Before Twitter, before tabloids, before the internet shoved clickbait down our throats, human intelligence relied on whispers. During the Cold War, entire networks of informants traded in nothing more sophisticated than pub chatter and office rumours. The CIA, MI6, the KGB, all happily paid people to eavesdrop in cafés, collect bar-room talk, or slip into bed with someone who might accidentally let state secrets slip on the pillow. HUMINT wasn’t glamorous James Bond gadgetry; it was gossip weaponised.

The difference now is scale and speed. What once took a rumour weeks to travel through a neighbourhood now takes milliseconds to travel through a botnet. The same mechanics are at play (intrigue, fear, scandal), but where an old-school handler needed to recruit an asset, today you just need a trending hashtag and a few thousand sockpuppet accounts. Meme warfare is HUMINT on steroids. Instead of buying drinks for a nervous embassy attaché, you’re seeding a doctored video on TikTok and watching millions spread it for free.

Even intelligence terminology fits neatly. Agents of influence used to mean compromised journalists or politicians nudged into steering public opinion. Today, it’s anyone with a large enough follower count, knowingly or otherwise, parroting disinformation. Tradecraft once demanded forged documents slipped across borders in dead drops. Now, it’s a badly photoshopped image shared a billion times. Both rely on the same principle: plant the right whisper in the right ear, and the crowd does the rest.

The result? Gossip is no longer just the fuel for village feuds or office politics. It’s been upgraded to a weapon of mass disruption. A meme or a viral hoax can destabilise a democracy faster than a suitcase full of kompromat ever could. In the age of cyberwarfare, gossip isn’t idle chatter, it’s espionage with nukes attached.

Different Cultures, Different Viruses

Spies have always known that propaganda is not a one-size-fits-all business. The KGB didn’t hand out the same leaflets in Paris as it did in Kabul; MI6 didn’t try to sell British values to Latin America without tailoring the message to local sensibilities. Influence has always been about cultural payloads: the right message, in the right language, dropped into the right fracture lines.

Virality works the same way. What infects one society may fall flat in another. In the U.S., the most contagious virus is tribal animosity. Pit one half of the country against the other and sit back while Thanksgiving dinners turn into proxy wars. It’s not an accident that Russian troll farms pushed content stoking racial tensions, gun politics, and abortion debates. They weren’t guessing; they were exploiting well-mapped pressure points.

Japan, on the other hand, has a different immune system. High-energy positivity spreads faster than partisan rancour, so campaigns there tend to lean more on upbeat nationalism and collective pride. Same tools, different payload.

And in wartime, the mutations become obvious. During the early days of Russia’s invasion of Ukraine, Ukrainians shared a steady diet of out-group hate, mocking Russians, amplifying animosity. But as the war dragged on, the viral payload shifted. Memes of solidarity, resilience, and shared sacrifice spread more effectively than anger. Information operations evolve with the battlefield, just as any good spymaster shifts tactics with the terrain.

This is why modern information warfare looks less like broadcasting and more like psychological tailoring. Disinformation campaigns aren’t crude megaphones; they’re scalpels, slicing neatly into cultural fault lines. An MI6 officer in the 1950s might have handed a forged letter to a friendly journalist; today’s disinformation operative runs A/B testing on memes, finds the strain that resonates most with the target culture, and lets the algorithm carry it across the population.
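
The mechanics here are nothing exotic: it’s the same variant testing every marketing department runs, pointed at cultural fault lines instead of shampoo adverts. A minimal sketch, with invented engagement numbers and a naive pick-the-highest-share-rate rule rather than a proper significance test:

```python
# Illustrative only: invented numbers, simplistic winner selection.
variants = {
    "variant_a_anger":      {"impressions": 5000, "shares": 400},
    "variant_b_solidarity": {"impressions": 5000, "shares": 150},
    "variant_c_humour":     {"impressions": 5000, "shares": 275},
}

def share_rate(stats):
    return stats["shares"] / stats["impressions"]

# Seed each variant to a small test audience, measure what gets reshared,
# then put the whole budget behind the most contagious strain.
winner = max(variants, key=lambda name: share_rate(variants[name]))
for name, stats in variants.items():
    print(f"{name}: {share_rate(stats):.1%} share rate")
print(f"Amplify: {winner}")
```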

In short: virality is espionage with localisation settings. What spreads in Alabama won’t spread in Tokyo. What works in Moscow won’t work in Delhi. Every society has its own weak points, and every modern operator knows how to infect them.

Can We Fight Back?

This is the part where well-meaning NGOs insist “media literacy” will save us. Cute idea, but no. Media literacy is like handing out umbrellas during a monsoon while the state next door is busy seeding storm clouds. Useful in theory, utterly pathetic in practice.

The serious stuff happens in the shadows. NATO now openly treats disinformation as part of the hybrid warfare toolkit, and alliances are building counter-disinfo units the way Cold War governments built air defence. The EU has its East StratCom Task Force, whose job is basically to play whack-a-mole with Russian propaganda. The Americans have Cyber Command running “defend forward” operations, poking into enemy networks before campaigns even launch. This isn’t fact-checking; it’s counterintelligence with memes.

Then there’s algorithmic friction, the cybersecurity equivalent of adding sand to the gears. Platforms could slow down the spread of content by tweaking how fast a post can be shared, or forcing extra steps before a message can be forwarded a thousand times. WhatsApp has tried this: it first limited bulk forwarding in 2018, after viral fake kidnapping rumours in India sparked deadly mob lynchings, and tightened the limits further during COVID. Slowing virality works, but platforms hate it because it means fewer eyeballs, fewer ads, fewer dollars.
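
In code terms, friction is embarrassingly simple: a counter and a threshold. The sketch below is hypothetical, loosely modelled on WhatsApp’s publicly described forwarding limits rather than its actual implementation:

```python
# Hypothetical sketch of forwarding friction, loosely modelled on publicly
# described WhatsApp-style limits; not any platform's real code.
MAX_FORWARD_TARGETS = 5          # cap on chats a message can be forwarded to at once
FREQUENTLY_FORWARDED_HOPS = 5    # after this many forward hops, label it and restrict further

def can_forward(message, target_chats):
    if len(target_chats) > MAX_FORWARD_TARGETS:
        return False, "Too many chats selected; forward to fewer at a time."
    if message["forward_hops"] >= FREQUENTLY_FORWARDED_HOPS and len(target_chats) > 1:
        # Highly viral messages get labelled and limited to one chat at a time.
        return False, "Forwarded many times; can only be sent to one chat."
    return True, "ok"

msg = {"text": "FORWARD THIS TO EVERYONE", "forward_hops": 7}
print(can_forward(msg, target_chats=["family", "work", "school"]))
print(can_forward(msg, target_chats=["family"]))
```

Note that nothing is deleted or censored; the message simply can’t reach a thousand groups in one tap, which is often enough to let corrections catch up.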

And let’s not forget the cloak-and-dagger side. Western agencies run quiet counter-disinformation ops, often not by censoring lies but by flooding the space with better ones. Classic active measures in reverse. If the Russians can invent entire fake movements, the Brits can seed equally fake counter-movements: plausible narratives designed to confuse, distract, and dilute enemy payloads. It’s not about finding “truth”; it’s about overwhelming the adversary’s signal with your own noise.

International alliances are also quietly knitting together “cognitive firewalls.” The EU, NATO, and G7 swap intelligence on hostile influence campaigns just like they swap data on terrorist cells. Some even fantasise about a Geneva Convention for disinformation, though let’s be honest, states are about as likely to give up their favourite cheap weapon as they are to dismantle their nukes.

So can we fight back? Yes, but only if we stop treating virality as a social nuisance and start treating it as warfare. Because that’s what it is. Right now, we’re defending against twenty-first century psy-ops with twentieth-century civics classes. And that’s about as effective as bringing a crossword puzzle to a gunfight.

Conclusion: Virality as the New WMD

Once upon a time, the deadliest weapon a state could wield was a nuclear missile. Now, it might be a meme. Outrage spreads faster than fallout and leaves societies poisoned long after the initial blast. The psychology of virality isn’t just an academic curiosity; it’s the instruction manual for twenty-first-century propaganda.

The battlefield is no longer just land, sea, air, space, and cyber. It’s also your aunt’s WhatsApp group, your cousin’s Telegram channel, and your boss’s LinkedIn post about how the lizard people secretly run the IMF. The frontlines are domestic, invisible, and very, very stupid.

And the thing about stupid wars is that nobody realises they’re happening until it’s too late. There will be no air raid sirens, no tanks rolling through Dover. Just democracies quietly hollowed out from within, one viral tantrum at a time. The collapse won’t look like mushroom clouds; it’ll look like a million angry Facebook posts, a parliament paralysed by conspiracy theories, and a public too busy arguing about whether birds are drones to notice their institutions rotting.

So here’s the toast, dear reader: to the new world order of gossip-as-ordnance, memes-as-missiles, and lies-as-legacies. We survived the Blitz, the Cold War, and the banking crash, but this time the bombs are jokes, the fallout is outrage, and the rubble is our collective sanity. Cheers.

References

Baumeister, R.F., Bratslavsky, E., Finkenauer, C. and Vohs, K.D. (2001) Bad is stronger than good. Review of General Psychology, 5(4), pp.323–370. Available at: https://assets.csom.umn.edu/assets/71516.pdf (Accessed 6 September 2025).

CSIS (2019) Russian Meddling in the United States: The Historical Context of the Mueller Report. CSIS Brief, 27 March. Available at: https://www.csis.org/analysis/russian-meddling-united-states-historical-context-mueller-report (Accessed 6 September 2025).

Deppe, C. (2024) Cognitive warfare: A conceptual analysis of the NATO ACT cognitive warfare concept. Frontiers in Big Data, 7, Article 1452129. Available at: https://www.frontiersin.org/journals/big-data/articles/10.3389/fdata.2024.1452129/full (Accessed 6 September 2025).

European External Action Service (2015) Action Plan against Disinformation. Brussels: EEAS. Available at: https://pism.pl/publications/tracing-the-development-of-eu-capabilities-to-counter-hybrid-threats (Accessed 6 September 2025).

European External Action Service (2024) East StratCom Task Force / EUvsDisinfo. Brussels: EEAS. Available at: https://en.wikipedia.org/wiki/East_StratCom_Task_Force (Accessed 6 September 2025).

NATO Allied Command Transformation (2024) Cognitive Warfare Concept. Norfolk, VA: NATO ACT. Available at: https://www.act.nato.int/article/cogwar-concept/ (Accessed 6 September 2025).

Polyakova, A. (2019) What the Mueller report tells us about Russian influence operations. Brookings Institution, 18 April. Available at: https://www.brookings.edu/articles/what-the-mueller-report-tells-us-about-russian-influence-operations/ (Accessed 6 September 2025).

Reuters (2024) NATO boosts efforts to counter Russian, Chinese sabotage acts. Reuters, 3 December. Available at: https://www.reuters.com/world/nato-boost-efforts-counter-russian-chinese-sabotage-acts-2024-12-03/ (Accessed 6 September 2025).

Senate Select Committee on Intelligence (2019) Report on Russian Active Measures Campaigns and Interference in the 2016 U.S. Election: Volume 2 – Russia’s Use of Social Media, with Additional Views. Washington, D.C.: U.S. Senate. Available at: https://digitalcommons.unl.edu/senatedocs/4/ (Accessed 6 September 2025).

Washington Post (2018) New report on Russian disinformation, prepared for the Senate, shows the operation’s scale and sweep. The Washington Post, 16 December. Available at: https://www.washingtonpost.com/technology/2018/12/16/new-report-russian-disinformation-prepared-senate-shows-operations-scale-sweep/ (Accessed 6 September 2025).

Washington Post (2025) Negativity bias: why bad news sticks. The Washington Post, 6 May. Available at: https://www.washingtonpost.com/wellness/2025/05/06/negativity-bias-positivity-strategies/ (Accessed 6 September 2025).

Wikipedia (2025) Negativity bias. Wikipedia. Available at: https://en.wikipedia.org/wiki/Negativity_bias (Accessed 6 September 2025).

Wired (2018) Inside the British Army’s secret information-warfare machine. Wired, 14 June. Available at: https://www.wired.com/story/inside-the-77th-brigade-britains-information-warfare-military/ (Accessed 6 September 2025).

Ynetnews (2024) Israel runs covert influence campaign targeting US lawmakers, report. Ynetnews, 5 June. Available at: https://www.ynetnews.com/article/r1puqhpv0 (Accessed 6 September 2025).

Times of Israel (2024) Diaspora Ministry funded fake social media posts to spread pro-Israel content. The Times of Israel, 5 June. Available at: https://www.timesofisrael.com/diaspora-ministry-funded-fake-social-media-posts-to-spread-pro-israel-content-nyt/ (Accessed 6 September 2025).
