At the outset, let me acknowledge what some readers will immediately think: the vision I am about to sketch may sound ambitious, even unrealistic. Yet there is a certain value in deliberately suspending disbelief and engaging in design thinking—exploring “what if” scenarios not because they are easily within reach, but because they force us to confront blind spots and reimagine the possible. This, after all, is the vocation of think tanks. Our role is not merely to describe the world as it is, but to ask uncomfortable questions, to challenge assumptions, and to offer proposals that may seem bold today but could become necessary tomorrow. As someone working within the Brussels policy community, I see this exercise not as an indulgence in utopia but as a vital intellectual discipline: probing the edges of feasibility in order to expand the boundaries of Europe’s strategic imagination.
The Kremlin’s “Doppelgänger” and “Portal Kombat” campaigns created cloned versions of trusted media outlets like Bild and The Guardian to spread fabricated stories. In Germany’s 2025 elections, the “Storm-1516” operation weaponised AI-generated fake websites and deepfakes, showing how cheaply and quickly disinformation can now scale.
Europe’s existing infrastructure - EDMO’s fact-checking hubs, StratCom East’s monitoring, ENISA’s cyber coordination - is doing vital work. But these efforts are underfunded and overstretched. In 2022, the European Media and Information Fund received €19.4 million in applications but could fund only €5.7 million worth of projects. Meanwhile, adversaries spend hundreds of millions refining their influence operations.
In short: Europe is fighting a 21st-century war with 20th-century tools. What, then, could a solution look like?
What if citizens themselves became the missing piece of Europe’s defence?
Across the EU, we have models proving this works. Poland’s Territorial Defence Forces now count over 42,000 volunteer soldiers, 90% of whom serve part-time while keeping civilian jobs. Finland’s “comprehensive security” system ensures one in six citizens has received crisis or cyber training. Both models show the power of grassroots resilience: people willing to give a few hours a week can form a national shield.
The same approach could work in the digital space. A European Cyber Guard - a volunteer corps coordinated at EU level but rooted in local communities - would channel civic energy into defending democracy online. To begin with, volunteers could serve as fact-checkers and OSINT analysts, trained to spot, debunk, and report manipulative narratives.
They could also run education campaigns in schools and communities, raising citizens' digital literacy, especially where skills gaps remain (only 56% of Europeans have basic digital skills; in Romania, just 28%).
Think of it as a territorial defence for truth.
But a European Cyber Guard should not be limited to defence alone. Effective resilience also requires the capacity for offensive measures. Imagine if, instead of only countering Russian disinformation within Europe, volunteers could contribute to proactive campaigns that introduce doubt and disruption into hostile information ecosystems. This means deploying strategic narratives, irony, and fact-based content in ways that undermine adversaries’ credibility and expose internal contradictions.
Citizen creativity, when guided responsibly, can be a powerful asset: developing counter-narratives that erode trust in manipulative sources and make disinformation harder to sustain. Such offensive capabilities would allow Europe not only to withstand foreign information manipulation but also to actively weaken its effectiveness at the source.
Some will understandably raise ethical concerns about engaging in offensive information operations. Yet the dilemma is no different from traditional warfare: if an adversary is firing at you, is it truly unethical to fire back? The only difference is that this battlefield is digital—and the targets are not our bodies, but our minds. Recognising this does not diminish the ethical responsibility to act within democratic values and international law, but it underscores that passivity in the face of aggression carries its own moral cost.
AI has democratised deception: with generative tools, bad actors can now launch sophisticated campaigns at negligible cost. The EU’s resilience gap is widening. While EDMO’s hubs verified 487 false claims during the 2024 elections, this remains a drop in the ocean compared to the scale of manipulative content on social media, where 88% of FIMI activity occurs.
Adversaries target Europe’s weak spots. Language diversity, uneven digital literacy, and fragmented infrastructures make coordinated defence harder. This is precisely why foreign actors exploit local contexts and tailor narratives country by country.
Then there is the elephant in the room: the fear of censorship. Past attempts to regulate online content have triggered accusations of silencing legitimate opinions; Facebook’s fact-checking ecosystem collapsed under precisely such charges. The answer lies in transparency, independence, and oversight, and a European Cyber Guard must be built on all three.
This isn’t about censoring speech. It’s about shining a light on coordinated deception campaigns run by foreign powers - just as NATO’s Article 3 obliges members to “maintain and develop their individual and collective capacity to resist armed attack”. In today’s world, that includes information warfare.
Europe already has proof-of-concept. ENISA runs cybersecurity competitions powered by 50+ volunteers. Finland’s Cyber Citizen Initiative has built EU-wide training materials in every official language. Civil society groups like Lie Detectors bring fact-checking into classrooms.
Scaling these into a structured, EU-backed volunteer corps would not only strengthen defences but also upskill citizens, narrowing the digital literacy gap and boosting employability in the tech sector. The dual benefit—security plus skills—makes this one of the most cost-effective resilience measures the EU could adopt.
Critics will ask: how do we pay for this? The answer: by connecting the dots.
The EU already funds EDMO, EMIF, Horizon Europe research, and national media literacy programmes. A Cyber Guard doesn’t require inventing new money, but pooling and scaling existing streams. With €28.3 billion from the Recovery and Resilience Facility earmarked for digital education, resources exist. What’s missing is the political will to organise them under a single, EU-wide framework.
The battle against disinformation will not be won by bureaucrats alone. It requires citizens. The EDMO network rightly concluded after the 2024 elections: “The battle was won, but the attrition war is far from over”.
By launching a European Cyber Guard, the EU could channel citizens’ energy into defending democracy online, strengthen its resilience against information manipulation, and narrow the digital skills gap in the process.
In the age of AI-driven disinformation, truth will not defend itself. Europe must. And a citizen-powered Cyber Guard is the next logical step.