AI Powers 27% of Disinformation Campaigns in 2025
The EU tracked 540 disinformation incidents across 10,500 channels in 2025. AI-generated text, audio, and video appeared in more than one in four. Russia was linked to 29% of them.

More than one in four foreign disinformation incidents the EU recorded last year involved AI-generated content. The European External Action Service tracked 540 cases across roughly 10,500 social media channels and websites in 2025.
Twenty-seven percent. AI-generated text, synthetic audio, and manipulated video aren't experimental anymore. They're standard equipment in information warfare.
What the EU Found
The EEAS breaks it down bluntly. Of the 540 incidents, 29% traced to Russia and 6% to China. The remaining 65% couldn't be linked to any specific actor.
"Russian and Chinese actors have fully implemented AI tools to speed up content production and increase meddling activities with fewer resources," the report said.
That last phrase is key. AI doesn't just make disinformation more convincing — it makes it cheaper. A campaign that once needed a room of paid trolls can now run on a fraction of the budget.
Ukraine remains the primary target. But the reach extends far beyond one country.
Elections as Prime Targets
Nearly half of all incidents were tied to elections, protests, or international crises. In 2025, the EEAS tracked election-related campaigns targeting Germany, Poland, Romania, Moldova, and the Czech Republic.
The pattern holds: during moments of public uncertainty, such as elections, protests, and wars, disinformation surges, and AI amplifies both its speed and its volume. These aren't vague trends, either. The EEAS named specific countries and tracked specific campaigns across the European political landscape.
The Telegram Campaign in Estonia
The EU-wide data connects to local operations. On March 21, Estonian Foreign Minister Margus Tsahkna called out a Telegram campaign pushing autonomy for Narva, a majority Russian-speaking city in Estonia.
"Narva is Estonia. Full stop. We see through attempts to divide us," Tsahkna wrote on X.
The campaign, reportedly run from Moscow, promoted a separatist statute for Narva. Classic playbook: find ethnic or linguistic divisions, amplify grievances on social media, push narratives that fracture national unity.
Baltic states have faced Russian-language information operations for years. The Telegram campaign shows how they keep evolving — platform by platform, city by city.
Iran's War on Starlink
While some states produce disinformation, others just seize the hardware.
As SpaceX's Starlink passed 10,000 active satellites this week, Iran's intelligence services were confiscating hundreds of terminals across the country. With conventional internet access disrupted by the conflict, Starlink had become Iranians' way around government censorship.
Tehran declared Starlink illegal. Anti-government protesters used it to broadcast during internet blackouts earlier this year. Estimates once put Iranian subscribers at 40,000-50,000. Most terminals are now believed inactive.
Two ends of the information warfare spectrum. Sophisticated AI campaigns craft synthetic content to shape opinion. Governments seize communications hardware to stop citizens from seeing anything at all.
The US: Policy Without Consensus
The White House released its national AI policy framework on March 20, proposing broad preemption of state AI laws. It asks Congress to require safeguards against child exploitation and self-harm, while calling for streamlined data center permitting and regulatory sandboxes.
What it largely ignores: AI's role in disinformation. The focus is economic — keeping the US competitive. Child safety gets a mention. Information integrity barely appears.
The same week, Infowars announced it would shut down in mid-April. The Daily Stormer said it was closing too. Two early disinformation platforms going dark. But the infrastructure they built — audience habits, conspiratorial frameworks, the business model — has already been absorbed into mainstream platforms.
What This Means
The EEAS numbers tell a simple story. AI lowered the barrier to entry for information operations. States that once needed large budgets and dedicated teams can now run campaigns with a handful of operators and generative tools.
The targets are getting more specific too. Not just national audiences but individual cities like Narva. Not just social media feeds but physical hardware like Starlink terminals.
The tools changed. The goals haven't. Divide populations, undermine trust, control what people see and say. That 27% isn't an anomaly. It's a baseline — and it's going up.
Sources & Verification
Based on 5 sources from 3 regions
- UNITED24 Media (Europe)
- ANSA (Europe)
- Seoul Economic Daily (Asia-Pacific)
- Mississippi Today (North America)
- Roll Call (North America)