At the International Conference “Defending Democracy: Battlefield of Truth”, Professor Walter Quattrociocchi, Full Professor at Sapienza University of Rome, where he leads the Center of Data Science and Complexity for Society, delivered a sharp and provocative intervention.
Reflecting on nearly a decade of debates about disinformation, he argued that Europe still treats the issue as if it were 2017 — focusing on fake news versus real news — instead of addressing the deeper, systemic causes rooted in information overload, polarization, and the business models of social media platforms.
Quattrociocchi called for a stronger alliance between science and policymaking, warning that new technological dynamics — especially large language models and AI-driven infodemics — are further eroding public trust and distorting our collective sense of truth.
Opening Remarks by Walter Quattrociocchi:
I’ll try to be brief — and maybe a bit provocative.
Because, honestly, when I listen to all these discussions about misinformation, it feels like we’re still in 2017. Nothing has really changed. We keep talking as if it’s a fight between fake news and real news. But the reality is far more complex — and the scientific perspective tells a much deeper story.
I was part of the European Commission’s advisory group back in 2016, at the very beginning of the so-called “fake news” era, and later during the COVID years. I left that collaboration because the Commission insisted on treating disinformation as a journalistic problem. And they failed.
Then, years later, it was nice to hear Ursula von der Leyen finally say that pre-bunking is much more effective than fact-checking. That idea came from our own research — a big international project with Cambridge, Harvard, and Sapienza, funded by the UK government. We built a model to help users recognize manipulation before it happens, not by telling them what’s true or false, but by showing how narratives compete for their attention.
For seven or eight years, there was a huge gap between science and policymaking. Things are finally changing. I’m now involved in European panels on what we call the “Democracy Shield,” and at least now the framework is more scientifically driven.
We are in a geopolitical moment when Europe must wake up — or things will get messy. Even the situation in the U.S. can be seen as an opportunity for us to rethink our approach and strengthen our democratic resilience.
Science must be a pillar of policymaking.
Tomorrow, a new study will show how far China has advanced in AI — faster than us, both technologically and normatively. Once again, Europe risks missing the point.
We keep hearing: “We’re fighting Russian propaganda.” Of course, it exists. But if we don’t understand the roots of this problem, we’ll just keep running in circles.
And the roots are simple: information overload.
Social media platforms run on a business model that rewards engagement and entertainment. Bots amplify everything, creating a feedback loop — more content, more reactions, more content. Our cognitive system can’t handle it. We seek out only what we like and avoid what we don’t. That’s how polarization grows.
I said this back in 2016. Our paper “The Spreading of Misinformation Online” predicted it — and now it’s finally part of the European Commission’s narrative.
If we really want to counter disinformation, we have to make people aware of this process — aware of how their own behavior is being shaped. Because the real danger isn’t that people believe Russian propaganda more easily. The real danger is that we’re losing trust in our own institutions.
And even when we frame the issue as “good versus evil,” we’re reinforcing polarization ourselves.
We need a strong alliance between science and politics — especially now.
And then, consider what’s coming next: infodemics amplified by large language models. The danger of AI isn’t “general artificial intelligence” — that’s marketing talk. The real threat is epistemic erosion. People are losing their ability to tell reliable information from mere linguistic fluency. Style is replacing substance.
We call this epistemia — a disease of the epistemic level. We’re losing our grip on what is true or false.
And as we delegate more and more thinking to AI systems without understanding their limits, we risk destroying the very foundation of democracy: the shared sense of truth.
That’s the challenge ahead of us — and it’s already here.
The panel was held as part of the international conference “Defending Democracy: Battlefield of Truth”, organized within the framework of the project “Democratic Navigator”, with the support of the Federal Foreign Office of the Federal Republic of Germany.