In this blog, Stephen Hutchings introduces some of the key ideas of his new open access article in the journal Cultural Studies.
Counter-Disinformation’s Growing Pains
Few would doubt the need for effective disinformation tracking. It would be a dereliction of democratic duty to leave unmonitored and unchecked the falsehoods, misleading assertions, unverified rumours, hate speech and propaganda disseminated by authoritarian states and other bad actors. Moreover, by engaging ordinary citizens in their endeavours, counter-disinformation initiatives play a vital role not just in protecting democracies from such threats, but in reinforcing the foundations of democratic culture.
The recent growth in such initiatives has been phenomenal. Since 2016, we’ve witnessed the mushrooming of innumerable counter-disinformation monitors, fact checkers, digital literacy curricula, and cyber-intelligence operations, complemented by an academic research colossus. This includes ‘big data’ analyses tracking state disinformation, forensic investigations of specific disinformation campaigns, information manipulation typologies, accounts of ‘misinformation ecosystems’, and studies of their impacts.
Unsurprisingly, the pace of this development has led to growing pains: significant flaws, gaps, and inconsistencies in its coverage. Equally to be expected, therefore, is the emergence of critical voices highlighting the problems. The respected US journalist Joseph Bernstein has identified an entire industry reliant on maintaining the sense of a democratic polity infected by a mutating contaminant requiring ever more antidotes. Terming it ‘Big Disinfo’, he names an ‘unofficial partnership between Big Tech, corporate media, elite universities, and cash-rich foundations,’ arguing that in rushing to bolster the (neo)liberal counter-disinformation project, its proponents still struggle with ‘basic epistemological issues’. Much of Bernstein’s critique carries weight, not least his implicit emphasis on the concurrent arrival of (a) digital tools facilitating the mass dissemination of lies, distortions and propaganda narratives and (b) computer programmes capable of tracking their spread. Together, these developments have skewed disinformation studies methods towards ‘content analysis’ models aimed at measuring and categorizing vast volumes of verifiably false data.
At the same time, in endorsing Bernstein’s concerns, we should acknowledge the danger of unwitting collusion with democracy’s enemies and with ‘post-truth’ relativists who have their own unworthy reasons for undermining counter-disinformation practices; recent right-wing assaults on disinformation specialists from within the US House of Representatives offer one cautionary tale. Nonetheless, democratic scrutiny requires constructive criticism of those practices. Any such scrutiny should be grounded in a firm commitment to both democracy and truth, but also in a recognition of the tensions within them and the complexity of their interrelationship.
It is in this spirit that, in September 2023, as part of a new research project, I examined mission statements and related texts on the websites of four leading counter-disinformation units (CDUs): the US-based Atlantic Council’s Digital Forensic Research Lab (DFRL); the EU’s East Stratcom Taskforce (ESTF); and two NGOs, the EU Disinfo Lab (EDL) and the international Global Disinformation Index (GDI). They represent a combination of state-aligned and autonomous operations whose geopolitical scope spans Europe and the USA, the core of the liberal democratic polity. Whilst very sympathetic to their goals, I identified three sets of issues requiring attention. The full analysis is presented in a new article in Cultural Studies.
Impugning the Other/Ironing Out the Self
The first set of problems reflects the history of the term ‘disinformation’. Its myth of origin attributes it to a mid-20th century English translation of the Russian neologism dezinformatsiya, supposedly coined by Stalin with a French-sounding etymology mendaciously designating a non-Soviet practice which the USSR was ‘obliged’ to emulate. The story, widely endorsed by disinformation analysts, is, ironically, itself false. English usage in fact dates to the 19th century, when rival US news outlets began accusing each other of ‘disinformation’, while early Soviet instances reflect German, not French, sources. The embedding of this myth in counter-disinformation models has consequences. By overlooking the status of ‘disinformation’ as the product of historically and culturally specific exchanges of ‘othering’ accusations and treating it as a universally definable, stable practice, these models struggle to explain the different meanings of individual breaches of the truth. For example, nodding to liberal democratic impartiality, an EDL affiliate attributes one of its ‘Five Disinformation Narratives about the Ukraine War’ to Ukraine itself. Labelling it ‘Exaggerated tales of Ukrainian heroism’, the affiliate glosses it as a deconstruction of the fictitious ‘Ghost of Kyiv’, a fabled guardian angel reputed to destroy Russian planes and clearly not intended to be taken literally.
The same lack of acknowledgement of disinformation accusations’ ‘othering’ function hampers efforts to rebut the wide use of the same rhetorical apparatus by authoritarian states like Russia, which runs its own ‘counter-disinformation’ and ‘fake news’ services. Indicative here is DFRL’s tortuous account of its preference for the ‘more neutral and objective’ term ‘disinformation’ over ‘fake news’ (now toxified by Trump), articulated in seeming ignorance of the saturation of Kremlin propaganda with allegations of Western ‘disinformation’.
[Image: website of the ‘fact-checking’ service run by Russia Today]
More important are the occlusions flowing from the Cold War dichotomy pitting a homogenized liberal democratic Self against a hostile, and equally unified, Communist Other, symbolised in disinformation’s origin myth. The residues of this structure manifest themselves in a blindness to tensions within liberal democracy. Identifying with the first term in ‘liberal democracy’ implies supporting minority rights and diversity. In promoting its ‘Misogynistic Disinformation’ analysis, GDI enacts this principle, risking dismissal for ‘wokery’ by legitimate centre-right opinion, but also eliding the semantics of disinformation with those of ‘hate speech’. GDI cites its protection for ‘at risk groups and institutions,’ ranging from ‘women, persecuted minorities … the LGBTQ+ community … to scientific consensuses … on climate change … to … the judicial system’. Defending marginal groups blends with support for rational expertise and the protection of democratic processes. Disinformation is constructed as the antithesis to these linked, yet distinct, priorities (cf. citations of the ‘sanctity of democratic voting’ underpinning populist causes like Brexit; feminist protests about judicial gender bias). Democracy’s shifting connection to liberalism is echoed in its complex relationship with capitalism. GDI complains that ‘disinformation has become … a by-product of attention-driven business models,’ yet emphasises its capacity to provide ‘risk ratings’ preventing ‘brands … [from] appear[ing] next to toxic content’.
Linked to these conflations is a second set of problems, relating to the struggle to iron out the contradiction between liberal democracy’s dual belief in the power of rational, state-supported expertise to protect vulnerable citizens and in the spontaneous, participatory involvement of those citizens in its processes. EDL claims ‘active membership of … a passionate and vast community’, unifying ‘different stakeholders, from civil society to … the media ecosystem’. But it is difficult to reconcile alignment with passion-driven grassroots communities with a commitment to the very dispassionate, impartial analysis which liberal democracy hails as a guiding principle. ESTF celebrates its ‘advocacy campaign’ without explaining how this is compatible with scientific neutrality, or with its funding by the vast interstate entity that is the EU.
Indeed, CDUs’ ambiguous relationship with governments shapes their emphasis on transparency. GDI underscores its reliance on open-source information, permitting doubters to ‘check their working’, and provides FAQ lists phrased from the perspective of conspiracy-obsessed sceptics but designed to quell that scepticism (‘Who Funds GDI? Why does GDI work with governments?’). The fulsome response to the first question enumerates the government departments providing support, conveying honest openness, yet the response to the second question (‘Combatting disinformation is critical to protecting democracies’) is a non sequitur. Protecting democracy does not necessarily mean ‘working with governments’, as the UK politician Caroline Lucas (Green Party) learned when her FoI requests concerning COVID procurement policy prompted the government’s disinformation unit to label her a disinformation purveyor. The unit’s response that it merely ‘tracks narratives using publicly available information’ illustrates the instrumentalization of openness rhetoric for covert surveillance, and the distortive effects of indiscriminate big data tools tracking loosely defined ‘narratives’.
Other autonomous CDUs risk the very opposite of the transparency they promote by downplaying the political support they receive and indulging in the same obfuscation as ‘grey zone’ proxy organizations linked obscurely to autocratic states. EDL proudly vaunts its autonomy, stating that it accepts funding only from ‘sources that share our mission to … not become another mouthpiece for a ... partisan political agenda’, but reveals in a document downloadable from a link multiple clicks away from the home page that those sources include ‘private foundations and governments.’
Representing Disinformation/Catching Smoke
This leads us to a final set of problems: those concerning how CDUs represent disinformation. One noteworthy feature is the attachment to positivist observation and classification. CDUs develop elaborate typologies of observable forms of manipulation, presented as fixed, stable techniques, objectively measurable and available as ‘toolsets’ to their propagators. An ESTF report lists, among ‘250 … Tactics, Techniques and Procedures (TTPs)’, ‘creating fake accounts, building bot networks, amplifying conspiracy narratives, using fake experts … exploiting data voids … and spamouflaging.’ The commensurability of these activities is questionable, not least since regular social media users routinely mask their identities behind fictional avatars, and because amplifying conspiracy narratives has little in common with exploiting data voids. Neat taxonomies constructed from raw observation of seemingly unambiguous falsehoods fail to accommodate such differentiation.
A larger slippage is that between descriptive accounts of false information and normative rejections of objectionable ‘narratives’. This switch from exposing disinformation to rebutting propaganda is obscured by the tendency to use the terms in awkward collocations (propaganda-disinformation; disinformation-and-propaganda). The contradiction reveals itself in recurrent references to ‘false narratives’. Whilst facts must correspond to reality, ‘narratives’ necessarily involve the re-presentation of events to generate rhetorical and emotional effects, problematising efforts to designate them as entirely ‘true’ or ‘false’. Attenuating the judgement by replacing ‘false’ with ‘misleading’, a common CDU practice, produces category errors: any political narrative can be deemed ‘misleading’ by opponents for whom it produces the ‘wrong’ conclusions. GDI goes furthest in substituting fact-based approaches to disinformation with the lens of ‘adversarial narratives’. It acknowledges that:
Defining disinformation … feels akin to catching smoke … One common pitfall … is the distinction between … ‘misinformation’ and ‘disinformation’. If disinformation were as simple as someone intentionally lying … we’d be clamouring to moderate every mention of Santa Claus … At the same time, definitions that rely on true versus false dichotomies miss obvious examples of disinformation.
GDI presents its ‘adversarial narratives’ concept as the solution, describing them as:
rooted in … conflict … between actors and their interests, and especially between a social in-group and an out-group … Adversarial narratives are effective because they inflame social tensions by … amplifying perceived grievances.
However, politics is inherently antagonistic. Western politicians regularly accuse one another of ‘undermining the democratic process’ or ‘the rule of law’. Populist amplification of grievances and the creation of ‘out-groups’ are routine (as we know from the ‘culture wars’). GDI’s effort to mitigate the semantic spread bedevilling disinformation definitions merely exacerbates it.
A Pathway to Maturity?
None of the issues identified detract from the excellent work of CDUs. Some (GDI and DFRL especially) deserve further praise for their awareness of the conceptual difficulties plaguing them, and their commendable efforts to navigate them. Indeed, it is just such self-reflection that will enable the ‘industry’ to reach full maturity. This will facilitate the eventual development of a toolset less subject to the distortions, reversals and toxifications accompanying usage of everyday polemical terms like ‘disinformation’, ‘propaganda’, ‘fake news’ and ‘trolling’. We should also recognize, like Hannah Arendt in 1957, that distortions and even lies are as much the bread and butter of democratic politicians and their tabloid warriors as they are that of populists and authoritarians, that democracy is itself replete with contradiction, and that facts never exist in a vacuum but are invoked in specific situations within purposeful acts of communication.
Fact claims, meanwhile, must be differentiated from knowledge as the status such claims acquire when they receive validation through societal consensus, and from truth as the elusive, transcendent condition which accords knowledge ethical weight and permanence. What is needed is an apparatus both more pliable than the current one (the contexts of claims to facticity and to falsehood change, along with their meanings), and tighter (hate speech, political polemic, antagonistic narrative, false assertion, unfounded rumour cannot be elided). When this goal is met, CDUs can realise their true potential as bulwarks of the democratic project.