The US election will take place in the most polluted and degraded information space in living memory. Alarmist fears, voiced at the start of the year, of an AI-generated apocalypse have proved misplaced.
But the trickle of generative AI-enabled spam, deepfakes and memes has added fuel to longer-term trends: personalized digital political advertising, the disintegration of local news media and the vulnerability of digital democratic infrastructure to misuse and abuse. Innovations in information integrity – like Twitter/X’s Community Notes feature – are few and far between.
The US election will take place under a pall, as the people and organizations that have historically drawn attention to the threat of disinformation find themselves out in the cold, facing lawfare and harassment, their access to data curtailed and their funding cut.
This is a relatively recent change. Although at times uneasy, governments, technology platforms and civil society previously remained on speaking terms as they jostled over the shape, rules and values of online life. It may have been a David and Goliath struggle at times, but it wasn’t always hostile, and at least David showed up.
Journalists, academics and civil society organizations worked to fill the accountability vacuum left by slow-moving regulation and laissez-faire governance rooted in Western governments’ hesitation to interfere with media or personal freedoms.
The work of this ‘accountability community’ drew attention to where democratic and political processes, institutions and norms were being eroded by the move online – sometimes even before the companies stewarding the technology were themselves aware. This included explosive revelations of attempts to use tech platforms to undermine democratic processes in the US – like the 2018 Cambridge Analytica scandal.
For years, stories like these and the organizations behind them were (sometimes grudgingly) welcomed by both industry and policymakers. But with sixty days to go until the US election, that friendship feels over, and the accountability community has found itself fighting a guerrilla war: always underfunded, always under-resourced, but now also in conflict with technology platforms – and elements of the polarized US political scene.
This is unsettling in the context of new allegations of a Russian-backed disinformation campaign intended to influence November’s election; a suspected Iranian hack of the Trump campaign; and as both US parties experiment with the use of AI in their online communications.
Going dark
Access to data – a critical plank in understanding what is happening in an online space – is dwindling.
It was this data that revealed Russian-aligned operatives allegedly using digital platforms to incite social conflict in the run-up to the 2016 election.
But Facebook restricted access to data about groups and pages on the site in 2018, directing researchers instead to a bespoke tool called CrowdTangle. CrowdTangle was itself shut down last month.
Meanwhile, following Elon Musk’s acquisition of X/Twitter in 2022, large-scale data access – once free to researchers – was quickly made so expensive that researchers reported being priced out. Reddit made a similar move last year. These spaces – critical pieces of Western political infrastructure – are going dark.
Disinformation research becomes political
Over a similar period ‘fake news’ entered the US lexicon, and disinformation research became politically charged. Researchers – even those focusing on foreign state interference – have increasingly been accused of setting out to silence domestic political dissent.
In March last year, US Rep. Jim Jordan’s Subcommittee on the Weaponization of the Federal Government issued letters to American universities requesting information on anyone supporting a ‘censorship regime’ through ‘advising on so-called “misinformation”’.
The mainstreaming of conspiracy theories targeting the funders of platform accountability work – notably the Open Society Foundations – has been amplified in Congress.
Even research into health communications has been put on ice, as a result of the politicization of research into vaccine efficacy.
Lawfare
Then came the lawsuits. X sued the UK-based Center for Countering Digital Hate after the Center published research claiming a rise in hate speech on the platform. The case was thrown out, but a similar lawsuit in the US, targeting non-profit watchdog Media Matters, is scheduled for next year.
Other lawsuits in the US have already hit their mark: the Global Alliance for Responsible Media (GARM), a non-profit advertising group, was closed by its parent organization after X filed suit accusing it of conspiring to withhold advertising revenue.
The chilling effects of this backlash are obvious. Disinformation researchers are today more wary of speaking out. ‘Academics, universities and government agencies’, reports the Washington Post, ‘are overhauling or ending research programs designed to counter the spread of online misinformation amid a legal campaign from conservative politicians and activists who accuse them of colluding with tech companies to censor right-wing views.’
At its most effective, counter-disinformation work brought together industry, government and civil society with a common purpose, recognizing that the rapid digitization of democracies had introduced serious vulnerabilities, and that patching them was in everyone’s interest. As we approach the climax of this year of elections, the world is a long way from that high point.
Reversing this sudden backlash is important, though it will come too late for the US election in November.
The work of the accountability community is crucial to monitor how the online world is shaping democracies. It is essential that philanthropic support for investigative journalism on disinformation continues.
Better protections for whistleblowers, and government proposals to restrict lawfare, are essential to preserve the integrity of the election – and to protect public trust in the result. Investment in alternative political infrastructure is also a moonshot worth chasing.
Source: Chatham House