Australian Outlook

In this section

The Impact of Disinformation: Contrasting Lessons from the UK 

03 Oct 2024
By Dr Thomas Colley
Nigel Farage MP (Clacton, Reform UK). Source: UK Parliament / https://t.ly/ulxBi

Recent protests in the UK during this summer's general election and beyond reveal an ongoing battle against disinformation. While social media plays a role in its dissemination, it is how elites engage with disinformation that truly matters.

2024 has often been framed as a pivotal year for democracy: a year in which over half the world's population will have had the chance to vote in national elections. But rather than celebrating the fact that more people will have gone to the polls this year than ever before, many observers have been preoccupied with anxiety. Specifically, that disinformation, turbo-charged by generative artificial intelligence (AI), will make it impossible for citizens to make informed choices, and that would-be authoritarians will abuse these tools to seize or retain power.

With three months to go, the picture for democracy may not be as bleak as forecast (perhaps depending on what happens in Washington in November). In Taiwan's election in January, China engaged in an extensive disinformation campaign to discredit the eventual winner, William Lai, which included disseminating AI-generated audio deep-fakes. In April, thousands of Indian citizens reported receiving deep-fake videos from local MPs of Narendra Modi's Bharatiya Janata Party (BJP), addressing them individually with personalised messages and promising to resolve specific local issues. In neither case, however, did disinformation obviously shape the outcome. Lai did better than polls expected, and the BJP did far worse, losing its parliamentary majority.

The United Kingdom (UK), whose general election was held on 4 July, also saw anxiety about disinformation. But these concerns were largely unfounded. The most prominent accusations of disinformation in the campaign's first half were relatively benign by recent standards—that both Keir Starmer's victorious Labour Party and Rishi Sunak's Conservative Party had used statistics misleadingly to exaggerate the benefits of their economic plans and the costs of their opponents'. In the campaign's second half, more examples of disinformation emerged. A deep-fake video of Labour politician Wes Streeting swearing at another person about the war in Gaza received a million views. Another deep-fake, of Nigel Farage, leader of the right-wing populist Reform UK party, went viral: it posed him as a gaming live-streamer playing Minecraft online. In the video, which was clearly a parody, he had supposedly logged into Prime Minister Rishi Sunak's server and was about to blow up his house. In both cases citizens had no problem identifying the fakes and dismissed them easily. The Alan Turing Institute later documented only 16 examples of viral disinformation in the six-week campaign—an astonishingly low figure given that several world leaders are documented as having exceeded this on a daily basis. One lesson to take from this is that while disinformation may become easier to produce, it will not necessarily become more prominent or impactful in future elections.

It was a few weeks later that the UK learned a different lesson about disinformation's impact. On 29 July, Axel Rudakubana, a British-Rwandan born in Cardiff and raised in the UK, walked into a children's centre in Southport, near Liverpool, and stabbed 13 young children during a Taylor Swift-themed dance class, killing three. As news broke about the attack, observers clamoured to learn who was responsible. And yet, because Rudakubana was seventeen and thus not yet an adult, the police delayed naming him for legal reasons. But adhering to the law violated a key rule of politics: leave an information vacuum, and others will rush to fill it. Within hours, anti-immigrant politicians and influencers, mostly from the political right, spread conspiracy theories that this was yet another jihadist terrorist attack. Citizens and influencers spread a false claim from the little-known news website Channel3 Now that the perpetrator was a Muslim asylum seeker called Ali al-Shakati, who had been on an MI6 watch-list, and that the attack demonstrated once more why Muslims—and immigrants more generally—are a threat to British society. This time, Reform UK's Nigel Farage amplified the spread of rumours by speculating to his millions of online followers that "the truth is being withheld from us." Far-right influencers online, such as Tommy Robinson (whose real name is Stephen Yaxley-Lennon), called people to the streets, claiming that the public were being manipulated and that immigration was destroying the country. Former television actor Laurence Fox said that "We need to permanently remove Islam from Great Britain." The wife of a Conservative Party councillor – later convicted of inciting racial hatred – tweeted "mass deportation now, set fire to all the f****** hotels full of the bastards for all I care… If that makes me racist, so be it."

The social media maelstrom quickly spilled offline as riots spread across UK cities over the following nights—Southport, Birmingham, Liverpool, Manchester, Leeds, Newcastle, and London. Crowds of rioters attacked mosques and asylum-seeker hotels, looted shops, and torched cars. Many fought the police, whom some on the political right accused of "two-tier policing" for responding to far-right protests more aggressively than to pro-Palestine or Black Lives Matter protests. Over 500 people were arrested; over 50 police officers were injured. With the new Labour government keen to demonstrate its law and order credentials, dozens were swiftly convicted. Multi-year prison sentences were handed down not just to those who engaged in violence, but also to individuals who had incited it on social media.

As summer in Britain ends, what can we learn from these contrasting events about disinformation and its potential impact? The temptation is to blame social media as a key catalyst of the Southport riots. But social media was no less prominent a communication medium during the election campaign, where there was minimal disinformation. Conversely, generative AI played a minimal role in the Southport riots, which were driven by anti-immigrant, Islamophobic narratives that have circulated globally for over a decade. In fact, the main difference between the election and the riots was how political elites, and leading influencers, responded to them.

In the election, there were few examples of disinformation, and political elites ignored them. The Conservatives did not, as Donald Trump did in 2020, claim they would be defrauded of victory. All major parties stuck to relatively traditional campaigns, competing mainly over who would manage the economy more effectively. When disinformation emerged, they mainly ignored or disavowed it, rather than trying to use it for political gain. "Of course" the video was fake, Nigel Farage's spokesperson said of his Minecraft livestream video, mindful that Reform's apparent priority in the election was to present itself as a serious alternative to the long-dominant Conservative Party. But in the Southport riots, an information vacuum was ruthlessly exploited by political elites and influencers, creating a disinformation firestorm which helped ignite the worst riots Britain had experienced for years.

In this respect, it is not the medium that matters, but the decisions of those who use it. During COVID-19 some leaders spread disinformation at political rallies, on television, on radio, in newspapers, and at press conferences—and in combination these shaped public responses to the disease. But what matters is that they—and their followers—chose to amplify disinformation at all. Social media is dominated by a small number of influencers, groups, and political elites. The vast majority of users are "lurkers" who do not comment. Those who shout the loudest therefore have greater impact—as they do on any mass medium.

As the attention of disinformation watchers turns to November’s US election, the lesson from the UK’s summer of 2024 is simple. Disinformation’s effects will vary in different contexts, but its impact will largely depend on how people respond to it: especially political elites, influencers, and media outlets, but also everyday citizens. Apply this to the rest of 2024’s elections and beyond, and one can begin to anticipate its likely impact.

Dr Thomas Colley is Senior Visiting Research Fellow in the Department of War Studies at King’s College London and Senior Lecturer in Defence and International Affairs at the Royal Military Academy, Sandhurst. His research interests include propaganda, disinformation and their use in war and international politics. 

This article is published under a Creative Commons Licence and may be republished with attribution.