Protecting our elections against tech-enabled disinformation

The Strategist is running a short series of articles in the lead-up to ASPI’s Sydney Dialogue on September 2 and 3. The event will cover key topics in critical, emerging and cyber technologies, including disinformation, electoral interference, artificial intelligence, hybrid warfare, clean technologies and more.


In 1922, Mr Norman J. Trotter, from the Pappinbarra region of New South Wales, wrote to the then federal Treasurer, the Hon Earle Page MP, complaining that the general election was being held when there wasn’t a full moon.

Trotter pointed out the safety implications of transporting ballot boxes over mountainous roads, on horseback, with insufficient illumination!

A century on, electoral administrators around the world are dealing with a radically changed democratic landscape. Concerns about moonlight—or its absence—have been replaced by the pervasive presence of disinformation and false narratives, the rise of new technologies such as generative artificial intelligence, occasional madcap conspiracy theories, threats to electoral workers, and the need to maintain citizens’ confidence in electoral outcomes.

Together, these dramatic changes will demand the ongoing vigilance of legislators, regulators and civil society. Greater focus on, and resourcing of, this rapidly evolving space can harness the opportunities it presents while lessening the negative effects already being experienced.

The Australian Electoral Commission has been remarkably successful in maintaining the confidence of the Australian people: survey results show persistently high levels of trust in our operations, with nine out of 10 Australians expressing a high degree of satisfaction. That assurance is indispensable, because the democratic legitimacy of governments rests on trust in electoral outcomes—the foundation on which all other actions of democratic government rest.

Yet maintaining these results may become increasingly complex with the rapidly expanding use of new technologies and an ever-evolving information ecosystem.

The attempted manipulation of information isn’t new. In 1675, King Charles II tried to close London coffee houses because he was worried about false information being peddled in those places where people gathered to talk politics. Modern communications, including the ubiquitous use of mobile phones and social media platforms, have turbocharged the development and spread of information—both accurate and false. This has significantly affected all aspects of elections, from campaigning to the way they are conducted.

The relatively recent advent of generative AI heralds a potentially new epoch in electoral management. Globally, democracies are coming to terms with this technology, and jurisdictions are trying approaches ranging from outright bans, through mandatory declarations on messaging, to voluntary codes.

Regardless of the approach, democracies need to be aware that generative AI will have a significant impact on communications around elections. It will enable the generation of information—including disinformation—at a volume and velocity not seen previously and, perhaps even more troublingly, with a lack of verifiability that may make it hard for audiences to discern the truth of the information they are receiving.

Legislators and regulators need to be alert to the potential impact of these ‘three Vs’. Meanwhile, Australia’s regulatory framework needs to evolve to harness the benefits of new technology—including for democratic participation and political inclusion—while mitigating the potential negative impacts and protecting the right of citizens to express themselves freely.

Citizens’ electoral expectations have also changed dramatically. There was a time when the role of an electoral management body was simply to produce a statistically valid result. Such bodies must now also work to maintain trust by listening to the huge amount of feedback they get through social media and other channels—much of which reflects immediate feelings and does not necessarily take account of the legislation or resourcing realities by which an electoral body is bound. These bodies must swiftly respond to concerns and provide a constant stream of assurance about the electoral process.

The AEC has instituted several initiatives to manage these recent developments. We have developed a reputation management system, which outlines a range of strategies to ensure citizens can trust election results. This includes arguably the most active media and social media engagement in Australia’s public service and the operation of a disinformation register during electoral events. These activities, and many others, are supported by an AEC command centre that provides real-time data, oversight and connectivity to the manual election operation on a scale we have never had before.

We’ve also established a defending democracy unit that works with our partners across government and the social media platforms, and supports the operation of the multi-agency Electoral Integrity Assurance Taskforce.

AI-generated deepfakes—using audio, video, or a combination of both—have been used to sway public opinion in a growing number of elections overseas. In some cases, the use of AI has been clearly labelled; but in others, the material is presented as genuine. In extreme cases, voters can be steered toward or away from candidates—or even to avoid the polls altogether.

The next federal election is likely to be the first in Australia in which AI-generated political communication could be a prominent feature of the campaign. The net effect, some experts say, is a genuine threat to democracy, with a surge of AI deepfakes eroding the public’s trust in what it sees and hears.

The AEC is watching global developments closely and is working to ensure voters are not misled about the electoral process or about the role, capabilities and performance of the AEC.

We also look forward to the Australian Parliament grappling with this issue and producing national legislation to help regulate the use of this new technology. Education—specifically digital media literacy—will be fundamental to supporting voters and protecting elections.

Despite the wave of change, the actual process of voting remains reassuringly the same as it was for the very first federal election in 1901. Australians use a pencil—or a pen, if they choose—to mark their paper and put it in the ballot box. Those votes still need to be transported to be counted, and they are counted by citizens working in a temporary capacity with the AEC, in the presence of party scrutineers. The results are published and certified by the electoral authority. Of course, there are some advances, such as postal voting and pre-poll voting, as well as telephone voting for blind and low-vision voters, but the core process remains largely unchanged. (As an aside, Mr Trotter would be pleased that advances in electrification mean moonlight is no longer a key concern.)

The AEC is very clear on its role in administering elections and maintaining citizens’ trust. We never have been, and never will be—unless directed to do so by Parliament—involved in ascertaining the truth or otherwise of statements made by candidates and parties.

Rather, we focus on protecting the integrity of the electoral system and ensuring citizens have the information they need to participate in the process.

This is becoming a more complex and challenging task. It is one that needs the active commitment and attention of every Australian to ensure trust and confidence in our elections remain strong.