
How are foreign actors using AI to influence the US election?

October 28, 2024

 A still from the AI-enhanced video of Kamala Harris created by Russian influence actors. 

Tabitha von Kaufmann, Research Assistant at the AI Faith and Civil Society Commission

This is the final article in a series of three exploring the role of AI in the democratic process. This article focuses on the use of AI by foreign actors.

With the US election fast approaching, the fight for votes on both sides is fiercer than ever. Who has the best policies? Who is more trustworthy? Who will lead the country towards a brighter future? These questions occupy the thoughts of voters and of many spectators around the world. Yet there is another force at play in the US election, one which is potentially reshaping how US citizens cast their votes through the distortion of truth: AI technology. Indeed, over the past few months, there have been countless occasions on which various actors have used AI to manipulate information and steer votes one way or another.

AI Manipulation by Foreign Actors  

A report from the Microsoft Threat Analysis Center (MTAC), released on October 23, highlights ongoing efforts by Russia, Iran and China to influence US citizens ahead of the upcoming US election, with a notable use of AI in these attempts.

In September 2024, a US intelligence official reported that Russia has generated more AI content than any other foreign power in order to sway the election outcome, specifically to bolster support for Trump over Harris. For example, Russia has produced and disseminated a number of manipulated videos of Harris’s speeches which paint her in a bad light. MTAC reported that in mid-September, Russian-language accounts on X and Telegram shared an AI-enhanced video in which Harris appears to make an insensitive comment about the assassination attempts against Trump, stating that he refused to "even die with dignity." This video gained tens of thousands of views. Two other videos spread by apparently Russian sources falsely suggested that Harris was involved in illegal poaching in Zambia and spread disinformation about vice-presidential nominee Tim Walz. Alongside these videos, the Department of Homeland Security reports that Russia has used generative AI to create fake websites that appeared to be authentic US-based media outlets.

China and Iran have also produced AI-generated content that similarly pushes specific narratives on divisive issues. Iran has focused its content on the war in Gaza, while China has targeted topics such as drug use, abortion and immigration, and has amplified accusations of antisemitism and corruption against down-ballot Republican candidates.

It is important to note, as Microsoft makes clear, that traditional influence tactics such as deceptive editing, spoofs and staged videos are still used far more frequently by foreign actors than AI-generated content, and tend to have a much greater impact. Microsoft notes that ‘AI usage will be a small subset of a much wider swath of digital manipulations thrust onto audiences in the final days of the election.’ Nevertheless, the impact of a small subset is not to be underestimated, particularly given that ‘during times of crisis, conflict, and competition, manipulated images, audio, and video often travel further and faster across audiences than during an average news cycle.’ It is therefore important to remain alert to the possibility of false AI-generated media being distributed in the final days before the election. Furthermore, while AI may not yet be the principal method of influence in election content produced by foreign actors, this may well change in future election campaigns, both in the US and worldwide, as AI continues to grow more sophisticated.

Direct Implications  

Influencing the way people vote  

First and foremost, false AI-generated media may change the way people choose to vote. This is a particular threat to swing voters (people who make up their minds in the final days before an election), who may base their decision largely on what they see on social media. Furthermore, targeted campaigns and AI algorithms can significantly reduce the diversity of perspectives that individuals are exposed to.

Eroding Public Trust and Objective Truth  

Perhaps the more fundamental implication of AI election media is that its continued use will lead voters to lose trust in anything or anyone. Indeed, this already appears to be the case: the Pew Research Center reports that “57% of US adults say they are very concerned that people or organisations seeking to influence the election will use AI to create and distribute fake or misleading information about candidates and campaigns.” In time, people may opt not to vote at all, owing to a lack of trust in politicians and an inability to discern objective truth amidst the multitude of false media. Ultimately, AI severely threatens the integrity of political discourse.

Impact on human values

Truth and Reality – HIGH RISK

The authenticity of election materials is compromised, making it difficult to discern real information from misinformation.

Authentic Relationships – HIGH RISK

Public trust in government diminishes, leading to a breakdown in relationships and confidence in leadership.

Moral Autonomy – HIGH RISK

Voters may be unduly influenced by AI in their electoral choices. In the case of deepfakes, individuals are made to appear to publicise or promote standpoints different from their own, which strips them of their autonomy.

Dignity of Work – MEDIUM RISK

The work of governing a nation is undermined, since leaders have no control over the lies and narratives spread about them.

Cognition and Creativity – MEDIUM RISK

Voter decisions may be swayed by AI algorithms rather than through independent research and critical analysis of political parties.

Privacy and Freedom – LOW RISK

 

Policy Recommendations  

Ban on Deceptive AI Practices: Implement legal restrictions on the use of AI to impersonate real individuals (e.g., deepfake voices or images) or spread misinformation. Fines could be imposed on campaigns or individuals using AI to mislead voters.

Pledge for Ethical AI Use by Candidates: Political candidates and parties should be encouraged or required to sign a public pledge that they will not use AI in ways that distort truth or manipulate voters.  

Fact-Checking Collaboration: Fund non-partisan, independent fact-checking organisations to work with government agencies, tech companies, and media outlets to provide real-time checks on AI-generated political content.

Global AI Ethics Committee for Elections: Considering that the manipulative use of AI in elections often arises in a trans-national context, countries could collectively create an international body that monitors AI's role in political processes globally, working with countries to improve regulations and share best practices for preventing AI-driven election manipulation.

References

Microsoft Elections Report 5 on Russian Influence

As the U.S. election nears, Russia, Iran and China step up influence efforts - Microsoft On the Issues

Microsoft warns of growing US election cyber interference | TechRadar

The US elections are under attack from hostile countries – and more is coming, Microsoft says | The Independent

Russia, Iran, China expected to use AI to try to influence US election, report says | Reuters

AI in the 2024 election: Most Republicans, Democrats are concerned | Pew Research Center

How AI Is Being Used to Influence the 2024 Election (nymag.com)

AI's growing influence: How election integrity is at risk worldwide (techwireasia.com)

AI will change American elections, but not in the obvious way (economist.com)

Russia produced most AI content to sway presidential vote, US intelligence official says | Reuters

How Russia is using AI for its election influence efforts : NPR
