In the lead-up to our recent federal election, a largely automated pro-Russian news website has been targeting Australia in an influence operation, attempting to “poison” AI chatbots with propaganda, according to a report by ABC News.

The operation is reportedly part of a longer-term plan to deepen division among Australians and sow disharmony in our community.


Pravda Australia presents itself as a news site, but analysts allege it is part of an ongoing effort to retrain Western chatbots such as ChatGPT, Google's Gemini and Microsoft's Copilot on "the Russian perspective", with the aim of widening social divisions by spreading so-called "Russian narratives".

It's one of roughly 180 largely automated websites in the global Pravda Network allegedly designed to "launder" disinformation and pro-Kremlin propaganda for AI models to consume and repeat back to Western users.

This comes as academics and others warn that this election has been a test of how AI-driven information shapes Australia’s democracy.

Australia is one of the most online electorates in the world: 99 per cent of voters have internet access and about 80 per cent use social media. AI-driven algorithms are now shaping what we see and learn about our elections.

A report by Cory Alpert, a PhD candidate at the University of Melbourne, published in March this year, argues that AI has played a part in every election since the advent of social media platforms.

We are all very aware that mainstream media outlets no longer define the public conversation, in a world where millions of individuals, influencers and political players can create and share their own versions of events.

The stories that tend to go viral are often those that provoke strong emotions such as anger, fear or anxiety. As a result, researchers are concerned about the consequences of stories spreading widely without necessarily being true. Cory Alpert says one of the biggest challenges of the AI era is that truth has become fragmented.

He says that algorithms only exacerbate this fragmentation of common understanding. “They personalise feeds, isolate individuals in ideological bubbles,” he said. “They amplify misinformation and disinformation, making it difficult to establish a common set of facts. And they erode a fundamental democratic principle: the ability to engage in a shared reality.”

He will be travelling through regional Australia, talking with people, listening to their thoughts on AI and learning from their experiences. Hopefully he will report back on the state of Australia’s democracy in 2025 and tell us more about how people were influenced by AI-generated images and messages during the election campaign.