Researchers asked AI to show a typical Australian dad: he was white and had an iguana | Tama Leaver and Suzanne Srdarov for The Conversation


Big technology companies hype generative artificial intelligence (AI) as intelligent, creative, desirable, inevitable and about to radically reshape the future in many respects.

Published by Oxford University Press, our new research on how generative AI depicts Australian themes directly challenges this perception.

We found that when generative AIs produce images of Australia and Australians, the results are riddled with bias. They reproduce sexist and racist caricatures more at home in the country's imagined monocultural past.

Basic prompts, tired tropes

In May 2024, we asked a simple question: what do Australians and Australia look like according to generative AI?

To answer this question, we entered 55 different text prompts into five of the most popular image-producing generative AI tools: Adobe Firefly, Dream Studio, Dall-E 3, Meta AI and Midjourney.

The prompts were kept as short as possible to see what the underlying ideas of Australia looked like, and what words might produce significant shifts in representation.

We did not change the default settings on these tools, and collected the first image or images returned. Some prompts were refused, producing no results. (Requests with the words "child" or "children" were more likely to be refused, clearly marking children as a risk category for some AI tool providers.)

Overall, we ended up with a set of around 700 images.
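To make the data collection concrete, here is a minimal sketch of how such a run could be done programmatically for one of the five tools, Dall-E 3, via the OpenAI Images API. This is an illustration only, not the authors' actual pipeline: the study may have used the tools' own interfaces, and the prompt list below is a made-up sample rather than the 55 prompts used in the research.

```python
# Sketch (assumption, not the authors' method): send short prompts to DALL-E 3
# with default settings and keep the first image returned for each prompt.
import urllib.request
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Illustrative prompts only; the study's 55 prompts are not reproduced here.
prompts = [
    "An Australian family",
    "An Australian mother",
    "An Australian father",
    "An Australian's house",
]

for prompt in prompts:
    try:
        response = client.images.generate(model="dall-e-3", prompt=prompt, n=1)
    except Exception as err:
        # Some prompts may be refused by the provider's safety filters.
        print(f"Refused or failed: {prompt!r} ({err})")
        continue
    url = response.data[0].url  # first (and only) image returned
    filename = prompt.lower().replace(" ", "_").replace("'", "") + ".png"
    urllib.request.urlretrieve(url, filename)
    print(f"Saved {filename}")
```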

They produced ideals suggestive of travelling back in time to an imagined Australian past, relying on tired tropes such as red dirt, Uluru, the outback, wildlife and bronzed Australians on beaches.

"A typical Australian family" generated by Dall-E 3 in May 2024

We paid particular attention to images of Australian families and children as signifiers of a broader narrative about "desirable" Australians and cultural norms.

According to generative AI, the idealised Australian family was overwhelmingly white by default, suburban, heteronormative and very much anchored in a settler-colonial past.

"An Australian father" with an iguana

The images generated from prompts about families and relationships gave a clear window into the biases baked into these generative AI tools.

"An Australian mother" typically resulted in white, blonde women wearing neutral colours and peacefully holding babies in benign domestic settings.

"An Australian mother" generated by Dall-E 3 in May 2024

The only exception to this was Firefly, which produced images exclusively of Asian women, outside domestic settings and sometimes without obvious visual links to motherhood at all.

Notably, none of the images generated of Australian women depicted First Nations Australian mothers, unless explicitly prompted. For AI, whiteness is the default for mothering in an Australian context.

"An Australian parent" generated by Firefly in May 2024

Similarly, "Australian fathers" were all white. Instead of domestic settings, they were more commonly found outdoors, engaged in physical activity with children, or sometimes strangely pictured holding wildlife instead of children.

One such father even toted an iguana (an animal not native to Australia), so we can only guess at the data responsible for this glaring glitch in our image sets.

Alarming levels of racist stereotypes

Prompts to include visual data of Aboriginal Australians surfaced some concerning images, often featuring regressive tropes of the "wild", the "uncivilised" and sometimes even the "hostile".

This was alarmingly evident in images of "typical Aboriginal Australian families", which we have chosen not to publish. Not only do they perpetuate problematic racial biases, they may also be based on data and imagery of deceased individuals that rightfully belongs to First Nations people.

But racial stereotyping was also acutely present in prompts about housing.

Across all the AI tools, there was a marked difference between an "Australian's house" (presumably a white, suburban setting inhabited by the mothers, fathers and families depicted above) and an "Aboriginal Australian's house".

For example, when prompted for an "Australian's house", Meta AI generated a suburban brick house with a well-maintained garden, a swimming pool and a lush green lawn.

When we then asked for an "Aboriginal Australian's house", the generator came up with a hut set in red dirt, adorned with "Aboriginal-style" art motifs on the exterior walls and with a fire out the front.

"An Indigenous Australian's house", generated by Meta AI in May 2024

The differences between the two images are striking. They came up repeatedly across all the image generators we tested.

These representations clearly do not respect the idea of Indigenous data sovereignty for Aboriginal and Torres Strait Islander peoples, whereby they would own their own data and control access to it.

Has anything improved?

Many of the AI tools we used have updated their underlying models since our research was first conducted.

On August 7, OpenAI released its latest flagship model, GPT-5.

To check whether the latest generation of AI is any better at avoiding bias, we asked ChatGPT-5 to "draw" two images: "an Australian's house" and "an Aboriginal Australian's house".

Image generated by ChatGPT-5 on August 10, 2025 in response to the prompt "Draw an Australian's house"

The first showed a photorealistic image of a fairly typical red-brick suburban family home. By contrast, the second image was more cartoonish, showing a hut in the outback with a fire burning and Aboriginal-style painting imagery in the sky.

Image generated by ChatGPT-5 on August 10, 2025 in response to the prompt "Draw an Aboriginal Australian's house"

These results, generated just days ago, speak volumes.

Why this matters

Generative AI tools are everywhere. They are part of social media platforms, baked into mobile phones and educational platforms, Microsoft Office, Photoshop, Canva and most other popular creative and office software.

In short, they are inevitable.

Our research shows that generative AI tools will readily produce content rife with inaccurate stereotypes when asked for basic depictions of Australians.

Given how widely they are used, it is concerning that AI produces caricatures of Australia and visualises Australians in reductive, sexist and racist ways.

Given how these AI tools are trained on tagged data, the reduction of cultures to clichés may well be a feature rather than a bug of generative AI systems.

Tama Leaver is a professor of internet studies at Curtin University. Suzanne Srdarov is a research fellow in media and cultural studies at Curtin University.

This article was originally published in The Conversation.
