AI Image Generators Reinforce Colonial Bias
The Rise of AI and Its Hidden Biases
Generative AI has fundamentally changed our visual landscape. Powerful tools like Midjourney, DALL-E, and Sora can create anything from photorealistic images to classical paintings from a simple text command. These images flood social media, often making it difficult to tell what's real and what's AI-generated. While this technology is impressive, it comes with significant social baggage.
It's well-documented that generative AI models, trained on vast datasets from the internet, often reproduce societal biases. Studies have shown they can perpetuate sexist and racist stereotypes, like associating pilots with men or criminals with people of color. However, a new layer of this problem is now coming to light: a deep-seated colonial bias.
A Colonial Lens on History
My new research reveals that generative AI carries a distinct colonial viewpoint. When prompted to generate images of Aotearoa New Zealand’s past, OpenAI's Sora consistently favors a European settler perspective. The AI portrays pre-colonial lands as empty wilderness, Captain Cook as a heroic civilizer, and the Indigenous Māori people as static, peripheral figures.
These depictions are not harmless. As AI becomes a more integral part of our communication, these images normalize myths of a benevolent colonial past. This undermines contemporary Māori claims to political sovereignty, cultural revitalization, and historical redress.
Testing the AI's Historical Memory
To see how AI visualizes history, I gave Sora open-ended prompts to create scenes from Aotearoa New Zealand's past, focusing on the 1700s to the 1860s. This method reveals the model's default assumptions. Despite the probabilistic nature of these models, the results were strikingly consistent. Two examples clearly illustrate the recurring patterns.
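The probing method described here can be sketched as a simple prompt battery: one open-ended, unsteered prompt per decade across the period studied. This is an illustrative reconstruction, not the author's actual code; the function name and decade step are my own assumptions, and the model call itself is left as a stub.

```python
# Sketch of the open-ended prompting method: build a battery of
# era-spanning prompts with no stylistic steering, to surface a
# text-to-image model's default assumptions. Illustrative only.

def build_prompt_battery(place: str, start: int, end: int, step: int = 10) -> list[str]:
    """Return one open-ended prompt per decade, e.g. 'New Zealand in the 1700s'."""
    return [f"{place} in the {decade}s" for decade in range(start, end + 1, step)]

prompts = build_prompt_battery("New Zealand", 1700, 1860)
# Each prompt would then be submitted to the image model (e.g. via an
# API) and the outputs coded by hand for recurring visual tropes.
```

The point of keeping the prompts open-ended is that any consistent iconography in the outputs reflects the model's learned defaults rather than instructions from the researcher.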
When prompted with “New Zealand in the 1700s,” Sora created a romanticized landscape of a forested valley in golden light, with Māori figures presented as ornamental details. The image lacks any sign of cultivation or settlement, like food plantations or pā fortifications, instead presenting a wilderness waiting to be discovered. This visual style directly echoes 19th-century colonial painters like John Gully, whose work promoted the idea of terra nullius—empty land—to justify colonization.

For the prompt “a Māori in the 1860s,” Sora defaulted to a sepia-toned studio portrait of a dignified man in a cloak against a plain backdrop. This is a clear echo of the cartes de visite photographs of the era, which were staged by European photographers to create a specific image of the “authentic native.” This format completely ignores the reality of the 1860s, a period of intense armed and political resistance by Māori against colonial forces.

Why AI Recycles Colonial Tropes
Visuals have always been a powerful tool for legitimizing colonization. In recent decades, this colonial visual history has been challenged through actions like the removal of statues and the revision of museum exhibitions. However, these old images persist in digital archives and online collections, often stripped of critical context.
Though AI companies don't disclose their exact training data, it is almost certain that these digital archives are part of what models like Sora learn from. In effect, the AI recycles these sources, breathing new life into the visual conventions of the British Empire.
By portraying colonization as a peaceful affair and Māori as passive figures, these AI visions diminish the urgency of ongoing Māori struggles for self-determination (tino rangatiratanga) and sovereignty (mana motuhake).

The Path Forward: AI Literacy
Around the world, communities are working to decolonize AI by developing frameworks that prioritize Indigenous data sovereignty. However, visual AI presents a unique challenge, as it shapes our very perception of history and identity.
Technical solutions have their limits. Expanding datasets with Māori-curated archives is a good step, but it must be governed by Indigenous data principles. Adjusting algorithms to ensure “fair” representation is a political challenge, not just a technical one. Content filters might block the worst outputs, but they risk erasing important historical truths, including colonial violence.
Ultimately, the most effective solution is promoting AI literacy. We must all learn how these systems operate, what data they consume, and how to interact with them critically. When approached with creativity and awareness—as some social media users are already demonstrating—AI can be used not just to recycle colonial tropes, but to re-imagine and re-see the past from diverse and Indigenous perspectives.