Digital Blackface and the Theft of Aboriginal Identity: Why AI Appropriation Is Colonisation 2.0

Posted on Feb 09, 2026
By Jessica Staines

A critical examination of how seemingly 'harmless' AI content perpetuates violence against First Nations people.

Let me be crystal clear from the outset: creating an AI-generated Aboriginal person to profit from 'educational' wildlife content isn't innovative, it's not clever, and it certainly isn't harmless. It's digital blackface. It's theft. And it's yet another manifestation of the colonisation that First Nations people have endured for over 250 years, only now, it's wrapped in the shiny veneer of artificial intelligence and justified with the tired excuse of 'education and awareness.'

When Keagan John Mason, a South African man living in New Zealand, created 'Jarren', a fictional Aboriginal character with no mob, no Country, no kinship, and no accountability to any First Nations community, he didn't just create content. He created a caricature that performs Aboriginality for an audience of nearly 200,000 followers who are none the wiser.

And here's the kicker: many of them think 'Jarren' is real.

So let's unpack exactly why this matters, why it's dangerous, and why the seemingly innocuous act of generating an Aboriginal avatar to talk about wedge-tailed eagles is actually part of a much larger, much more insidious pattern of erasure and violence.


No Mob, No Country: Why Aboriginal Identity Cannot Be Generated


Aboriginal identity is not a costume you can put on. It's not a voice you can mimic. And it certainly isn't something you can synthesise through AI prompts and digital rendering.

As Dr Terri Janke, a Wuthathi, Yadhaigana and Meriam lawyer and international authority on Indigenous Cultural and Intellectual Property, has emphasised, AI has no Dreaming, no kinship, no connection to Country, and no cultural obligation. It cannot respect who is permitted to share certain stories, images, or language. It doesn't know cultural protocol. It doesn't understand the weight of representation or the responsibility that comes with speaking about or from an Aboriginal perspective.

Aboriginal identity is relational. It's built on kinship systems that stretch back tens of thousands of years. You belong to mob. You belong to Country. You have Elders, aunties, uncles, cousins. You have responsibilities, cultural, spiritual, and communal. You are accountable.

'Jarren' has none of this. He belongs to no nation. He has no biological descent. No community claims him. He is, quite literally, nobody, a digital ghost performing Aboriginality for clicks and profit.

And that's not just problematic. It's erasure.


Cultural Flattening and the 'Palatable' Aboriginal

Let's talk about what Bush Legend actually does. The account shares videos of 'Jarren' encountering Australian wildlife: snakes, crocodiles, eagles, all set to the sounds of yidakis (didgeridoos) and percussion instruments. Early iterations even showed the character adorned with white body paint mimicking ochre and wearing beaded necklaces.

This is what Dr Tamika Worrell, a Kamilaroi woman and senior lecturer in critical Indigenous studies at Macquarie University, calls cultural flattening. AI-generated content like this defaults to sharing the 'palatable' or 'comfortable' aspects of Indigenous cultural knowledge and experience, rather than the complex, nuanced, and often uncomfortable reality.

It gives audiences the aesthetic of Aboriginality, the visual and auditory markers that signal 'Indigenous', without any of the substance, the struggle, the sovereignty, or the truth. It's Aboriginality sanitised for mass consumption. It's the version that doesn't challenge anyone, doesn't make anyone uncomfortable, and certainly doesn't require anyone to reckon with the ongoing impacts of colonisation.

And here's the truly insidious part: by creating this 'safe' version of an Aboriginal person, it sets a benchmark. It says, this is what an Aboriginal person should be, friendly, accessible, non-threatening, focused on nature and animals, unburdened by politics or truth-telling. It creates a false standard against which real Aboriginal people are measured and often found wanting.

When real Aboriginal people speak about land rights, about deaths in custody, about the Stolen Generations, about the racist violence we experience daily, they're seen as 'too political,' 'too angry,' 'too divisive.' But 'Jarren'? He's just a 'deadly' bloke teaching people about wildlife. He's the 'Black Steve Irwin.' He's likeable.

That's not representation. That's replacement.


Theft of Cultural and Intellectual Property: The Economics of Erasure

Let's get into the dollars and cents, because this isn't just about cultural harm, though that alone should be enough. This is also about economic exploitation.

Cultural intellectual property pertains to the rights of Indigenous and local communities over their traditions, knowledge, and cultural expressions. Unlike copyright, which protects individual creators, cultural intellectual property is community-owned and passed down through generations. It's about safeguarding cultural heritage and ensuring respect for communal identities.

When Mason uses AI to generate an Aboriginal avatar, adorned with cultural markers like ochre and beaded necklaces, set to the sounds of traditional instruments, he is appropriating cultural intellectual property without consent, without community involvement, and without accountability.

As Dr Janke notes, this is 'theft that is very insidious in that it also involves a cultural harm.' The images, the sounds, the aesthetic, these are drawn from Aboriginal culture. But there is no relationship with any Aboriginal community. No protocols followed. No permissions sought. No benefit returned.

Bush Legend has monetised this content. Early posts encouraged followers to subscribe for $2.99 per month. The page has nearly 200,000 followers across platforms. That's a significant audience, and a significant revenue stream, built on the back of a stolen identity.

And here's what really stings: while Mason profits from performing Aboriginality through AI, real Aboriginal rangers, storytellers, and educators struggle for visibility and funding. There is a vast network of Aboriginal rangers doing extraordinary work on Country, protecting endangered species, managing land, sharing knowledge. But they don't have six-figure follower counts. They don't have viral videos.

AI-generated content like Bush Legend takes space, attention, and resources away from authentic Aboriginal voices. It creates competition where there should be collaboration. It diverts audiences who could be learning from real Aboriginal people to a fictional avatar with no accountability and no lived experience.

That's not education. That's exploitation.


AI Blackface: The Digital Minstrel Show

Let's call this what it is: digital blackface.

Digital blackface is when a non-Black or non-Indigenous person creates a caricature online, often through avatars, memes, or AI-generated content, that performs Blackness or Indigeneity for entertainment, profit, or engagement.

It's the digital equivalent of minstrel shows, where white performers would paint their faces Black and perform exaggerated, dehumanising caricatures of Black people for white audiences.

Bush Legend is doing the same thing, just with algorithms instead of greasepaint.

Mason has generated an Aboriginal man, complete with physical features that code as Indigenous, cultural markers that signal Aboriginality, and a persona that performs a particular version of Aboriginal identity. And he's doing it for an audience that largely doesn't know the difference.

As Dr Tamika Worrell notes, 'AI becomes this new platform that we have no control or no say in it. Not only stories or language but actual visuals of us can often be taken from people that have passed away, or just blending a range of different people [to create an AI avatar] with no kind of accountability to the communities that these people are from.'

The technology allows for the creation of Indigenous likenesses without consent, without context, and without consequence. It's the ultimate act of colonial power: to take even the image, the face, the identity of First Nations people and use it however you see fit.

And when called out, the response is predictable: 'It's just education.' 'It's not hurting anyone.' 'If you don't like it, scroll on.'

That brings us to the next point.



'Harmless' Content and the Continuum of Violence

Here's what people don't understand, or refuse to understand: there is no such thing as 'harmless' cultural appropriation or identity theft. These acts exist on a continuum of violence that ranges from microaggressions to physical harm.

Let me connect the dots.

On January 26th 2026, just two weeks ago, a homegrown terrorist attacked Invasion Day rallies in Perth. Aboriginal people, gathering peacefully to mourn and protest the ongoing impacts of colonisation, were targeted with violence.

Where was the national outrage? Where was the wall-to-wall media coverage? Where were the politicians falling over themselves to condemn the attack? Where were the community vigils?

Crickets.

Compare that to the response when non-Indigenous Australians are harmed. The difference is stark, and it's rooted in a simple, brutal truth: Aboriginal lives are not valued the same way.

And that devaluation doesn't come from nowhere. It's built. It's cultivated. It's reinforced every single day through a thousand tiny acts that tell the Australian public that Aboriginal people don't matter, that our cultures are consumable, that our identities are negotiable, that our voices are optional.

It's reinforced when First Nations servicemen are booed at ANZAC Day ceremonies, ceremonies meant to honour sacrifice and service. It's reinforced when public support for Welcome to Country ceremonies is withdrawn because they're seen as 'too political' or 'divisive.' It was reinforced when the Voice referendum failed, and the message sent was clear: we don't want to hear you.

And yes, it's reinforced when a South African man creates an AI Aboriginal person to make money off wildlife videos and then dismisses criticism with 'scroll on.'

Because here's what that says: Aboriginal identity is not sacred. It's not protected. It's not even real enough to matter. It's just content. It's just a tool. It's just a means to an end.

And when you normalise that, when you allow that to go unchallenged, you lay the groundwork for greater harms. You create a culture in which Aboriginal people are seen as less than, as other, as objects rather than subjects.

You create a culture in which a terrorist can attack an Invasion Day rally, and the nation shrugs.

That is the continuum. That is why 'harmless' AI videos are not harmless at all.


Colonisation 2.0: Same Violence, New Platform

Let's be blunt: this is colonisation all over again, just with a digital twist.

Colonisation has always been about dispossession, taking land, taking resources, taking children, taking culture, taking identity. And it's always been justified with the same rhetoric: it's for your own good, it's progress, it's education, it's harmless.

AI-generated appropriation is the latest iteration. It's taking Aboriginal identity and repackaging it for consumption without consent, without compensation, and without accountability. It's saying, 'We don't need actual Aboriginal people, we can just make our own.'

And here's the truly dystopian part: under current structures, Aboriginal people have no legal recourse. Copyright law protects individual creators. But cultural intellectual property? That's murkier. It's less defined. It's harder to enforce.

As Corey Tutt OAM, a Kamilaroi man and founder of Deadly Science, notes, 'We are also seeing a rise in non-Indigenous organisations, often dressed in black-clad branding, using AI to manufacture cultural legitimacy.'

This is what's called 'black cladding', when non-Indigenous entities use Aboriginal imagery, language, or aesthetics to appear culturally legitimate without any genuine engagement with Aboriginal communities.

And the law? It hasn't caught up. There are no guardrails. No protections. No consequences.

So Aboriginal people, once again, are left vulnerable. We watch as our cultures are mined for content. We watch as our identities are synthesised and monetised. We watch as fake Aboriginal people get more followers, more engagement, more opportunities than real Aboriginal people.

And we're told to 'scroll on.'

That's not progress. That's the same colonial violence, just with better graphics.


What Ethical AI Engagement with First Nations Culture Looks Like

Now, let's be clear: AI is not inherently evil. As Dr Janke notes, it is possible to ethically use AI technology to create content about First Nations people, but it requires the consent and involvement of First Nations people.

That means:

  • Engaging with Aboriginal communities before creating content. Not after. Not as damage control. Before.

  • Following cultural protocols. That means understanding who has the right to share certain stories, images, or knowledge, and respecting those boundaries.

  • Ensuring Aboriginal people are involved in every stage: concept, creation, and distribution. Not as consultants brought in to rubber-stamp your project, but as collaborators with decision-making power.

  • Directing audiences to real Aboriginal voices. If you're creating content about Australian wildlife, why not partner with Aboriginal rangers who are already doing that work? Why not amplify their voices instead of replacing them?

  • Ensuring any economic benefit flows back to Aboriginal communities. If you're making money from content that uses Aboriginal cultural markers, Aboriginal communities should benefit.

This isn't difficult. It's just respectful.

But Bush Legend didn't do any of that. Because it was never about education or awareness. It was about profit. And it was easier, and more lucrative, to generate a fake Aboriginal person than to do the work of building genuine relationships with Aboriginal communities.


The Danger of Deepfakes and Misinformation

There's another layer to this that we can't ignore: the potential for AI-generated content to spread misinformation and cause harm.

As Professor Toby Walsh, a Laureate Fellow and Scientia Professor of Artificial Intelligence at UNSW, notes, AI is trained on large-scale data sets with inbuilt biases. 'They are going to carry the biases of that training data. Certain groups may be stereotyped because the video data or the image data that exists in that group online is somewhat stereotypical. So we're going to perpetuate that stereotype moving forwards.'

AI doesn't create neutral content. It replicates what it's been fed, and what it's been fed is often racist, reductive, and harmful.

When AI generates an Aboriginal person, it's drawing on existing representations of Aboriginal people, and those representations are, more often than not, stereotypical. The 'noble savage.' The 'mystic elder.' The 'nature-loving guide.'

And as Walsh warns, 'If not now, in the very near future, it's going to be next to impossible to be able to identify for yourself whether this was real or fake.'

Think about what that means. In a world where deepfakes are indistinguishable from reality, who gets to control the narrative about Aboriginal people? Who gets to decide what an Aboriginal person looks like, sounds like, believes?

Right now, it's not Aboriginal people. It's people like Mason, who can generate an Aboriginal avatar, put words in his mouth, and present it as authentic.

That's not just appropriation. That's erasure. That's replacement. That's annihilation.


What We Need: Structural Change and Community Accountability

So what do we do?

First, we call it out. Every single time. We don't let this become normalised. We don't let it slide because it's 'just AI' or 'just content.'

Second, we demand legislative change. We need laws that protect cultural intellectual property the same way we protect copyright. We need guardrails around AI that require consent and accountability when it comes to generating representations of First Nations people.

Third, we support real Aboriginal voices. Follow Aboriginal rangers. Support Aboriginal-owned businesses. Amplify Aboriginal educators and storytellers. Direct your attention, your engagement, and your dollars to the people who are doing this work authentically.

And finally, we educate. Because most people genuinely don't understand the harm this causes. They see a cute video of a bloke talking about snakes and think, 'What's the problem?'

So we explain. We connect the dots. We show them the continuum of violence that runs from AI blackface to terrorist attacks on Invasion Day rallies. We show them that this isn't 'just content', it's part of a system that devalues, erases, and harms Aboriginal people.


A Final Word

To Keagan John Mason and anyone else considering creating AI-generated Aboriginal content: you don't get to do this.

You don't get to take our identity. You don't get to profit from our culture. You don't get to create caricatures of us and then tell us to 'scroll on' when we object.

Aboriginal identity is not yours to generate. It's not yours to monetise. It's not yours to play with.

And to the 200,000 people following Bush Legend: you deserve better. You deserve to learn from real Aboriginal people, people with mob, with Country, with knowledge that's been passed down through generations. People who are accountable to their communities and their cultures.

They exist. They're out there. They're doing incredible work. Go find them. Follow them. Learn from them.

Because 'Jarren' isn't real. But we are. And we're not going anywhere.

You can report Bush Legend and Keagan John Mason to the relevant digital platforms for fraud and impersonation at the following social media links:
