Australia stands at a pivotal moment in its relationship with artificial intelligence. The release of national and sector-specific policies, including Australia’s National AI Plan, the AI Plan for the Australian Public Service and the Australian Framework for Generative AI in Schools, signals a clear intent to harness AI for the public good. These frameworks emphasise sovereign capability, safety, responsible use and workforce readiness, and rightly position education as both a beneficiary of, and a steward for, this transformation.

But are we doing enough to ensure that the AI systems used in Australia are delivering more good than harm? 

Most AI systems are trained on vast global datasets dominated by North American and European perspectives. Common AI tools such as ChatGPT and Microsoft Copilot are products of ecosystems far removed from the cultural, linguistic and historical realities of Australian communities. When these systems are deployed at scale, the bias embedded in their training data risks perpetuating existing societal divides in Australia by:

  • marginalising Aboriginal and Torres Strait Islander knowledge 
  • overlooking the languages and experiences of Australia’s migrant communities 
  • failing to reflect the lived realities of students and educators outside major cities 

In education, this matters profoundly. Schools shape how young people understand history, identity, belonging, and whose knowledge counts. 

Encouragingly, there is emerging action across Australia’s education sector that directly addresses bias in AI systems and points to what responsible practice can look like. 

Tools such as the NSW Department of Education’s NSWEduChat give students in Years 5–12 a safe, secure way to engage with AI. This context-specific approach helps reduce unintended bias that can arise in general-purpose AI models trained on broad, internet-wide data that may not reflect Australian curriculum standards or student diversity. A key feature of NSWEduChat is its student mode, which uses guided, open-ended questions rather than providing definitive responses. By centring students’ own reasoning, the tool encourages critical thinking and reduces the risk of learners accepting AI outputs at face value. 
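To make the guided-questioning pattern concrete: NSWEduChat’s internals are not public, so what follows is a minimal, hypothetical sketch of how a “student mode” can be expressed as a system prompt wrapped around a chat-style model. The prompt wording and function names here are illustrative assumptions, not NSWEduChat’s actual design.

```python
# Hypothetical sketch of a "student mode" guardrail; not NSWEduChat's real code.
# The idea: wrap every student question in a system prompt that instructs the
# model to respond with guiding questions rather than definitive answers.

STUDENT_MODE_PROMPT = (
    "You are a learning assistant for school students. Do not give direct or "
    "definitive answers. Instead, reply with one or two open-ended questions "
    "that prompt the student to reason toward the answer themselves, and "
    "suggest where in their own class materials they might look."
)

def build_student_mode_messages(student_question: str) -> list[dict]:
    """Wrap a student's question in the guided-questioning system prompt."""
    return [
        {"role": "system", "content": STUDENT_MODE_PROMPT},
        {"role": "user", "content": student_question},
    ]

if __name__ == "__main__":
    # The resulting list uses the message format most chat-completion APIs accept.
    for message in build_student_mode_messages("What causes the seasons?"):
        print(f"{message['role']}: {message['content']}")
```

In practice, a guardrail like this would sit server-side so students cannot override it, which is one reason a purpose-built, context-specific tool can constrain bias and misuse more reliably than a general-purpose chatbot.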

Similarly, CSIRO’s freely available AI in Action classroom resources support students in Years 5–8 to explore how AI works, alongside its ethical implications and real-world applications. Delivered through curriculum-aligned presentations, classroom activities, worksheets and teacher guides, these resources are designed for practical classroom use. Importantly, they build age-appropriate AI understanding while emphasising responsibility, ethics and critical thinking about data, algorithms and the decisions AI systems can make. 

While fairness and bias are acknowledged within Australia’s current AI policy landscape, there remains limited practical guidance on how AI systems should engage with First Nations knowledge systems, Indigenous data governance, and the cultural and linguistic diversity of Australian classrooms.

Indigenous data sovereignty must be foundational to AI governance — not only in education, but across all sectors. Without it, we risk stripping ancient First Nations knowledge of its meaning and custodianship, while perpetuating decades of historical disadvantage. The CARE Principles — Collective Benefit, Authority to Control, Responsibility and Ethics — are essential in this context. They assert that Indigenous peoples must hold genuine authority over how their data and knowledge are collected, interpreted and used. 

It is vital that we avoid creating disproportionate harm to already marginalised communities and instead put concrete steps in place that give First Nations communities and culturally diverse groups genuine access to, and agency over, AI systems. 

There is enormous potential for AI to support more personalised, inclusive and responsive learning in the Australian context. But this potential will only be realised if AI reflects the full complexity of Australia’s people, histories and cultures, and if Indigenous communities are empowered to lead its design and governance. 

Equally important is the role of education itself. Schools have a responsibility not only to teach students how to use AI tools, but also to help them critically question those tools’ limitations, particularly their cultural blind spots and embedded biases. Getting AI right in education is a moral challenge, with long-term consequences for equity, knowledge and belonging in Australia.