Australian Outlook

Australia is Lagging in its Approach to Autonomous Weapons

06 Jul 2023
By Matilda Byrne
Digital Dehumanisation Conference, San José, Costa Rica, February 2023. Source: Stop Killer Robots/https://bit.ly/3PEbs4W

Increasing autonomy in weapons is raising a new set of challenges for countries. For the few leading in artificial intelligence (AI), Australia included, delaying international legal frameworks is seen as serving the national interest.

AI and robotics are revolutionising warfare. Regulation is therefore essential: to establish red lines, to address legal, ethical, and security concerns, and to protect humanity from a future of “killer robots.”

Global efforts and engagement

Russia, China, Israel, India, the US, the UK, and Australia are among the countries leading the development of AI for military applications. Alongside this push to innovate is a growing discussion of regulation and of the need for new international laws to address autonomous weapons. The United Nations secretary-general, the International Committee of the Red Cross (ICRC), AI experts in academia and industry, and civil society are united in their calls for a legally binding instrument. To date, over 90 countries have expressed support for these endeavours.

This year, engagement on the issue has intensified. The recent Luxembourg Autonomous Weapons Systems Conference, convened by Luxembourg’s Ministry of Defence, engaged the EU and NATO and included a panel by the Stop Killer Robots campaign. The Responsible AI in the Military Domain conference, hosted by the Netherlands in February, further illustrates the interest in regulation, although it did not highlight autonomous weapons. Costa Rica hosted a regional meeting for Latin America and the Caribbean, where 33 countries committed to the establishment of a new legally binding instrument through the Belén Communiqué. A similar communiqué was then adopted by the Ibero-America group, which includes Spain, Portugal, and Andorra.

The Stop Killer Robots campaign coalition also came together in Costa Rica for the Digital Dehumanisation Conference. The conference highlighted the potential harms of automated decision-making for humanity and explored strategies for campaigning efforts in 2023. These themes were revisited at RightsCon, where participants from 129 countries addressed the intersection of human rights and digital technology across multiple domains, including warfare. There is increasing consideration of how autonomous weapons cut across human rights and carry broader societal implications.

In the warfare context, security, legal, ethical, and humanitarian concerns have been discussed formally for over a decade. Dedicated United Nations meetings have convened under the Convention on Certain Conventional Weapons since 2014, since 2017 in the form of the Group of Governmental Experts (GGE). The GGE’s task has been to consider proposals and possible measures relating to a regulatory framework. Despite advances in ideas throughout 2022 and 2023, the GGE has failed to progress beyond discussions because a handful of militarised states, most notably Russia, have used consensus rules to stymie any outcome.

At the United Nations General Assembly in 2022, a multilateral statement addressing autonomous weapons was delivered. It was the first of its kind on this issue, expanding engagement beyond the GGE meetings, which several countries are not part of. The statement echoed the desire for urgent regulation, outlining the need for “internationally agreed rules and limits.”

Australia’s position and autonomy in the defence landscape

Australia is among the countries that, while innovating on autonomy for defence, have sought to avoid new regulation. In the GGE this year, Australia acted more constructively than previously. However, against the global momentum, the Australian government maintains it “remains to be convinced new international law is required.” Australia endorsed possible measures across the design, development, deployment, and use of autonomous weapons, but insisted these must not take the form of legal requirements. Yet new, specific prohibitions and legal obligations must be established, particularly to address aspects such as autonomy and human control that earlier law never conceived of. The Australian Human Rights Commissioner has called for a prohibition on lethal autonomous weapons systems, as have leading Australian AI experts.

Australia is also yet to articulate a commitment to ensuring meaningful human control over the use of force, specifically in the “critical functions” of identifying, selecting, and attacking targets. Even when asked explicitly, the government has not confirmed that human control will be ensured, nor offered any information on human control requirements. This raises ethical concerns about delegating life-and-death decisions to machines and applying force to humans based only on sensor data. Responding to these concerns, the Department of Defence said merely that the chain of command retains accountability, failing to address the core of the issue. Defence has developed a paper on AI ethics, but its frameworks are not government policy, and the work overlooked these major ethical concerns.

In the Australian defence landscape, autonomy projects are being pursued across Defence and the arms industry. Trusted Autonomous Systems (TAS), launched in 2017 with AUD $50 million over its first seven years, is the first Defence Cooperative Research Centre, designed to connect Defence, arms companies, and universities. The government has recently announced the Advanced Strategic Capabilities Accelerator, a new defence innovation and funding program with autonomy as one of its priority areas. Other notable projects include “Ghost Bat,” an autonomous aircraft developed by the Royal Australian Air Force with Boeing, and its underwater counterpart, “Ghost Shark,” an autonomous submarine contracted to arms company Anduril.

Smaller Australian companies such as Cyborg Dynamics, Skyborne Technologies, and DefendTex are pushing the envelope in integrating autonomy into weapons systems, including small aerial and ground vehicles and targeting systems. These developments demonstrate the growing use of autonomy being pursued in the absence of adequate guardrails.

An opportunity for leadership

Australia has a moral imperative to embrace new international law in spaces where it is innovating. The imperative is particularly acute for autonomous weapons, given their implications for both warfare and society. Despite being among the small minority of countries that do not support establishing a new legally binding instrument, Australia has shown progress in its engagement over the last year. It should now join those calling for new international laws and develop clearer domestic policy with urgency. As the Hon Andrew Leigh MP noted in a parliamentary speech in 2021, the government “has taken a somewhat opaque approach…[and] not acknowledged the ethical challenges these technologies present” despite working on autonomous projects in defence.

The Hon Maria Vamvakinou MP identified “the need for Australia to support the establishment of legally binding rules regarding specific prohibitions and limits on autonomous weapons [and] prepare and articulate a national policy that strictly regulates and defines human control and responsibility with respect to such weapons.” She also remarked that technological advances compel us to advance notions of human rights, and that “future proofing” these rights requires appropriate regulation and oversight, which have yet to be established even as increasingly autonomous weapons are developed.

Addressing the UN General Assembly in 2022, Foreign Minister Penny Wong proclaimed: “It is up to all of us to create the kind of world to which we aspire – stable, peaceful, prosperous.” How AI is used in warfare will shape the state of our world and the prospects for peace. Australia can rise to the challenge of grappling with AI and its application across society, and it must do more to join the countries taking the lead on addressing autonomous weapons.

Matilda Byrne is the National Coordinator of the Australia Stop Killer Robots campaign, based at SafeGround, an Australian not-for-profit that seeks to reduce the impacts of legacy and emerging weapons. She is currently undertaking a PhD at RMIT’s Social and Global Studies Centre.

This article is published under a Creative Commons Licence and may be republished with attribution.