The Legal Light
Justin Stack

Killer robots and the law

What was once depicted only in science-fiction films such as The Terminator and I, Robot is now very close to becoming reality. They may not look like humans - yet - but the killer robot is upon us.

Drones that can identify targets through facial recognition technology and fire instantly without a human order are poised to become a reality, thanks to the rapid development of artificial intelligence (AI). Drones being used in Ukraine are programmed to recognise enemy military equipment such as tanks, vehicles and artillery.

To what extent these weapons can operate without meaningful human control is top secret. Can the autonomous weapon itself make decisions on hunting and killing, based on its AI algorithms and data analysis, without a human making the final decision to attack?

Semi-autonomous weapons are already in use. Missiles are semi-autonomous: once a human fires them, they seek out and destroy targets on their own. Australia is planning to use more sophisticated systems, including aerial drones, robotic combat vehicles, uncrewed tanks and submersibles, even a robot dog that clears landmines. Defence says all require a level of human interaction, including giving the final order to attack.

Many other varieties of AI-driven robots and flying drones are on the drawing board, with varying levels of human control. The defence industry calls them Lethal Autonomous Weapons Systems (LAWS). But what does the law have to say about using machines programmed to hunt and kill without a human order?

Christopher Morris, lawyer at Stacks Law Firm, says Australia has no specific law on LAWS, but is engaged in international discussions to develop a global policy on keeping human control of killer robots.

"The problem with regulating LAWS is that there is no internationally agreed definition of what is an autonomous weapon that independently selects and attacks targets," Mr Morris said.

"Legislators generally agree a human needs to be involved in the decision to kill in new AI weapons technology, even if the human only starts the machine's decision-making system that then acts independently to attack.

"There are very serious legal concerns, as machines lack human judgement, can't distinguish between soldiers and civilians, and can't assess the risk to civilians when making military decisions.

"Only humans can make the ethical and legal decisions required on the battlefield. Autonomous weapons can't be held responsible for mistakes or war crimes."

Many members of the UN have declared that machines with the power and discretion to take lives without human involvement should be banned internationally by 2026. Australia did not support forming new international laws banning LAWS, and is investing heavily in autonomous defence systems.

STACKS LAW FIRM
Diane Branch, Compensation Specialist
No Win, No Fee - Conditions apply
02 6592 6592 | taree.stacklaw.com.au
Partners in life