The ‘robot warriors’ defence trainees are unwilling to serve with
Most ADF trainee officers would be “unwilling” to deploy alongside robots in war
December 9, 2020
[Image: Boeing’s “Loyal Wingman” – on the way to the RAAF]
Trainee officers in the Australian Defence Force are “unwilling” to operate near robot warriors as the ADF plans a huge expansion of automated “thinking” machines capable of killing.
Eight hundred officers of the future were asked “about their willingness to deploy in a team ‘involving robots to achieve a combat mission’.”
They were given different scenarios, but when the machine had the most ability to think and act on its own, most trainee officers said they would be “unwilling” or “somewhat unwilling”, according to Australian Defence Force Academy researchers Dr Jai Galliott and Dr Austin Wyatt.
“While a minority would be currently willing, the majority of this cohort harbors a discomfort with deploying alongside autonomous systems with the independent capability to apply force,” Dr Galliott and Dr Wyatt said in their paper published in a defence journal.
The main reason, given by 80 per cent of the trainee officers, was concern about the safety of robot warriors in battle.
The people surveyed are the officers of the future destined to command an army, navy and air force that will increasingly rely on robots, according to an ADF planning document obtained by The Canberra Times.
Concept for Robotic and Autonomous Systems spells out how the ADF is planning a big expansion of its development and deployment of Robotic and Autonomous Systems – RAS or “robot warriors”, in plain English.
These machines and weapons, primed to think on their own in battle and perform “lethal” actions, would once have seemed like science-fiction fantasy. They call to mind the RoboCop movie (though the bloody ending would not please the leaders of the ADF).
Swarms of killer drones are one concept being discussed. Pilotless fighter planes are already being developed to act as armed escorts for fighters flown by humans.
[Video: The DefendTex Drone40]
Drones carrying grenades are also being developed for use by infantry. A soldier on the battlefield can release many of them and they then fly to identify and kill the target automatically.
The ADF is keen on robots because they offer much more military power than the current budget allows.
The document is written in dense jargon and sometimes euphemism. It’s meant to brief military leaders on important issues as robots revolutionise warfare.
But there are concerns among experts on artificial intelligence and robotics over whether machines can be primed to make the finest ethical decisions and tell right from wrong in war.
For example, a human soldier who sees a target surrounded by children can make a decision about firing. Can a machine?
Life-or-death decisions are being transferred from human officers to software developers.
Can robot warriors have their software coded with enough nuance to recognise people who are surrendering, for example?
Can robot warriors be primed to distinguish reliably between a small aggressive adult and a large innocent child? Such questions are taxing academic experts on ethics.
Some of these experts fear that the Department of Defence and the ADF are not giving enough weight to ethical considerations. Ethics gets a fleeting mention in Concept for Robotic and Autonomous Systems.
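To see why those questions resist clean answers, consider a deliberately toy sketch of an engagement rule (the names, fields and thresholds are hypothetical illustrations, not drawn from any real weapons software). A perception model outputs probabilities, never certainty, so the rule ends up reducing an ethical judgement to threshold arithmetic:

```python
# Hypothetical sketch only: illustrates why "coding in nuance" is hard.
# A real perception model returns probabilities, never certainties, so any
# engagement rule turns an ethical judgement into threshold arithmetic.

from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # e.g. "combatant", "civilian", "surrendering"
    confidence: float   # model's probability estimate, 0.0 to 1.0
    height_cm: float    # crude proxy used to guess adult versus child

def may_engage(d: Detection) -> bool:
    # Every constant below is an arbitrary design choice made by a
    # developer long before the battle - which is precisely the concern.
    if d.label != "combatant":
        return False
    if d.confidence < 0.95:   # how sure is "sure enough"?
        return False
    if d.height_cm < 150:     # a small adult and a large child can sit
        return False          # on either side of this line
    return True

# A surrendering person misclassified as "combatant" with confidence 0.96
# passes every check; the nuance lives in the model, not in these rules.
print(may_engage(Detection("combatant", 0.96, 172)))  # True
```

The point is not the particular numbers but that a developer has to choose them in advance, far from the battlefield – which is exactly the transfer of life-or-death decisions the experts worry about.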
The ADF document outlines huge advantages for the Australian military: “Defence will enhance its combat capability within planned resources by employing robotic and autonomous systems in human commanded teams to improve efficiency, increase mass and achieve decision superiority while decreasing risk to personnel.”
The document says that robotic and autonomous systems – robots with lethal power – have advantages over human warriors: “Unlike humans, RAS do not get tired, are not affected by the stresses of combat, do not seek revenge and can be programmed to not preserve themselves.
“If a human crew is lost during operations, it can take years to replace the training and experience of the crew. In contrast, the latest data and algorithms can be uploaded to a new RAS as soon as it is built.”
One example already off the ground is the Loyal Wingman, a pilotless aircraft that Boeing says provides “fighter-like performance” and uses “artificial intelligence to fly independently or in support of manned aircraft while maintaining safe distance between other aircraft”.
It’s currently being built in Australia by Boeing, and the first one was handed over to the RAAF in May.
A company near Melbourne – DefendTex – is making “unmanned aerial vehicles” which can be programmed to identify smaller targets – say a particular type of tank or a car carrying a terrorist – and destroy them.
It’s a very small drone, like the ones on sale in electronics stores, but with artificial intelligence and a grenade. Fleets of them could be launched by a single soldier.
The company says of its product: “The Drone40 is an autonomous loitering grenade.” It can give a single soldier “multiple round simultaneous impact capabilities” – in other words, one soldier can launch lots of them and they then fly to one or more targets and explode the grenades at the same time.
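Stripped of jargon, “multiple round simultaneous impact” is at bottom a scheduling problem: drones at different ranges have different flight times, so their launches must be staggered for the arrivals to coincide. A minimal sketch of that arithmetic, using assumed ranges and speed rather than any DefendTex figures:

```python
# Sketch of the scheduling behind "multiple round simultaneous impact":
# stagger launch times so drones with different flight times arrive together.
# The distances and speed are illustrative assumptions, not DefendTex data.

def launch_schedule(distances_m, speed_mps):
    """Return (launch_delay_s, flight_time_s) per drone for simultaneous impact."""
    flight_times = [d / speed_mps for d in distances_m]
    longest = max(flight_times)
    # The drone with the longest flight launches first (delay 0);
    # every other drone waits out the difference.
    return [(longest - t, t) for t in flight_times]

targets_m = [400.0, 250.0, 600.0]   # ranges to three targets (assumed)
for i, (delay, flight) in enumerate(launch_schedule(targets_m, speed_mps=20.0)):
    print(f"drone {i}: launch at t={delay:4.1f}s, impact at t={delay + flight:4.1f}s")
# All three drones report the same impact time: t=30.0s.
```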
The ADF report says that Australia’s adversaries will be developing similar systems. It paints a picture of swarms of drones fighting each other “where adversaries make decisions at machine speed”.
The ADF says the robots will be under the command of a human being, but the big unanswered question is: what does human command actually mean when the machine thinks for itself at “machine speed” and can shoot to kill?
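One way to make that question concrete is the “human on the loop” pattern sometimes proposed for such systems: the machine proposes an engagement and a human has a fixed window to veto it. The hypothetical sketch below shows where the pattern frays at machine speed – if the veto window is shorter than the time a person needs to assess the situation, the human’s command is purely notional:

```python
# Hypothetical sketch of "human on the loop" control: the machine proposes
# an engagement and a human has a fixed window to veto it before it proceeds.
# At "machine speed" the window shrinks until the veto is purely notional.

import queue
import threading
import time

def request_engagement(veto_window_s: float) -> bool:
    """Return True if the engagement proceeds, False if the human vetoes."""
    vetoes: queue.Queue = queue.Queue()

    def human_review() -> None:
        # Stand-in for an operator console: assume the human needs two
        # seconds to assess the picture, however short their window is.
        time.sleep(2.0)
        vetoes.put(True)

    threading.Thread(target=human_review, daemon=True).start()
    try:
        vetoes.get(timeout=veto_window_s)   # wait for a veto, but not forever
        return False                        # human vetoed in time
    except queue.Empty:
        return True                         # window expired: machine acts

# With a five-second window the veto lands; at machine speed it cannot.
print(request_engagement(veto_window_s=5.0))   # False - vetoed in time
print(request_engagement(veto_window_s=0.05))  # True - engaged before review
```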