A military drone may have attacked humans autonomously for the first time, without being instructed to do so, according to a recent report by the UN Security Council, The Independent reports.
The report, published in March, claimed that the AI drone, a Kargu-2 quadcopter produced by Turkish military tech company STM, attacked retreating soldiers loyal to Libyan General Khalifa Haftar.
The 548-page report by the UN Security Council’s Panel of Experts on Libya does not go into detail on whether the incident caused any deaths, but it raises questions about whether global efforts to ban killer autonomous robots before they are built may be futile.
Over the course of the year, the UN-recognized Government of National Accord pushed the Haftar Affiliated Forces (HAF) back from the Libyan capital Tripoli, and the drone may have been operational since January 2020, the experts noted.
“Logistics convoys and retreating HAF were subsequently hunted down and remotely engaged by the unmanned combat aerial vehicles or the lethal autonomous weapons systems such as the STM Kargu-2,” the UN report noted.
Kargu is a “loitering” drone that uses machine learning-based object classification to select and engage targets, according to STM; it also has swarming capabilities that allow 20 drones to work together.
“The lethal autonomous weapons systems were programmed to attack targets without requiring data connectivity between the operator and the munition: in effect, a true ‘fire, forget and find’ capability,” the experts wrote in the report.