What is the great threat that artificial intelligence poses to humanity? That a fully autonomous robot surgeon puts doctors out of work? That a descendant of ChatGPT, the fashionable digital chatterbox, passes exams with higher grades than the class nerd? That machine learning systems not only beat us at chess, go and poker, but also surpass artists and scientists in depth and creative capacity? None of that. The great threat is the one expressed by the UN Secretary-General, António Guterres, in March 2019: “Machines with the power and discretion to kill without human involvement are politically unacceptable and morally repugnant, and should be prohibited by international law.” They have not been.
AI-powered weaponry is already a reality, and the war in Ukraine is serving as a grim testing ground for its powers. Ukraine uses quadcopters (four-propeller drones) to drop grenades on Russian soldiers. The Russians launch swarms of missiles at Ukrainian hospitals, power plants and civilian buildings. The real arms race here lies not in the size or explosive charge of these weapons, but in their tiny electronic brains, less and less dependent on their human controllers, more autonomous, more capable of making their own decisions. The trend leads inevitably to Guterres’s nightmare: machines that locate, select and kill their targets without the slightest human supervision. A dark future indeed.
We won’t have to wait long to see it, unless the ban proposed at the UN is negotiated and implemented in record time, which would be truly unusual in the minefield of international politics. In fact, all the necessary technology is ready. Machines already know how to plan strategies, navigate between buildings, recognize targets and coordinate with one another to attack. The reconnaissance drones used in Ukraine are already fully autonomous; only a bomb needs to be added to turn them into lethal agents free of human supervision. Fortunately, neither side has done so for the moment. A swarm of autonomous missiles can be considered a weapon of mass destruction, and that is the other red line, along with nuclear weapons, that no one has dared to cross.
Elon Musk’s SpaceX is making information from its satellite network available to Ukraine, and two other less famous American companies, Anduril and Palantir, have joined the Ukrainian proving ground. Anduril, which makes drones, autonomous submarines and artificial intelligence networks, is supplying systems to Ukraine. Palantir, co-founded by the philosopher Alex Karp, also sells autonomous systems to the Ukrainian military, both directly and as part of the NATO intelligence network.
One wonders why autonomous weapons arouse more fear than those controlled by a dozen generals, who will rarely have been promoted for the loftiness of their ethical convictions. It might make a good topic for a roundtable, but the easy answer is surely that machines are cheaper than soldiers.