06/27/2023 / By Arsenio Toledo
The rapid development of artificial intelligence (AI) is permanently reshaping warfare, and experts warn this could lead to the creation of fully autonomous killer drones.
Much attention is focused on the development of unmanned aerial vehicles, or drones, and their use in warfare. Many military and intelligence experts warn that the wars of tomorrow may not be fought by soldiers battling in the streets but by drone operators at their computers.
Worse yet, these same experts warn that one day these drones may end up being controlled by AI with no human input. (Related: Autonomous KILLER BOTS to dominate battlefields soon as war in Ukraine leads to significant advances in drone technology.)
Carlton King, an author and former agent of MI6, the United Kingdom’s foreign intelligence service, warned that AI will one day be able to control pilotless attack aircraft, and that military leaders will be unable to resist the advantages that machine learning-piloted attack aircraft could bring.
“The moment you start giving an independent robot machine learning, you start losing control,” King warned. “The temptation will be there to say, ‘Let a robot do it all.'”
Currently, human operators are heavily involved in flying the drone fleets of the United States and the United Kingdom. But military leaders may be tempted to remove humans from the equation.
“There’s clearly going to be – if there isn’t already – the move towards taking away that pilot on the ground because their reactions may not be quick enough, and placing that into the hands of an artificial intelligence, [whose] reactions are much quicker and making that decision of fire or don’t fire,” King added.
The ongoing conflict between Russia and Ukraine has given drone developers a unique environment for rapidly developing and fielding new drone technology, especially as both sides ramp up the use of drones for surveillance, reconnaissance and combat. This constant push for innovation has led to massive advances in drone capabilities, as well as to concerns about the ethical implications of increasingly autonomous decision-making in warfare.
AI is already being used for better drone communication, collaboration, target recognition and more autonomous drone operations, leading to wider deployment of loitering munitions, or “suicide” drones, which can circle an area for hours looking for targets before attacking.
These innovations are also paving the way for autonomous drones to be used in swarms of dozens or even hundreds of drones that may one day be deployed to the world’s battlefields with devastating effects.
In fact, the U.S., the U.K. and Australia, as part of a trilateral military exercise known as the AUKUS Advanced Capabilities Pillar program, recently conducted their first military trial of an AI-enabled drone swarm.
According to Britain’s Defense Science and Technology Laboratory, this experimental drone swarm successfully tested leading-edge drone technology and AI models shared between the three countries to detect and track military targets “in a real-time representative environment.”
The laboratory added that introducing more AI-enabled systems into drone technology is necessary as “the strategic environment rapidly evolves.” It further noted that autonomous and AI-powered drones will transform battlefields and allow the U.S., the U.K. and Australia to maintain an operational advantage in war.
“Accelerating technological advances will deliver the operational advantages necessary to defeat current and future threats across the battlespace,” said British Deputy Chief of the Defense Staff Lt. Gen. Robert Magowan. “We are committed to collaborating with partners to ensure that we achieve this.”
“As AI-powered systems become more capable of making autonomous decisions, concerns about accountability and the potential for civilian casualties come to the forefront,” warned Erik De Vries of Innovation Origins.
“While AI systems are dispassionate and can circumvent human biases, they may lack the ability to exercise the same level of judgment and discernment as human operators. The potential for errors and misinterpretations in AI decision-making raises serious questions about the responsibility to act, the role of diplomacy and the legitimacy and legality of lethal action based on AI predictions.”
Learn more about the development of artificial intelligence and its integration into most aspects of technology at FutureTech.news.
Watch this clip from the “Rudyk Report” discussing how China is preparing for future drone-focused wars by ordering another 15,000 new drones.
This video is from the Rudyk Report channel on Brighteon.com.
AI is about to change the world for the WORSE: Here are 3 reasons why.
Pentagon plans to launch a program that will develop DRONE SWARMS.
Drone use is widening in Ukraine, bringing with it potential dawn of robotic killing machines.