06/06/2023 / By Laura Harris
A senior official of the U.S. Air Force (USAF) in charge of artificial intelligence (AI) has admitted to “misspeaking” about a purported instance of an AI-powered drone killing its human operator.
Col. Tucker “Cinco” Hamilton, the USAF’s chief of AI test and operations, earlier claimed during the Future Combat Air and Space Capabilities Summit that the military branch conducted a simulated test that ended tragically. The test allegedly involved an AI-powered drone that went rogue and killed its human operator.
The British Royal Aeronautical Society (RAeS), which hosted the summit, later clarified that the USAF never actually performed such a test – whether in a computer simulation or in reality. It told Motherboard in an email that Hamilton’s test description was a “hypothetical thought experiment based on plausible scenarios” rather than an actual simulation.
The RAeS quoted the colonel: “We’ve never run that experiment, nor would we need to in order to realize that this is a plausible outcome. Despite this being a hypothetical example, this illustrates the real-world challenges posed by AI-powered capability and is why the [USAF] is committed to the ethical development of AI.”
Even USAF spokeswoman Ann Stefanek denied that any such test took place. She told Insider that the military branch “has not conducted any such AI-drone simulations and remains committed to [the] ethical and responsible use of AI technology.” Stefanek continued: “It appears the colonel’s comments were taken out of context and were meant to be anecdotal.”
During the RAeS summit in May, Hamilton presented the hypothetical scenario of an AI-enabled drone that exhibited unexpected behavior to achieve its objective. The colonel said the drone began attacking U.S. personnel and infrastructure when the human operator issued a “no” order in a bid to prevent it from killing a designated target. According to Hamilton, the drone even resorted to destroying the communication tower used by the operator.
Hamilton, who also commands the USAF’s 96th Test Wing, previously gained recognition for spearheading the development of Autonomous Ground Collision Avoidance Systems (Auto-GCAS) for F-16 fighter jets. The Auto-GCAS prevents F-16 aircraft from crashing into the ground by autonomously taking control of the jet when a collision is imminent.
Back in December 2022, the Defense Advanced Research Projects Agency (DARPA) – the research arm of the U.S. Department of Defense – announced that AI had successfully controlled an F-16 jet through its algorithms.
However, DARPA clarified that its intention is not to have the plane fly without a pilot. Rather, it seeks to integrate AI into the aircraft’s systems – allowing the human pilot to focus on larger battle management tasks while the AI takes control of the jet and supplies live-flight data. (Related: Autonomous KILLER BOTS to dominate battlefields soon as war in Ukraine leads to significant advances in drone technology.)
In an interview with Defense IQ Press during that same year, Hamilton expressed the need to face the reality of AI’s presence. “AI is a tool we must wield to transform our nations,” he said. The colonel warned, however, that “it will be our downfall” if not approached with care and caution.
Check out DroneWatchNews.com for more stories about drones.
Watch InfoWars founder Alex Jones put in his two cents about the suicide drones the U.S. is sending to Ukraine below.
This video is from the InfoWars channel on Brighteon.com.
US weapons package for Ukraine includes 100 KILLER DRONES.
UN report: Killer AI drones with no remote pilot hunted down humans.
Attack of the drones: Army wants miniature suicidal drone to kill from six miles away.