The U.S. Air Force's AI-powered pilotless aircraft could disrupt traditional warfare strategies. However, the ethical implications of using such technology on the battlefield remain a subject of intense debate.
AI in warfare: A double-edged sword
The U.S. Air Force's pursuit of AI-powered pilotless aircraft could be a game-changer, promising American forces a strategic advantage in future conflicts. The AI-run XQ-58A Valkyrie experimental aircraft, currently under development, could reduce losses of manned planes and pilots in a conflict. However, such powerful technology also poses serious ethical questions: there is ongoing debate about how and when it is appropriate to use AI-driven lethal weapons, and what accountability measures should be in place.
Kratos Defense & Security Solutions, a major defense contractor, is behind the development of the XQ-58A Valkyrie. The company was awarded a contract to develop the platform in 2016, and the first successful test flight took place in 2019. Kratos estimates that each unit will cost roughly $4 million, modest by the standards of military aircraft. That cost-effectiveness could make the AI-driven aircraft an appealing option for the U.S. military.
AI 'wingman' transforms warfare
The AI-run Valkyrie is envisioned as a wingman for manned planes, with capabilities that extend beyond mere assistance. The aircraft is designed to identify threats, engage targets, and absorb enemy fire if necessary, potentially reducing the risk to human pilots. It recently demonstrated these capabilities in a test flight in formation alongside an F-15E Strike Eagle. The AI-driven wingman concept is a significant departure from traditional warfare strategies and could change the way battles are conducted.
Ethical considerations of AI in combat
While the advances in AI-driven combat aircraft are impressive, the technology has drawn serious criticism. The use of AI in warfare raises important ethical questions, especially regarding the autonomy of AI-driven lethal weapons. Critics question the potential for civilian casualties and whether AI systems can make the complex, nuanced decisions required in the heat of battle. These issues will need to be addressed as the U.S. continues to develop and deploy AI-driven warfare technology.