Another year, another delusional Pentagon budget request
DOD is asking Congress to authorize nearly $900 billion to fund a military strategy that will only make the world less secure
The Pentagon released its proposed budget for Fiscal Year 2025 this week. There were no major surprises, unless you’re shocked that we are continuing to overinvest in a strategy and a military force structure that is making the world less secure.
If this budget goes through as requested, the Pentagon and related activities like work on nuclear warheads at the Department of Energy will come in at $894 billion. That’s slightly less than the number being debated for this year, but far more than the levels reached at other major turning points like the Korean and Vietnam wars or the peak of the Cold War. Meanwhile, Congress has shown little ability to provide adequate input or oversight of these huge figures. More than five months into the new fiscal year, it has yet to even pass a 2024 budget.
What could possibly justify devoting these enormous sums to the Pentagon at a time of urgent national need to address other threats to our lives and livelihoods, from climate change to epidemics of disease to rampant inequality? The primary answer is the same one we have heard repeatedly in recent years: China, China, and China.
But as I have noted in a recent paper for the Brown University Costs of War Project, by any measure the United States already spends two to three times as much on its military as China does, and outpaces it by far in basic military capabilities like nuclear weapons, naval firepower, and modern transport and combat aircraft. In the areas where there is room for doubt about the relative military power of the two rivals, from emerging technology to the likely outcome of a war over Taiwan, dialogue and diplomacy offer a far better chance of reaching a stable accommodation than spinning out scenarios for “winning” a war between two nuclear-armed powers or running a costly new arms race.
Unfortunately, the rhetoric and resources underpinning the new Pentagon request are more consistent with arms racing than accommodation. The department remains firmly committed to its plan to build thousands of “autonomous, attritable systems” by August 2025, with the express purpose of developing the ability to overwhelm China in a conflict in Asia. In plain English, this means building swarms of drones and other high-tech systems controlled by artificial intelligence. And the plan is for these systems to be cheap and readily replaced if large numbers are destroyed in battle.
The idea that the U.S. arms industry can produce large numbers of new systems quickly and affordably, and build replacements on short notice, runs contrary to the experience of recent decades. It’s an exercise in wishful thinking that could result in the worst of both worlds: spurring China to increase its investments in next-generation military technology even as it remains unclear whether the United States can develop and integrate such technology successfully in any reasonable time frame.
Far from increasing our security, once these new systems are developed and fielded they will almost certainly make the world a more dangerous place. This point is underscored in a new report from Public Citizen which notes that “[i]ntroducing AI into the Pentagon’s everyday business, battlefield decision-making and weapons systems poses manifold risks.”
For example, although current Pentagon guidelines pledge to keep humans in the loop in decisions to use lethal force, once autonomous weapons are produced on a large scale the temptation to use them without human intervention will be great. This in turn could set off a cascade of negative effects, from dehumanizing the targets of these systems, to making it easier to contemplate going to war, to risking mass slaughter caused by a malfunction in one of these complex systems.
And as Michael Klare has written in an analysis for the Arms Control Association, the dangers of AI and other emerging military technologies are likely to “expand into the nuclear realm by running up the escalation ladder or by blurring the distinction between a conventional and nuclear attack.”
Klare also sounds the alarm about the real risk of technical failures involving next-generation technologies:
“Non-military devices governed by AI, such as self-driving cars and facial-recognition systems, have been known to fail in dangerous and unpredictable ways; should similar failures occur among AI-empowered weaponry during wartime, the outcomes could include the unintended slaughter of civilians or the outbreak of nuclear war.”
https://responsiblestatecraft.org/pentagon-budget-2667494544/