Opinion
The Editorial Board
The Future of War
Something strange happened at the meeting between President Joe Biden and President Xi Jinping of China in a mansion south of San Francisco on Nov. 15, 2023. After a working lunch, as the two leaders rose to leave, an aide to Mr. Xi signaled to one of the Chinese president’s bodyguards, who approached the table, took a small bottle out of his pocket and quickly sprayed down every surface that Mr. Xi had touched, including what remained of the almond meringue cake on his dessert plate.
The purpose, the Americans concluded, was to remove any trace of Mr. Xi’s DNA that his hosts might collect and exploit. “This is the way they’re thinking,” said an official who attended the meeting, “that you could design a disease that would only affect one person.” To the handful of U.S. officials who were there, it was a sobering coda to an otherwise successful summit: Even as Beijing and Washington pursue diplomacy, the pace of technological change is deepening suspicion and fear between the two sides.
Human history can be told as a series of advances in warfare, from chariots to crossbows to nuclear-tipped missiles, and we are living through what may be the fastest advancement in weaponry ever. Ask any five veteran national security experts and you will hear about five different emerging technologies with the potential to change the world of combat. Swarms of robotic aircraft that work in unison to find and kill targets without any human oversight. Advanced cyberweapons that can immobilize armed forces and shut down electrical grids across the country. A.I.-designed bioweapons engineered to kill only those with certain genetic characteristics.
Some of these weapons will remain confined to the pages of science fiction, but others are already in the works. Innovations in artificial intelligence, synthetic biology and quantum computing are set to change how we wage war just as they transform all aspects of our lives. The United States has the lead in some areas, especially in A.I., thanks to the massive investments of the private sector. But China, Russia and other authoritarian regimes are accelerating state investments at purpose-built universities and finding ways to incorporate innovations into their militaries now.
Keeping pace in these 21st-century arms races will require political will and national coordination among the public sector, private industry and research institutions. The Pentagon must embrace technological change and incorporate it into recruitment, training and strategy. Congress needs to expand funding for research and development into technologies with military applications. The president needs to reverse his administration’s war on universities and bring private industry into the mission.
It wouldn’t be the first time the country rallied to stay ahead of innovative military adversaries. American science was behind Germany’s before World War II began, but it surged ahead and ultimately won the war in large part thanks to victory in the cutting-edge scientific race to develop an atomic weapon. Crucial to the effort was the collaboration between American government officials, academic researchers and private companies. The need for collaboration is particularly acute in A.I., which is a rare example of a technology with profound national security implications that was developed not by the government but by the private sector.
Congress and the courts must vigorously oversee such public-private collaboration to ensure it serves the public interest, not that of any one company or administration. America must also be aware of the dangers of an unchecked arms race. The world learned in the last century that powerful weapons are sometimes best kept at bay through treaties, and the United States should join other countries in arms control before new and dangerous technologies are ready to be used.
Deterrence will be necessary. The biggest reason to build a military that can win the wars of the future is to prevent those wars from ever happening. It is to signal to America’s rivals and enemies that an attack on the United States and its allies is too costly to launch. We urge the leaders of both parties — President Trump, members of Congress and Democrats considering campaigns for the White House in 2028 — to recognize the threat in front of us.
America’s national security establishment is scrambling to adapt to this coming world of warfare. The National Geospatial-Intelligence Agency, which collects and analyzes much of the Pentagon’s space-based intelligence, leads one effort: developing America’s military A.I. targeting system. Synthesizing a constant, high-volume stream of data from satellites, spy planes and other sensors, N.G.A.’s imagery analysis program, called Maven, identifies objects faster than the army of humans who once pored over grainy computer images in search of threats.
Maven is now in every major U.S. military command headquarters worldwide. It has suggested targets in Iraq, Syria and Yemen that U.S. forces subsequently destroyed, and it has produced intelligence that Ukraine has used to strike Russian targets. The program can identify military objects like rocket launchers, advancing troop formations and docked warships, and it spots and tags objects on a no-strike list, including hospitals, schools and religious buildings. N.G.A. has testified to Congress that these algorithms, fed with new intelligence data every day, get faster and more reliable with use.
This software-driven approach to warfare is forcing the Pentagon to turn to the private sector for help. Palantir, a data analytics company co-founded in 2003 by the billionaire libertarian Peter Thiel and Alex Karp, its chief executive, is the largest of the new tech-heavy contractors. The company takes intelligence data from agencies like N.G.A., integrates it with other information and presents it in a program called Maven Smart System that military intelligence officers use daily on a classified network worldwide. Palantir’s success at working with the government has made it one of the most prominent businesses pushing the Department of Defense to adopt new ways of working with Silicon Valley.
In November a smaller start-up, Anduril, founded by the Oculus inventor Palmer Luckey, won an Army contract to provide the service with an A.I.-powered drone defense program. The company’s software is designed to organize information from radars and other detection systems so that humans can target and destroy enemy drones faster and more effectively. On Oct. 31 over the Mojave Desert, the firm flew a dart-shaped drone, called Fury, in a first test of its A.I.-controlled flight. The Pentagon hopes eventually to field a fleet of 1,000 robotic wingmen that will fly alongside traditional fighter jets into battle with the ability to dogfight with other aircraft, perform reconnaissance and conduct electronic warfare.
America’s adversaries are pursuing their own advances. China has released video footage of what appears to be a robotic wingman being tested in flight. Ukraine’s forces have captured a Russian drone that could fly dozens of miles, identify and lock onto targets, then plunge into a nosedive and deliver six pounds of explosives to blow up the target, apparently all without any human guidance. Built with off-the-shelf parts, the V2U, as it is known, costs a fraction of its American and Chinese counterparts.
Not all high tech is smart tech. For decades the United States has reflexively opted for cutting-edge, bespoke weapons that take years to perfect, rather than buying simple, commercially available equipment. America’s newest fighter plane, the F-35 Lightning II, can see other fighters before they see it and destroy them with advanced guided missiles. The pilot’s helmet alone costs about $400,000. But all that gee-whiz wizardry can break. And it often does, which has caused the plane to spend more time undergoing maintenance in a hangar than in the air.
Much of America’s vulnerability stems from its reliance on such expensive, exquisite systems. Everything from sensors that track enemy ships, aircraft and missiles to the ability of commanders to coordinate now relies on delicate, poorly defended space-based systems. China has built a military capable of disrupting those networks using cyberweapons and missiles. “That is their theory of victory,” the former deputy defense secretary Bob Work said in 2021. “Every single link or communication system we have is covered by a Chinese jammer. They do all sorts of cyber intrusions. And they put them all under one commander and this commander just looks at the American battle network and says, ‘How can I break it apart?’”
The hardest question for the United States in figuring out how to respond to such challenges isn’t whether to pursue our own weapons that can deter and defeat them, but how to do so safely and ethically.
Turning over decision-making to robots can threaten civilians. Israel has successfully mixed traditional and new warfare tactics over the last two years in fighting Hezbollah and Iran. Its use of A.I.-enabled surveillance systems in Gaza to suggest human targets has been controversial. The systems have reportedly misidentified civilians as combatants and resulted in the deaths of innocent people.
The dangers go beyond the traditional battlefield. Experts warn that advancements in A.I. could usher in a new era of bioterrorism. With basic coding knowledge, a laptop and an internet connection, rogue actors could direct A.I. programs to comb through open-source databases discovering ways to fine-tune viruses to make them spread faster and be more deadly. Groups with specific genetic characteristics or individual species of farm animals could be targeted.
This year two major companies — OpenAI and Anthropic — warned that if nothing is done, A.I. will soon be able to assist bad actors attempting to create bioweapons. Students at M.I.T. used chatbots to come up with four pandemic pathogens. The A.I. explained how to generate them from synthetic DNA; it suggested companies that were unlikely to screen orders for the DNA; and it recommended that if the students lacked the skills to do all this, they could contact a research organization. This was done in one hour.
The Biden administration imposed multiple safety controls on A.I. development and use, including by the military. Mr. Trump reversed some of those steps and replaced them with his own directive to revoke “barriers” to innovation. The Pentagon intends to expand its use of A.I. in intelligence analysis and combat in the coming months, a top official told a defense conference earlier this month. “The A.I. future is not going to be won by hand-wringing about safety,” said Vice President JD Vance at a summit in Paris in February.
The world is unprepared for what’s coming and what’s already here. As the wars of the 20th century showed, deterrence alone is often not enough to prevent the catastrophic use of new weapons. The United States needs to negotiate and sign treaties to limit how and where these weapons are used.
We join the United Nations secretary-general and the International Committee of the Red Cross in their call for a new treaty to be concluded by 2026 on autonomous weapons systems. This should include: limits on the types of targets, such as outlawing their use in situations where civilians or civilian objects are present; and requirements for human-machine interaction, notably to ensure effective human supervision, and timely intervention and deactivation.
At the very least, the administration should push for comprehensive requirements that companies selling equipment used to make biological agents verify who their customers are and the nature of their work. The White House should also call for the Biological Weapons Convention — a 50-year-old treaty with about 190 states participating — to be amended to address the technological advancements underway.
The United States should end its self-defeating war on universities and return to long-term and open-ended investing in the technologies of the future through grants for research and development. It should also expand the bets it makes — large and small — in private industry’s nascent technologies, like autonomous drones. Finally, the United States should tighten export controls on advanced A.I. chips to ensure they don’t find their way to adversarial nations. Mr. Trump’s decision, announced Dec. 8, to allow the sale of some of the world’s most powerful chips to China is a mistake.
From Ukraine’s battlefields to Mr. Xi’s apparent post-lunch bioweapons defense, it’s clear that America’s oceans, long bulwarks against foreign enemies, can’t protect against an adversary who uses A.I. as part of a total war against the United States. “The speed of warfare will soon outpace human ability to control it,” said Andrew C. Weber, the Pentagon official in charge of nuclear, chemical and biological defense programs under President Barack Obama.
To counter the growing threat, America must simultaneously win the race to build autonomous weapons and lead the world in controlling them.
The editorial board is a group of opinion journalists whose views are informed by expertise, research, debate and certain longstanding values. It is separate from the newsroom.
Published Dec. 9, 2025