WASHINGTON -- Going off to war has always meant risking your life, but a wave of robotic weaponry may be changing that centuries-old truth.
The "pilots" who fly U.S. armed drones over Afghanistan, Iraq, and Pakistan sit with a joystick thousands of miles away, able to pull the trigger without being exposed to danger.
Other robots under development could soon ferry supplies on dangerous routes and fire at enemy tanks.
The explosion in unmanned vehicles offers the seductive possibility of a country waging war without having to put its own soldiers or civilians in the line of fire.
But analysts say the technology raises a host of ethical and legal questions, while political and military leaders have yet to fully grasp its implications.
"What's the effect on our politics? To be able to carry out operations with less human cost makes great sense. It is a great thing, you save lives," said Peter Singer, author of "Wired for War."
"On the other hand, it may make you more cavalier about the use of force," he told AFP.
Commanders see unmanned vehicles as crucial to gaining the edge in combat and saving soldiers' lives, freeing troops from what the military calls "dull, dirty, and dangerous" tasks.
Cruise missiles and air strikes have already made war a more remote event for the American public.
Now, robots could offer the tantalizing scenario of "pain-free" military action, said Lawrence Korb, a former U.S. assistant secretary of defense.
"That raises the whole larger question — does it make it too easy to go to war, not just here or anyplace else?" he said.
Robotic technology is taking armies into uncharted territory where tens of thousands of sophisticated robots could eventually be deployed, including unmanned vehicles possibly designed to automatically open fire.
U.S. officials insist a human will always be "in the loop" when it comes to pulling the trigger, but analysts warn that supervising robotic systems could become complicated as the technology progresses.
Military research is already moving toward more autonomous robots that will require less and less guidance.
The trend is illustrated by the Air Force's plans to have a single human operator eventually supervise three drones at once instead of one aircraft.
Even if humans can still veto the use of force, the reality of numerous robots in combat producing a stream of information and requiring split-second decisions could prove daunting.
Future robotic weapons "will be too fast, too small, too numerous, and will create an environment too complex for humans to direct," retired Army colonel Thomas Adams is quoted as saying in "Wired for War."
Innovations with robots "are rapidly taking us to a place where we may not want to go, but probably are unable to avoid," he said.
Experience has shown humans are sometimes reluctant to override computerized weapons, placing more faith in the machine than their own judgment, according to Singer.
He cited the tragic 1988 downing of an Iranian airliner over the Persian Gulf, when U.S. Navy officers deferred to Aegis missile defense computers that had identified the plane as "an assumed enemy," even though the officers' own radar and radio information indicated it was a civilian aircraft.
The military is still trying to figure out how an armed robot on the ground should be designed and operated to conform to the law of armed conflict, said Ellen Purdy, the Pentagon's enterprise director of joint ground robotics.
"Nobody has answered that question yet," Purdy said. "There's a threshold where just because you can, doesn't mean you should."
As dozens of countries join the robotic arms race, human rights groups are beginning to take notice of its implications for warfare.
Although in theory drones provide more precise targeting that can minimize civilian casualties, rights activists are concerned about weapons that could shoot without a human issuing the order.
If an entirely autonomous machine committed a war crime, experts say it remains unclear how the atrocity could be prosecuted under international laws drafted decades before the advent of robots.
"Who's responsible?" asked Marc Garlasco, a military adviser at Human Rights Watch.
"Is it the developer of the weapons system? Is it the developer of the software? Is it the company that made the weapon? Is it the military decision-maker who decided to use that weapon?" he continued.
"No one has really dealt with that because luckily, we're not there yet."
Copyright © 2009 Agence France Presse. All rights reserved. The information contained in the AFP News report may not be published, broadcast, rewritten or redistributed without the prior written authority of Agence France Presse.