Human rights groups have launched a campaign to put a stop to the development of what they call "killer robots", saying such machines should be outlawed along with cluster bombs and landmines.
The campaign's organisers are concerned that the increased use of unmanned drone strikes by the US, together with automatic defence systems, is a precursor to thinking robots that could kill with impunity if programmed to do so - much like the Arnold Schwarzenegger character in the Terminator films.
Automatic defence systems include the Iron Dome, which Israel used successfully to shoot down rockets fired from Gaza late last year.
One of the campaigners, Jody Williams, said: "We're worried about machines that can be programmed - you put in a certain set of criteria, but then you set that machine free and it goes off, whether it is in the air or on land or in the sea.
"I don't want a programmed machine to kill me."
Ms Williams, who won the Nobel Peace Prize in 1997 for her campaign to ban landmines, added: "I know we can do the same thing with killer robots. I know we can stop them before they hit the battlefield."
Defence contractors, particularly in the UK and US, are working increasingly on robotic aids to warfare; the so-called "bloodless war" is akin to the Holy Grail for western military leaders.
But in Britain, at least, defence chiefs draw the line at killer robots with no human control.
"The MoD has no intention of developing any weapons systems that are used without human involvement," a Ministry of Defence spokesperson said.
"Although the Royal Navy does have defensive systems, such as Phalanx, which can be used in an automatic mode to protect personnel and ships from enemy threats like missiles, a human operator oversees the entire engagement.
"Furthermore, all of our Remotely Piloted Aircraft Systems used in Afghanistan to protect troops on the ground are controlled by highly trained military pilots.
"There are no plans to replace skilled military personnel with fully autonomous systems."
Even the terminology is inflammatory, with the idea of autonomy meaning different things to different groups.
Robotics Professor Noel Sharkey from Sheffield University is clear that the Terminator is not about to come to life.
"In robotics we came up with the term 'autonomy' to simply mean programmed and operating with sensors. Rather than calling them killer robots, it's a weapons system that can select a target and engage without further human involvement," he said.
Either way, robotics, particularly in its military guise, is increasingly important to defence planners as a way of improving soldiers' safety and capability at the same time.
But it is probably premature to say whether we are at the dawn of an age of robot armies slugging it out with each other.