Developments in the field of robotics are accelerating us towards a bio-mechanical future. But if robots are to plug safely into society, we must start thinking about the rules of interaction today.
Robots are everyone's favourite technology story. From robot fish to robot doctors, robot football teams and driverless robot cars, interest in the latest robot-related developments frequently propels them to the top of the popular news agenda.
Where did our fascination with robots begin? It's a question Noel Sharkey, professor of artificial intelligence and robotics at the University of Sheffield, often gets asked.
In the early years of the 20th century, he said, "you see that technology's moving incredibly fast and it really worried people. I think that robots in a sense gave technology a public face and… that's why people were so fascinated with it.
"It's still considered to be the peak of technology," he told silicon.com.
The mix of fear and fascination has dogged robotics ever since the term was coined.
It was the Czechoslovakian playwright Karel Čapek who first used the word 'robot', in his play Rossum's Universal Robots, which imagined robots as artificial entities created to work for humans but which ended up seeking to kill their makers despite their apparent resemblance to them. It was robots' first piece of negative press - and a plot clearly drawn on by the film Blade Runner with its renegade replicants.
Bad headlines around robots continue to this day. Who could forget the 'corpse-eating robot' moniker slapped on a military robot called EATR (Energetically Autonomous Tactical Robot), which powers itself by consuming organic matter. In fact, EATR consumes vegetable matter, not corpses - but 'biomass-eating robot' presumably doesn't have the same horrific ring.
For all our supposed queasiness around robotics, the field continues to develop apace, with scientists predicting robots will be used in ever greater numbers in industries including healthcare, defence and transport over the coming decades.
But are we ready for the robot revolution?
Greater use of robots in everyday life will not simply require technological developments, it will necessitate a new code of behaviour - perhaps even a new set of ethics - to manage our interactions with these artificially intelligent machines.
In the short term this may seem a very academic concern but the need for new rules to accommodate robots is an issue of greater practical urgency than you might think.
After all, the first real-world uses of robotics are already beginning to emerge. In the UK, for example, robot pod cars will be put into service at Heathrow Terminal 5 next year, transporting business travellers between the car park and the terminal building.
Sharkey believes more widespread adoption of robot cars is inevitable - and in the not too far distant future either. "I expect to see a lot of autonomous operation in robots. I would expect certainly one thing for sure would be autonomous cars - and that's been talked about for a long time. But not just autonomous; cars that will fit into a road system with sensors so that we can all hammer up the motorway at hundreds of miles an hour without worry of bumping into each other."
The University of the West of England's professor Alan Winfield, who does research at the Bristol Robotics Lab, is also convinced a driverless future is on the horizon. "Probably in 20 years' time, maybe a bit less, your car will drive itself," he says. "It'll actually be a robot that you get in and it carries you around, it takes you to where you want to go and while you're doing it you're on the phone or you're surfing the web or you're chatting to friends or whatever."
Driverless cars may be one of the more gentle uses of robotics but even they will need a host of new rules written to help them fit smoothly into our society.
Take questions of insurance, for example - in the event of an accident, who do you hold responsible? If the crash involves an artificially intelligent robot, do you blame its creator, or the robot that can think for itself?
It's a problem that would apply to any autonomous robot large enough to do accidental or erroneous damage to humans or property, according to Sharkey. "[It's] going to be the same with any robot in the public domain that's independent. Who's accountable? Who's responsible?"
There would also be the issue of which humans associated with the robot would be blamed for any misuse.
"There could be a very long chain of accountability," he added. "The manufacturer, the person who deployed it, the person who's using it currently. If I'm irresponsible with my autonomous car is it my fault? That's one of the problems with it."
Robot cars have certainly accelerated in recent years. In the first Darpa Grand Challenge in 2004 - a race for driverless cars across the Mojave Desert - none were able to finish the 150-mile course. The following year however, five cars crossed the line. The next race in 2007 saw the Grand Challenge turned into a 60-mile Urban Challenge, where robots were expected to obey traffic signals, avoid obstacles and merge with traffic. Six cars were able to complete this more challenging course.
Darpa commissions advanced research for the US Department of Defense - part of an industry that has proved no slouch when it comes to investing in robot tech.
Remote-controlled military robots are already in use, and it looks to be only a matter of time before humans are taken out of the loop entirely.
In Afghanistan and Iraq, for example, thousands of robots are in service - from bomb-disposal robots to unmanned airborne drones, which fly without a pilot but carry a missile payload.
These drones are currently remote-controlled from afar, with a human apparently sanctioning when to drop the payload, but Sharkey believes "they're gradually becoming autonomous".
Professor Kevin Warwick, of Reading University's Cybernetics department, has misgivings about military robots - which he describes as "humanless fighting machines".
"The potential to harm and injure humans is there and they're being used for that already," he said.
It's not too big a leap to imagine drones being equipped with machine learning capabilities so they can become autonomous killing machines, according to Warwick.
"It wouldn't take too much to sprinkle in a little bit of learning - so based on the experience of trying to track this person down last time, this time it does it a different way," he added. "That would be a true AI system, pseudo-Terminator style. I don't think we're too far away [from that being possible]."
While robot fighters may remain on every military's must-have list, the structures needed to define how such armed and potentially deadly autonomous agents may, and may not, be used are not yet in place.
"This is not science fiction anymore," said Ron Chrisley, professor of philosophy at Sussex University. "This is really a pressing question - because in particular the US military is building more and more artificial systems that are going to be responsible for in some sense deciding whether or not to bomb co-ordinates or something. Now we need to get ethical principles in place to say, well, even if this system is in some sense responsible that doesn't mean that this other system - namely the people who deployed it - are not also responsible."
"I would hope that in the very near future a very rich field of machine ethics, machine-human ethics starts developing," he added.
Even if war zones seem far-off places to many in the West, there are other developments bringing robots much closer to home - developments that also demand our attention sooner rather than later if we are to ensure the technology transforms our society for the better.
Sheffield University's Sharkey highlighted a move in the Far East to create childcare robots - which he says currently amount to "advanced monitoring systems with entertainment on board" - but warned of the future potential for busy parents to leave children entirely in the care of robots.
"These robots are getting more and more sophisticated so they're getting to the point where they could actually look after your children - especially with robots that can make facial expressions and copy emotions. Looking at the short term future, over five to 10, 20 years, I think that people might be very tempted to leave their children with them for long hours and we don't know what the impact of that will be."
What if a child develops a psychological attachment to a robot not its parents? "For me a relationship has to be two-way - [otherwise] it's a deceptive relationship; you can love your robot but it's not going to love you back."
Another area of potential concern is using robots as companions or carers for elderly people, something the Japanese are also starting to look at, according to Sharkey - and an issue of increasing focus in Western societies with growing populations of over-65s.
"The idea is that robots will assist the elderly and I'm all for this in many ways," he said. "If it can keep the elderly out of care homes for longer that's a good thing, but it's a set of ethical trade-offs really because if they're very successful - and as robots get better and better at doing it and in hard economic times - it might be tempting to leave people completely in the hands of robots if they're very safe.
"There the real problem is that old people, like everyone else, need human love and contact and if you're left... completely in the care of robots it would be an appalling way to go and I'd rather have a throat-cutting robot to be honest."
Reading University's Warwick is also convinced robot companions will play a big role in helping older people stay independent for longer. However, he believes such machines will need to be sophisticated enough to identify and respond to human emotions - and therefore project an illusion of emotional engagement - if people are to accept them.
"[These robots] will have to interact more, they will have to be emotional to some extent, they will do things like checking up on security as to who's at the front door, maybe fetching and carrying for the person - I think we're going to need that."
Emotional capabilities could also serve robots well in another big potential market for the technology - a market that won't necessarily be as immediately visible as others: sex.
"I'm surprised frankly that the sex industry hasn't yet cottoned on to robotics," the University of the West of England's Winfield said.
"For better or for worse, whatever your opinion on the subject, it is true that the sex industry has been responsible for a good deal of innovation on the internet, in terms of web technologies and so on," he added.
Sex with robots is inevitable, in the view of Sheffield University's Sharkey - though he does not go as far as anticipating mass human-robot marriages, as another AI researcher, David Levy, has predicted.
"I don't agree with [Levy] that people will marry robots, except slightly perverted people. I can't imagine you'd want to marry it but certainly robots will be used in the sex industry, there's no doubt about that. And you could think of that as dystopian - I would. But people have sex with dolls, so you just make the doll move a little bit and you've got a robot."
And why not? Sex sells in all its forms so there's little reason to think sex with robots will be any different. But don't expect the world of sexbots to be a libertarian's paradise - or not for long anyway.
Like every area of human-robot interaction, ethics seem to be lurking in the background and it may not be too long before this most intimate of interactions stops being a guilt-free way to get your rocks off.
"If a robot isn't much smarter than a washing machine how can you possible abuse it?" says Winfield. "But if a robot is very, very much smarter than a washing machine then at what point would having sex with it be considered to be abusive, an abusive relationship? I think that's a deeply interesting question.
"When I say interesting I don't just mean academically interesting, I think within a decade or two society will have to grapple with that question."
A decade or two may still sound like more than enough time to come up with new rules, and even for legislators and lawyers to hammer out new laws, but Sheffield's Sharkey points to technology's potential for sudden, exponential growth, saying a similar leap forward could propel robotics into the mainstream before the mainstream is ready for robots.
"Bill Gates talks about robotic technologies suddenly taking off the way the internet did - and I've been working in computer science for a very long time, and I've been working in the internet since about 1980 when I was in the United States, and world wide web took me by surprise - it's taken all of us by surprise, it just swept through suddenly and we don't want to be caught sleepwalking into a robotic world like that where we don't have control so the time is now to have the discussions."