Robot that can feel pain invented by scientists

A future in which androids look and feel so much like humans that they start to believe they are actually alive - as depicted in the film Blade Runner - may soon be reality.

Scientists in Japan have invented a robot that can ‘feel’ pain and is programmed to visibly wince when an electric charge is applied to its synthetic skin.

The team from Osaka University is hoping that coding pain sensors into machines will help them develop empathy for human suffering, so they can act as more compassionate companions.

For lead researcher Prof Minoru Asada, who is also President of the Robotics Society of Japan, the question of whether robots could one day seem human is almost irrelevant.

“In Japan we believe all inanimate objects have a soul, so a metal robot is no different from a human in that respect; there are fewer boundaries between humans and objects,” he said.

In the 1982 film Blade Runner, which was based on the novel ‘Do Androids Dream of Electric Sheep?’ by Philip K Dick, androids became so lifelike it was impossible to tell them apart from humans.

Asked if such a future was possible, Prof Asada said: “I think we are not far away from that technically, but obviously ethically that is another matter.

“We are embedding a touch and pain nervous system into the robot to make the robot feel pain so that it can understand the touch and pain in others. And if this is possible, we want to see if empathy and morality can emerge.

“We are aiming to construct a symbiotic society with artificially intelligent robots, and a robot that can feel pain is a key component of that society.

“Japan is a rapidly ageing society and many senior people are living alone, so these kinds of robots could provide physical and emotional assistance.”

The artificial pain system has been built into an eerily lifelike robot child head called ‘Affetto’, which was unveiled by Osaka engineers in 2018.

By mapping 116 different points on the face, scientists are able to create nuanced expressions such as smiling and frowning, and now wincing, as seen above.

Affetto has soft tactile sensors which can detect both a gentle touch and a painful blow, and induce a range of facial expressions to demonstrate the level of discomfort.

Although a thump today produces only a synthesised reaction, it is hoped that in the future the robot will be able to understand that being hit is harming it, and experience a real kind of pain.

Dr Hisashi Ishihara, who helped design the robot, said such empathy was crucial if robots and humans were to live side by side.

The robot winces when in pain

“Generally, I believe that robots will be more effective in social bonding with humans when they have a more sensitive and expressive body,” he said.

“That’s why I’m trying to develop an expressive android head and sensitive tactile sensors. Of course, I think one day we humans will create robots that are difficult to distinguish from humans.

“However, the problem is that we don’t know exactly what the difference is between robots and humans despite the fact that they are apparently different.

“For example, many android robots can show smiling faces and we can feel something is wrong.”

The strange not-quite-human appearance of some androids has been dubbed the ‘uncanny valley’, and it is proving a sticking point for android engineers: the closer robots come to looking like humans, the more disturbing we seem to find them.

“Unless we can explain it, we cannot fix it,” added Dr Ishihara. “So our team is now investigating the difference between them with a precise analytical approach.”

The new artificial pain system was presented at the recent AAAS meeting in Seattle.