The technology behind the machines we use to help us do our jobs could soon transform how we interact with each other and with our environment.
In a paper published this week in Nature Communications, researchers from the University of Nottingham, the University of Sydney, the UK’s Royal Veterinary College, and Imperial College London describe the ability of robots to recognise people in photographs and to understand them.
In short, it’s the ability to “read” people’s facial expressions, and it’s this understanding that allows a robot to respond to a human being.
The ability to understand and interact with other people could be a game-changer in the world of artificial intelligence, and researchers from all four institutions are leading the charge to use the technology to make robots friendlier and more intuitive to interact with.
“It’s a big shift in the way we interact in the physical world, because robots can now read our facial expressions,” says co-author Andrew Rowntree, who is the Head of Research at the University.
“The robots don’t experience emotions themselves, but they can recognise the facial expressions of the people around them.”
The researchers also found that they could infer a robot’s state by observing its behaviour.
For example, when a robot is trying to approach someone, its pupils dilate.
This shows that it is trying to get a closer look at the human.
When the robot gets closer, its eyes dart from side to side, indicating that it is seeking closer contact.
Conversely, when its pupils are not dilated, the robot is not attending to the person.
“We’re not just seeing the robot — the robot is seeing a human face,” Rowntree says.
“So, if we want to understand the robot, and we want the robot to understand us, we need to be able to recognise each other.”
Understanding human emotions and their relationships with objects in a virtual world is a key part of understanding what robots will become, Rowntree says.
The researchers were able to use a technique called “facial-emotion mapping” to work out what emotions were being expressed by a robot when it came across a human face.
They showed the robot a photograph of a human face, asked it to identify what emotions were being expressed, and then analysed the robot’s responses as it looked at the photograph.
They found that robots are able to interpret facial expressions to different levels, but that a robot that had previously encountered a human facial expression had a more complex understanding of what was being expressed.
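The paper does not spell out how the facial-emotion mapping works internally, but the idea of matching an observed face to a known emotion can be sketched in a few lines. The feature names, prototype values, and emotion labels below are purely illustrative assumptions, not the team's actual method:

```python
# A minimal sketch of "facial-emotion mapping": hypothetical facial
# features are matched to the nearest emotion prototype. All feature
# names and numbers here are illustrative, not taken from the paper.
import math

# Hypothetical prototypes: (mouth_curve, brow_raise, eye_openness)
EMOTION_PROTOTYPES = {
    "happy":     (0.8, 0.2, 0.6),
    "surprised": (0.3, 0.9, 1.0),
    "sad":       (-0.7, -0.3, 0.4),
    "neutral":   (0.0, 0.0, 0.5),
}

def map_face_to_emotion(features):
    """Return the emotion whose prototype is closest (Euclidean distance)."""
    return min(
        EMOTION_PROTOTYPES,
        key=lambda label: math.dist(features, EMOTION_PROTOTYPES[label]),
    )

print(map_face_to_emotion((0.75, 0.1, 0.55)))  # -> happy
```

In a real system the feature vector would come from a face-analysis pipeline and the mapping would be learned from data rather than hand-written, but the nearest-prototype step captures the basic "read the face, name the emotion" idea the article describes.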
“When a robot sees a human, it doesn’t know what that person is saying, but it knows that the human is expressing a certain emotion, and that emotion is related to that person,” says Rowntree.
“You can understand that when you look at someone in the mirror.
You don’t necessarily know what the emotion is, but you can see the face.”
When a robot does understand what another person is feeling, the researchers say, it becomes easier for people and robots to interact as they work together.
“People will be able to work together, and you can have robots with different levels of AI capabilities working with people,” Rowntree says.
He believes that the ability for robots to understand facial expressions and human emotions could make a big difference to the way robots are used in the future.
“What we’re doing with these robots is building robots with the capacity to interact and be in situations where humans can’t, so they’re not always looking at things from a distance,” he says.
Rowntree is also keen to make sure that the technology is safe for people to use, as it could be dangerous in the wrong hands.
“In the wrong hands, a robot that understands people could be used against them,” he explains.
“I don’t want to see people use these robots as weapons. I want them to be safe in themselves, but I also want them to be safe for other people, and for other robots.”
The team, based at Nottingham’s Department of Psychology, is working on a number of follow-up projects, including “Neural Networks for Social Interaction” and “The Emotion Machine: The Role of Emotion in Social Interactions”.
This work will look at how human emotion, like our facial expressions or body language, can be used by robots to help them understand the people around them.
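How a robot might act on a recognised emotion is not detailed in the article, but the basic step from "emotion label" to "behaviour" can be sketched as a simple policy table. The emotions and actions below are hypothetical examples, not the team's system:

```python
# A hedged sketch of how a detected emotion might steer a robot's
# behaviour. The labels and actions are illustrative assumptions.

RESPONSE_POLICY = {
    "happy":   "approach and engage",
    "sad":     "approach slowly and offer help",
    "angry":   "keep distance and wait",
    "neutral": "continue current task",
}

def choose_action(detected_emotion, default="continue current task"):
    """Pick a behaviour for the robot given an emotion label."""
    return RESPONSE_POLICY.get(detected_emotion, default)

print(choose_action("sad"))  # -> approach slowly and offer help
```

A deployed system would likely blend emotion with body language and context rather than use a fixed lookup, but the table makes the design point concrete: the emotion estimate is an input to behaviour selection, not an end in itself.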
“One of the big things that I wanted to look at was how to use this technology in the right contexts,” Rowntree says, “and so we’ve got a whole bunch of different projects coming up.”