A humanoid robot tricked people into thinking it was friendly: how did it do it, and what could this lead to?

iCub, the robot researchers used as a "digital fraudster," is a child-sized humanoid created at the Italian Institute of Technology in Genoa to study social interactions between humans and robots. The android stands 1.1 m tall, has a human-like face with camera eyes that let it make visual contact with people, and has 53 degrees of freedom that allow it to perform complex tasks and mimic the behaviour of ordinary people.

Researchers can program iCub to behave in a surprisingly human way. Its abilities were already on display in 2016 on the show "Italia's Got Talent," where the robot performed tai chi movements and impressed the judges with its conversational skills.

How was the study conducted?

In a new study, engineers programmed iCub to interact with human participants while they watched a series of short videos. In some experiments, iCub was programmed to behave in a human-like way: it greeted participants as they entered the room and responded to the videos with vocalizations of joy, surprise and awe. In other tests, the robot was programmed to behave more like a machine, ignoring the people nearby and emitting stereotyped, mechanical-sounding signals.

How did the robot deceive people?

In the first series of experiments, iCub was programmed to greet people as they entered, introduce itself and ask their names. During these interactions, the robot moved its camera "eyes" to maintain visual contact with the participants. Throughout the videos, it continued to act in a human-like way, reacting to the footage as a person would. "It laughed when there was a funny scene in the film, or acted as if it were in awe of a beautiful visual scene," the researchers said.

In the second series of experiments, iCub did not interact with the participants, and its only reaction to the scenes during video viewing was to emit mechanical sounds, including squeaks. In these experiments, the cameras in iCub's eyes were also disabled, so the robot could not maintain visual contact.

What did the participants think?

Before and after the viewing of several short videos featuring animals, the researchers assessed what 41 human participants thought about iCub's state of mind using a short questionnaire. The questions and answer options were worded so that participants could indicate whether they believed iCub was acting of its own will or merely following its programming.

Those who interacted with the human-like version of iCub said that the robot reacted to the videos and to people "as it wanted." Participants who first interacted with the machine-like version and then with the human-like one also came to believe that the robot had its own behaviour, which it consciously chose.

One of the main conclusions the scientists drew is that people are more likely to engage with robots that behave in a human-like way.