News Release

When robot becomes boss: Research on authority, obedience and relationships with machines

Peer-Reviewed Publication

SWPS University

Konrad Maj, PhD, from SWPS University, with a robot taking part in the study

Credit: SWPS University

How does a robot perform as a boss at work? The results of a study by Polish scientists, published in Cognition, Technology & Work, suggest that while robots can command obedience, it is weaker than that commanded by humans. Obedience towards robots is generally lower than towards human authority figures, and work efficiency under robot supervision is lower as well. "For employers and HR departments, this means that the psychological aspects of introducing robots into the work environment must be taken into account: how they are perceived as authority figures, trust in them, and potential resistance to following their orders," says Konrad Maj, PhD, from SWPS University, a psychologist and head of the HumanTech Center for Social and Technological Innovation.

Robot as an authority figure?

The development of robotics has led to robots increasingly appearing in roles associated with authority, e.g. in education, healthcare or law enforcement. The researchers were intrigued by the extent to which society would accept robots as authority figures. "We have shown that people demonstrate a significant level of obedience towards humanoid robots acting as authority figures, although it is slightly lower than towards humans (63% vs. 75%). As the experiment showed, people may lose motivation when machines supervise their work: in our studies, participants performed their assigned tasks more slowly and less effectively under the supervision of a robot. This means that automation does not necessarily increase efficiency if it is not properly planned from a psychological point of view," Maj believes.

Course of the study

The study was carried out in the SWPS University laboratory by scientists from the university: Konrad Maj, PhD, Tomasz Grzyb, PhD, a professor at SWPS University, Professor Dariusz Doliński, and Magda Franjo. Participants were invited to the laboratory and randomly assigned to one of two groups: supervised by the Pepper robot or by a human experimenter. The task was to change the extensions of computer files. If a participant showed signs of reluctance to continue (e.g. a pause in work lasting more than 10 seconds), the robot or the experimenter offered verbal encouragement. The average time to change the extension of one file was shorter under human supervision (23 seconds), while under robot supervision it rose to 82 seconds. Participants supervised by a human changed an average of 355 files; those supervised by a robot changed 224, nearly 37 percent fewer.

Human-robot relations

The experiments highlight the complexity of human-robot interactions and the growing role of robots in society. Studies show that anthropomorphic features affect the level of trust and obedience: robots that are more human-like are perceived as more competent and trustworthy. On the other hand, too much anthropomorphisation can trigger the uncanny valley effect, which lowers trust and comfort in the interaction. Maj points out that there are several explanations for this phenomenon: "If a machine has clearly human features but still exhibits various imperfections, this causes a cognitive conflict: we are at a loss as to how to treat it; we do not know how to behave towards something like that. But we can also speak of a conflict of emotions: fascination and admiration mixed with disappointment and fear. Supporters of the evolutionary explanation, in turn, claim that humans are programmed to avoid pathogens and threats, and a robot that pretends to be human, but is not quite perfect at it, may appear to be a threat. Why? Because it looks like someone sick, disturbed or unbalanced."

At the same time, giving a robot certain human features can make cooperation with the machine easier; after all, we are used to working with humans. "A robot that looks like a human and communicates like a human simply becomes easy for us to use. But there is also a dark side: if we create robots that are very similar to humans, we will stop seeing boundaries. People will start to befriend them, demand that they be granted various rights, and perhaps even marry them in the future. In the long run, humanoid robots may create a rift between people. There will also be more misunderstanding and aversion, because robots owned at home will be personalised, always available, empathetic in communication, and understanding. People are not so well matched," Konrad Maj points out.

References:

Maj, K., Grzyb, T., Doliński, D., & Franjo, M. (2025). Comparing obedience and efficiency in tedious task performance under human and humanoid robot supervision. Cognition, Technology & Work. https://doi.org/10.1007/s10111-024-00787-1


Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.