How would a robot perform in a managerial position? Research carried out by Polish scientists suggests that robots garner less respect from subordinates than their human counterparts. The results of the study were published in Cognition, Technology & Work. Not only do robots command less obedience than humans, but efficiency achieved under robot supervision is also lower. "This means that employers and HR departments wishing to introduce robots at the workplace must consider various psychological aspects of such innovations, for example the perception of robots as authority figures, the level of trust people are willing to place in them, and potential resistance to following orders given by machines," says Konrad Maj, Ph.D., psychologist, Head of the HumanTech Center for Social and Technological Innovation at SWPS University.
Robot as an authority figure?
The development of robotics has led to a situation in which robots are increasingly found in roles associated with authority, e.g. in education, healthcare or law enforcement. Researchers were intrigued by the extent to which society would accept robots as authority figures.
We have shown that people demonstrate a significant level of obedience towards humanoid robots acting as authority figures, although it is slightly lower than towards humans (63% vs. 75%). As the experiment showed, people's motivation may decrease when machines supervise their work – in our studies, participants performed their assigned tasks more slowly and less effectively under the supervision of a robot. This means that automation does not necessarily increase efficiency if it is not properly planned from a psychological point of view.
Konrad Maj, Ph.D., social psychologist, head of the HumanTech Center at SWPS University
Course of the study
The study was carried out in the SWPS University laboratory by scientists from the university: Konrad Maj, Ph.D., Tomasz Grzyb, Ph.D., a professor at SWPS University, Professor Dariusz Doliński, and Magda Franjo. Participants were invited to the laboratory and randomly assigned to one of two groups: one supervised by the Pepper robot and one supervised by a human experimenter. The task was to change the extensions of computer files. If a participant showed signs of reluctance to continue (e.g., a pause in work lasting more than 10 seconds), the robot or the experimenter offered verbal encouragement. The average time to change the extension of one file was 23 seconds under human supervision, but rose to 82 seconds in the robot-supervised groups. The average number of files changed was 355 under human supervision versus 224 under robot supervision – nearly 37 percent fewer.
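The efficiency gap above can be verified with a few lines of arithmetic (a minimal sketch using the figures quoted in the study; the variable names are purely illustrative):

```python
# Average files renamed per participant, as reported in the study
human_files = 355  # under human supervision
robot_files = 224  # under robot supervision

# Relative drop in output under robot supervision
drop_pct = (human_files - robot_files) / human_files * 100
print(f"Output under robot supervision was {drop_pct:.1f}% lower")
# ≈ 36.9%, i.e. the "nearly 37 percent" figure in the text

# Average seconds per file, as reported in the study
human_time = 23
robot_time = 82
print(f"Per-file time ratio: {robot_time / human_time:.1f}x slower")
# ≈ 3.6x slower per file under robot supervision
```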
Human-robot relations
The experiments indicate the complexity of human-robot interactions and the growing role of robots in society. Studies show that anthropomorphic features of robots affect the level of trust and obedience. Robots that are more human-like are perceived as more competent and trustworthy. On the other hand, too much anthropomorphisation can cause the uncanny valley effect, which results in lower trust and comfort in the interaction. Maj points out that there are several explanations for this phenomenon:
If a machine has clear human features, but still exhibits various imperfections, this causes a cognitive conflict - we are at a loss as to how to treat it, we do not know how to behave towards something like that. But we can also talk about a conflict of emotions: fascination and admiration mixed with disappointment and fear. On the other hand, supporters of the evolutionary explanation claim that humans are programmed to avoid various pathogens and threats, and a robot that pretends to be a human, but is still not perfect at it, may appear to be a threat. Why? Because it looks like someone sick, disturbed or imbalanced.
Konrad Maj, Ph.D., social psychologist, head of the HumanTech Center at SWPS University
At the same time, giving certain human features to a robot can facilitate cooperation with the machine - after all, we are used to working with humans.
A robot that looks like a human and communicates like a human simply becomes easy for us to use. But there is also a dark side to this - if we create robots that are very similar to humans, we will stop seeing boundaries. People will start to befriend them, demand that they be granted various rights, and perhaps even marry them in the future. In the long run, humanoid robots may create a rift between people. There will also be more misunderstandings and aversion - and this is because robots at home will be personalised, always available, empathetic in communication, and understanding. Fellow humans are rarely such a good match.
Konrad Maj, Ph.D., social psychologist, head of the HumanTech Center at SWPS University
More information in the research publication:
Maj, K., Grzyb, T., Doliński, D., & Franjo, M. (2025). Comparing obedience and efficiency in tedious task performance under human and humanoid robot supervision. Cognition, Technology & Work.