Perceived Agency Changes Trust in Robots



Frazier, Chelsea




This thesis experimentally investigated how people’s perception of a robot’s compliance with (or violation of) social norms affects their evaluations of the robot’s perceived agency and their trust in it. In Experiment 1, participants reported higher perceived agency and trust for a norm-conforming robot than for a norm-violating robot. Furthermore, perceived agency was positively correlated with trust regardless of how closely the robot followed norms, suggesting that the more people see a robot as having agency, the more they trust it. Experiment 2, which specifically examined the effect of negative attitudes on the relationships among social norms, perceived agency, and trust, replicated Experiment 1 and found that participants’ negative attitudes toward robots did not affect their trust in robots. These findings help explain situations in which people decide whether to trust a norm-conforming robot with perceived agency, regardless of whether they feel negatively about robots. The results suggest that robot designers should pay careful attention to how they integrate perceived-agency cues. This is particularly important for safety-critical robots in health care or the military, because the more perceived agency a robot displays, the more people seem to trust it, even though agency cues may not reflect competence or ethics. Ideally, robot designers should understand the implications of deploying a robot that is high in perceived agency but low in capability, or conversely, low in perceived agency but high in capability. Future HRI research should place specific emphasis on practical tools for measuring perceived agency across a diverse set of robots.



Trust, Social norms, Perceived agency, Human-Robot Interaction