Abstract:
Trust has been shown to be an important factor in cooperative work between human operators and automation. Previous research has shown that the extremes of mistrust and over-trust are detrimental to effective team play between humans and automated agents, and that the development of calibrated trust is an important goal for automation design and training. More recent work suggests, however, that an equally important factor is the degree to which an automated agent communicates according to established rules and procedures that do not impose a heavy demand on the cognitive resources of the human operator. An experiment was conducted in which users interacted with an automated fault management system that provided information and decision advice concerning aircraft engine malfunctions. The communication style ("etiquette") and the reliability of the advisories provided by the automation were systematically manipulated. The results showed that in some cases violation of "etiquette" reduced the value of trustworthy (high-reliability) automation, leading to less efficient fault diagnosis. Conversely, users worked more efficiently with low-reliability automation when its communication style conformed to the preferred norm than when it violated that norm.