Slow response times undermine trust in algorithmic (but not human) predictions

Publication date: March 2020
Source: Organizational Behavior and Human Decision Processes, Volume 157
Author(s): Emir Efendić, Philippe P.F.M. Van de Calseyde, Anthony M. Evans

Abstract: Algorithms consistently perform well on various prediction tasks, but people often mistrust their advice. Here, we demonstrate one factor that affects people's trust in algorithmic predictions: response time. In seven studies (total N = 1928 with 14,184 observations), we find that people judge slowly generated algorithmic predictions as less accurate and are less willing to rely on them. This effect reverses for human predictions, where slowly generated predictions are judged to be more accurate. In explaining this asymmetry, we find that slower response times signal the exertion of effort for both humans and algorithms. However, the relationship between perceived effort and prediction quality differs for humans and algorithms. For humans, prediction tasks are seen as difficult, and observing effort is therefore positively correlated with the perceived quality of predictions. For algorithms, however, prediction tasks are seen as easy, and effort is therefore uncorrelated with the quality of algorithmic predictions. These results underscore the complex processes and dynamics underlying people's trust in algorithmic (and human) predictions and the cues that people use to evaluate their quality.