Pal, Anjan orcid.org/0000-0001-7203-7126, Chua, Alton Y.K. and Banerjee, Snehasish orcid.org/0000-0001-6355-0470 (Accepted: 2025) When algorithms and human experts contradict, whom do users follow? Behaviour & Information Technology. ISSN 0144-929X (In Press)
Abstract
Drawing on the theory of planned behavior and risk-taking theory, this research investigates how attitude toward algorithms, attitude toward human experts, and willingness to take risks affect users' intention to follow a recommendation when algorithms and human experts contradict each other. Set in the context of investment decision-making, a 2 (attitude toward algorithms: algorithm aversion vs. algorithm appreciation) × 2 (attitude toward human experts: unfavorable vs. favorable) × 2 (willingness to take risks: low vs. high) quasi-experiment was conducted online (N = 804) in which contradictory recommendations from algorithm and human sources were presented. Favorable attitudes toward algorithms and toward human experts promoted the intention to follow algorithm-generated and human-generated recommendations, respectively. A high willingness to take risks increased the intention to follow recommendations regardless of their source. Moreover, willingness to take risks moderated the relationship between attitude toward algorithms and the intention to follow the algorithm-generated recommendation, as well as that between attitude toward human experts and the intention to follow the human-generated recommendation. While the literature has shed light on how individuals evaluate recommendations from algorithms and humans separately, this is one of the earliest efforts to study the situation in which algorithms contradict humans.
Metadata
| Item Type: | Article |
| --- | --- |
| Authors/Creators: | Pal, Anjan; Chua, Alton Y.K.; Banerjee, Snehasish |
| Copyright, Publisher and Additional Information: | This is an author-produced version of the published paper. Uploaded in accordance with the University's Research Publications and Open Access policy. |
| Keywords: | AI recommendation, AI attitude, algorithm-generated recommendation, decision-making, human-algorithm interaction, investment decision |
| Dates: | Accepted: 2025 |
| Institution: | The University of York |
| Academic Units: | The University of York > Faculty of Social Sciences (York) > The York Management School |
| Depositing User: | Pure (York) |
| Date Deposited: | 04 Jul 2025 08:20 |
| Last Modified: | 04 Jul 2025 08:20 |
| Status: | In Press |
| Refereed: | Yes |
| Open Archives Initiative ID (OAI ID): | oai:eprints.whiterose.ac.uk:228743 |