Sanders, Jet Gabrielle, Ueda, Yoshiyuki, Yoshikawa, Sakiko et al. (1 more author) (2019) More human than human: a Turing test for photographed faces. Cognitive Research: Principles and Implications. 43 (2019). ISSN 2365-7464
Abstract
Background: Recent experimental work has shown that hyper-realistic face masks can pass for real faces during live viewing. However, live viewing embeds the perceptual task (mask detection) in a powerful social context that may influence respondents' behaviour. To remove this social context, we assessed viewers' ability to distinguish photos of hyper-realistic masks from photos of real faces in a computerised two-alternative forced choice (2AFC) procedure.

Results: In Experiment 1 (N = 120), we observed an error rate of 33% when viewing time was restricted to 500 ms. In Experiment 2 (N = 120), we observed an error rate of 20% when viewing time was unlimited. In both experiments we saw a significant performance cost for other-race comparisons relative to own-race comparisons.

Conclusions: We conclude that viewers could not reliably distinguish hyper-realistic face masks from real faces in photographic presentations. As well as its theoretical interest, failure to detect synthetic faces has important implications for security and crime prevention, which often rely on facial appearance and personal identity being related.
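For illustration only, the sketch below shows how an error rate of the kind reported above might be tabulated from trial-level 2AFC responses, split into own-race and other-race comparisons. The trial record layout and field names are hypothetical and are not taken from the paper; this is a minimal sketch of the summary statistic, not the authors' analysis code.

```python
from dataclasses import dataclass
from typing import Iterable


@dataclass
class Trial:
    """One 2AFC trial (hypothetical record layout, for illustration only)."""
    correct: bool    # True if the viewer correctly identified the mask vs. real face
    own_race: bool   # True for own-race comparisons, False for other-race


def error_rate(trials: Iterable[Trial]) -> float:
    """Proportion of trials answered incorrectly."""
    trials = list(trials)
    return sum(not t.correct for t in trials) / len(trials)


def error_rates_by_race(trials: Iterable[Trial]) -> dict[str, float]:
    """Error rate split into own-race vs. other-race comparisons."""
    trials = list(trials)
    own = [t for t in trials if t.own_race]
    other = [t for t in trials if not t.own_race]
    return {"own-race": error_rate(own), "other-race": error_rate(other)}


if __name__ == "__main__":
    # Toy data: ten trials, illustrating the kind of summary the abstract reports.
    demo = [Trial(correct=c, own_race=r) for c, r in [
        (True, True), (True, True), (False, True), (True, True), (True, True),
        (True, False), (False, False), (False, False), (True, False), (True, False),
    ]]
    print(f"overall error rate: {error_rate(demo):.0%}")
    print(error_rates_by_race(demo))
```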
Metadata
| Item Type: | Article |
| --- | --- |
| Authors/Creators: | Sanders, Jet Gabrielle; Ueda, Yoshiyuki; Yoshikawa, Sakiko et al. (1 more author) |
| Copyright, Publisher and Additional Information: | © The Author(s). 2019 |
| Dates: | Published: 2019 |
| Institution: | The University of York |
| Academic Units: | The University of York > Faculty of Sciences (York) > Psychology (York) |
| Depositing User: | Pure (York) |
| Date Deposited: | 25 Nov 2019 15:50 |
| Last Modified: | 16 Oct 2024 16:13 |
| Published Version: | https://doi.org/10.1186/s41235-019-0197-9 |
| Status: | Published |
| Refereed: | Yes |
| Identification Number: | 10.1186/s41235-019-0197-9 |
| Open Archives Initiative ID (OAI ID): | oai:eprints.whiterose.ac.uk:153884 |