Meese, Timothy S, Baker, Daniel Hart orcid.org/0000-0002-0161-443X and Summers, Robert J (2023) Blurring the boundary between models and reality: Visual perception of scale assessed by performance. PLoS ONE. e0285423. ISSN 1932-6203
Abstract
One of the primary jobs of visual perception is to build a three-dimensional representation of the world around us from our flat retinal images. These are a rich source of depth cues but no single one of them can tell us about scale (i.e., absolute depth and size). For example, the pictorial depth cues in a (perfect) scale model are identical to those in the real scene that is being modelled. Here we investigate image blur gradients, which derive naturally from the limited depth of field available for any optical device and can be used to help estimate visual scale. By manipulating image blur artificially to produce what is sometimes called fake tilt shift miniaturization, we provide the first performance-based evidence that human vision uses this cue when making forced-choice judgements about scale (identifying which of an image pair was a photograph of a full-scale railway scene, and which was a 1:76 scale model). The orientation of the blur gradient (relative to the ground plane) proves to be crucial, though its rate of change is less important for our task, suggesting a fairly coarse visual analysis of this image parameter.
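The "fake tilt shift miniaturization" manipulation described above amounts to imposing an artificial blur gradient on a sharp photograph, with blur increasing away from an in-focus band. The paper's own stimulus pipeline is not given here; the sketch below is only a minimal illustration of the idea, assuming a 2-D grayscale image array and using a simple box blur as a stand-in for true defocus blur (the function name and parameters are hypothetical).

```python
import numpy as np

def tilt_shift_blur(img, focus_row, max_radius=6):
    """Apply a vertical blur gradient: rows near focus_row stay sharp,
    and blur radius grows linearly with distance from it, mimicking
    the shallow depth of field of a close-up (miniature) shot.
    img: 2-D grayscale array; returns a blurred float copy."""
    h, w = img.shape
    out = np.empty((h, w), dtype=float)
    half = max(h - focus_row, focus_row)  # normalising distance
    for r in range(h):
        # Blur radius scales with distance from the in-focus row.
        radius = int(round(max_radius * abs(r - focus_row) / half))
        if radius == 0:
            out[r] = img[r]
            continue
        # Horizontal box blur as a crude proxy for defocus blur.
        k = 2 * radius + 1
        padded = np.pad(img[r].astype(float), radius, mode="edge")
        out[r] = np.convolve(padded, np.ones(k) / k, mode="valid")
    return out
```

Rotating the image before applying this gradient would change the blur orientation relative to the ground plane, the parameter the study found to be crucial; varying `max_radius` changes the rate of change, which mattered less for the task.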
Metadata
| Field | Value |
|---|---|
| Item Type | Article |
| Authors/Creators | Meese, Timothy S; Baker, Daniel Hart; Summers, Robert J |
| Copyright, Publisher and Additional Information | © 2023 Meese et al. |
| Institution | The University of York |
| Academic Units | The University of York > Faculty of Sciences (York) > Psychology (York) |
| Depositing User | Pure (York) |
| Date Deposited | 09 May 2023 08:40 |
| Last Modified | 16 Oct 2024 19:11 |
| Published Version | https://doi.org/10.1371/journal.pone.0285423 |
| Status | Published |
| Refereed | Yes |
| Identification Number | 10.1371/journal.pone.0285423 |
| Open Archives Initiative ID (OAI ID) | oai:eprints.whiterose.ac.uk:199004 |