Huang, G. and Moore, R.K. orcid.org/0000-0003-0065-3311 (2022) Is honesty the best policy for mismatched partners? Aligning multi-modal affordances of a social robot: an opinion paper. Frontiers in Virtual Reality, 3. 1020169. ISSN 2673-4192
Abstract
Spoken interactions between a human user and an artificial device (such as a social robot) have attracted much attention in recent decades (Lison and Meena, 2014; Oracle, 2020). In a shift away from industrial automation robots, social robots are expected to operate in social domains such as the service industry, education, healthcare and entertainment (Bartneck et al., 2020, p.163).
According to Darling's (2016) definition, a social robot is "a physically embodied, autonomous agent that communicates and interacts with humans on an emotional level". Many features play important roles in interactions with a social robot, such as people's prior experience with technology products, their expectations of social robots, the interactional environment, and the robot's own appearance, voice and behaviours. In this last regard, affordance design affects how people perceive a social robot, and such perception in turn shapes their behaviours and experiences. The term "affordance" was coined by the ecological psychologist Gibson (1977), who proposed that our perception of what it is possible to do with objects is shaped by their form. An affordance indicates what users perceive and can do with an object in a given situation; it concerns the action possibilities an environment offers to perception (Matei, 2020).
A strong tendency in social robot affordance design is to make human-robot interaction (HRI) resemble human-human interaction (HHI). Many studies hope that robots designed with anthropomorphic appearances and human-like cognitive behaviours will enable humans to interact with them much as they would with other humans, and even to develop social bonds (Leite et al., 2013; Kahn et al., 2015; Koyama et al., 2017; Ligthart et al., 2018). However, there are concerns about this approach. In practice, conversational interaction between speech-based artificial agents and human users is far from natural, and the language used tends to be formulaic (Moore et al., 2016).
One reason for this is a significant change in the applications of spoken human-agent interaction (HAI) over the evolution of spoken language technology (Moore, 2017a). Compared with the "command and control systems" of the 1970s and contemporary smartphone-based "personal assistants", social robots are expected to be used in more dynamic and open environments. This implies that users' expectations, demands and ways of interacting with spoken agents differ depending on the use case. What has succeeded before in real-time spoken HAI (e.g., voice commands for specific tasks) may not work well for social robots in some contexts. Additionally, a social robot's human-like affordances could be seen as "dishonest", because such signals hide the fact that the robot has limited interactive capabilities and is a "mismatched" conversational partner (Moore, 2015; 2017b). What's more, the approach of constructing a robot by integrating off-the-shelf human-like technologies lacks an appreciation of the function and behaviour of speech within a broader theoretical framework (Moore, 2015).
This paper takes a step back to consider what human users look for when speaking to a social robot. It starts by looking at the nature and the process of spoken interactions. It then discusses why honesty is the best policy for a social robot in HRI. Furthermore, the arguments presented here support the hypothesis that aligning a social robot’s external affordances coherently with internal capabilities can shape its usability and improve human users’ experience in HRI.
Metadata
Item Type: Article
Authors/Creators: Huang, G.; Moore, R.K.
Copyright, Publisher and Additional Information: © 2022 Huang and Moore. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms (https://creativecommons.org/licenses/by/4.0/).
Keywords: social robot; affordance design; honest signals; use cases; internal capabilities
Institution: The University of Sheffield
Academic Units: The University of Sheffield > Faculty of Engineering (Sheffield) > Department of Computer Science (Sheffield)
Funding Information: Engineering and Physical Sciences Research Council (grant numbers EP/S023062/1, 2431584 and 2638499)
Depositing User: Symplectic Sheffield
Date Deposited: 10 Oct 2022 11:08
Last Modified: 17 Jul 2024 10:51
Status: Published
Publisher: Frontiers Media SA
Refereed: Yes
Identification Number: 10.3389/frvir.2022.1020169
Open Archives Initiative ID (OAI ID): oai:eprints.whiterose.ac.uk:191263