Aim and Scope
This full-day workshop will investigate the sensorimotor and affective mechanisms that underlie human-robot interaction. Humans constantly manifest non-verbal and affective cues and expressions to foster cooperation and mutual understanding and to signal trustworthiness. If these cues are not appropriately reciprocated, however, the interaction can be negatively impacted. Moreover, inappropriate reciprocation, or the lack thereof, may be the result of misperception and/or untimely reactions. Failure to adequately account for biologically plausible perceptual and temporal facets of interactions may detract from the quality of human-robot interaction and hinder progress in the field of social robotics more generally.
Incorporating naturalistic and adaptive forms of sensorimotor and affective human-robot non-verbal communication is challenging because such interaction is highly dependent on the context and on the relationship between the observer and the expresser. Interaction among biological species often requires explicit forms of social signalling, such as nodding, non-verbal gestures, and emotional expressions, the interpretation of which may be highly context-sensitive. Furthermore, naturalistic social signalling may involve a certain degree of mimicry of autonomic responses such as pupil dilation, blinking, and blushing, which, in human-robot interaction, requires the implementation of time-sensitive perceptual mechanisms currently underused in both commercial and research robotics platforms.
In this workshop, we will investigate and discuss to what extent the aforementioned naturalistic social signalling capabilities need to be accounted for in human-robot interaction, which modalities are most relevant, and in what contexts. The workshop will focus strongly on research motivated by naturalistic empirical data. We hope to provide a discussion-friendly environment to connect with research covering complementary interests in the areas of robotics, computer science, psychology, neuroscience, affective computing, and animal learning research.
The primary list of topics includes (but is not limited to):
- Emotion recognition
- Gesture recognition
- Social gaze recognition
- The development of expression and recognition capabilities
- Joint visual attention and activity
- Alignment in social interactions
- Non-verbal cues in human-robot interaction
Canada 150 Research Chair in Intelligent Robotics, University of Waterloo, Canada
My talk will cover some research on (primarily) non-verbal human-robot interaction that I have been involved in over the past few years. This includes applications of robots as home companions with the goal of supporting independent living, as well as the use of robots as social mediators in robot-assisted therapy for children with autism. I will present results from a few studies in these domains and point out challenges for future research.
Faculty of Psychology, University of Vienna, Austria
Children imitate actions that serve no apparent function with regard to the goal of the action sequence, a phenomenon termed over-imitation. In my talk I will present a series of experiments in which we determined relevant characteristics of the model affecting the occurrence and persistence of over-imitation in preschoolers. We show that communication is not necessary to elicit over-imitation, but it does enable children to switch more flexibly between different, more or less efficient, action strategies. Group membership, manipulated through minimal groups, did not affect over-imitation rates when all models were equally communicative. Similarly, children were equally likely to imitate a communicative robot model as a human one. I will discuss our findings in the light of the underlying motivations and potential rationality of over-imitation.
Center for Human Technologies, Istituto Italiano di Tecnologia, Italy
Joint Research Centre, European Commission
Lund University, Sweden
International Research Center for Neurointelligence, The University of Tokyo, Japan
School of Computer Science, University of Hertfordshire, UK
In this presentation, I will summarise the results of some of our recent research on motor resonance in human-robot interaction (HRI) and its potential as a measure of the 'quality of interaction'. I will subsequently attempt to clarify the often tacit assumptions about the potential role of motor resonance measures in HRI, as well as discuss the link between motor resonance and the related notions of entrainment and rapport.
Call for Papers
Participants are invited to submit a short paper (max. 4 pages) following the standard IEEE conference style. Submissions must be in PDF and should be sent by email to email@example.com with [ICDL-EPIROB 2019] in the subject line.
Selected contributions will be presented during the workshop.
Paper submission deadline: 12th May 2019
Notification of acceptance: 7th June 2019
Camera-ready version: 1st August 2019
Workshop: 19th August 2019
For information about registration for this workshop please refer to the ICDL-EPIROB 2019 website.
University of Gothenburg