Autistic young people adaptively use gaze to facilitate joint attention during multi-gestural dyadic interactions

Authors:

Nathan Caruana¹, Patrick Nalepka¹, Glicyr A. Perez¹, Christine Inkley¹, Courtney Munro¹, Hannah Rapaport², Simon Brett¹, David M. Kaplan¹, Michael J. Richardson¹, Elizabeth Pellicano¹,³

Affiliation:

1. Macquarie University, Australia

2. University of Cambridge, UK

3. University College London, UK

Abstract

Autistic people often experience difficulties navigating face-to-face social interactions. Historically, the empirical literature has characterised these difficulties as cognitive ‘deficits’ in social information processing. However, the empirical basis for such claims is lacking, with most studies failing to capture the complexity of social interactions, often distilling them into singular communicative modalities (e.g. gaze-based communication) that are rarely used in isolation in daily interactions. The current study examined how gaze was used in concert with communicative hand gestures during joint attention interactions. We employed an immersive virtual reality paradigm, where autistic (n = 22) and non-autistic (n = 22) young people completed a collaborative task with a non-autistic confederate. Integrated eye-, head- and hand-motion tracking enabled dyads to communicate naturally with each other while offering objective measures of attention and behaviour. Autistic people in our sample were similarly, if not more, effective in responding to hand-cued joint attention bids compared with non-autistic people. Moreover, both autistic and non-autistic people demonstrated an ability to adaptively use gaze information to aid coordination. Our findings suggest that the intersecting fields of autism and social neuroscience research may have overstated the role of eye gaze during coordinated social interactions.

Lay abstract

Autistic people have been said to have ‘problems’ with joint attention, that is, looking where someone else is looking. Past studies of joint attention have used tasks that require autistic people to continuously look at and respond to eye-gaze cues. But joint attention can also be achieved using other social cues, like pointing. This study looked at whether autistic and non-autistic young people use another person’s eye gaze during joint attention in a task that did not require them to look at their partner’s face. In the task, each participant worked together with their partner to find a computer-generated object in virtual reality. Sometimes the participant had to help guide their partner to the object, and other times, they followed their partner’s lead. Participants were told to point to guide one another but were not told to use eye gaze. Both autistic and non-autistic participants often looked at their partner’s face during joint attention interactions and were faster to respond to their partner’s hand-pointing when the partner also looked at the object before pointing. This shows that autistic people can and do use information from another person’s eyes, even when they don’t have to. It is possible that, by not forcing autistic young people to look at their partner’s face and eyes, they were better able to gather information from their partner’s face when needed, without being overwhelmed. This shows how important it is to design tasks that provide autistic people with opportunities to show what they can do.

Funder

Australian Research Council

Macquarie University

Publisher

SAGE Publications

Subject

Developmental and Educational Psychology
