Behavioral and neural responses to American Sign Language avatars

ID: 4035
Status: Ongoing
Start date: February 2020
End date: September 2021

Description

This is the first neuroscience and Human-Computer Interaction (HCI) study to examine how the biological and synthesized motion of signing avatars affects neural responses to American Sign Language (ASL). Identifying which ASL technology modality gives users the embodied-cognition engagement that comes with fluent biological motion will help improve the design of signing avatars as embodied interfaces. Those avatars can in turn inform ASL-based technology design and novel systems, leading to improved ASL resources and better HCI design for signing avatars while addressing open neuroscience questions about embodied cognition and sign languages.

The study will collect user ratings and EEG signals as fluent deaf signers and hearing non-signers imitate signs presented in three conditions: video of a human signer, a synthesized-motion avatar, and a biological-motion avatar. Our hypotheses are, first, that in both the deaf signing and hearing non-signing groups, the human signer video and the biological-motion avatar will elicit significantly higher HCI user ratings than the synthesized-motion avatar; and second, that brain activity will differ significantly for the synthesized-motion avatar in the deaf signing group, because those participants can draw on their ASL experience.
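The first hypothesis crosses two participant groups with three stimulus conditions and predicts a rating advantage for the two biologically grounded conditions over the synthesized-motion avatar. As a minimal sketch of how that comparison could be organized (all names and numbers below are hypothetical illustrations, not project code or collected data):

```python
from statistics import mean

# Hypothetical 2 (group) x 3 (stimulus) design for the rating hypothesis.
GROUPS = ["deaf_signers", "hearing_nonsigners"]
CONDITIONS = [
    "human_video",
    "biological_motion_avatar",
    "synthesized_motion_avatar",
]

def rating_gap(ratings_by_condition):
    """Mean rating advantage of each biologically grounded condition
    over the synthesized-motion baseline (hypothesis predicts gaps > 0)."""
    baseline = mean(ratings_by_condition["synthesized_motion_avatar"])
    return {
        cond: mean(scores) - baseline
        for cond, scores in ratings_by_condition.items()
        if cond != "synthesized_motion_avatar"
    }

# Placeholder Likert ratings (1-5), purely illustrative.
example = {
    "human_video": [5, 4, 5, 4],
    "biological_motion_avatar": [4, 4, 5, 4],
    "synthesized_motion_avatar": [3, 2, 3, 3],
}
print(rating_gap(example))
# → {'human_video': 1.75, 'biological_motion_avatar': 1.5}
```

In practice the per-condition comparison would be run separately within each group and tested with appropriate inferential statistics rather than raw mean differences.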

Principal investigators

Priorities addressed

Funding sources