PhD in Educational Neuroscience (PEN)

Students in our pioneering PEN program gain state-of-the-art Cognitive Neuroscience training in how humans learn, with a special strength in the neuroplasticity of visually guided learning processes. While Cognitive Neuroscience includes studies of learning and higher cognitive processes across the lifespan, its sister discipline, Educational Neuroscience, involves intensive study of five core domains crucial to early childhood learning: language and bilingualism, reading and literacy, math and numeracy, science and critical thinking (higher cognition), and social and emotional learning, along with the study of action and visual processing. PEN students become expert in one of the cutting-edge neuroimaging methods of Cognitive Neuroscience (e.g., fNIRS, EEG, fMRI, and beyond), study Neuroethics, gain strong critical analysis and reasoning skills in science, and develop expertise in one of the core content areas of learning identified above. While becoming experts in both contemporary neuroimaging and behavioral experimental science, students also learn powerful, meaningful, and principled ways that science can be translated for the benefit of education and society today.

Dr. Laura-Ann Petitto, Chair, PEN Steering Committee
Dr. Thomas Allen, Program Director, PEN
Dr. Melissa Herzig, Assistant Program Director, PEN


Behavioral and neural responses to American Sign Language avatars

ID: 4035
Status: Ongoing
Start date: February 2020
End Date: September 2021

Description

This is the first neuroscience and Human-Computer Interaction (HCI) study to examine how the biological versus synthesized motion of signing avatars affects neural responses to ASL. Determining which form of avatar motion gives users the embodied engagement that comes with fluent biological motion will help us improve the design of signing avatars as embodied interfaces. The signing avatars can then be used to inform technology design with ASL and novel systems, leading to improved ASL resources and HCI design for signing avatars while answering current neuroscience questions about embodied cognition and sign languages. The study will collect user ratings and EEG signals as deaf fluent signers and hearing non-signers imitate signs from three sources: video of a human signer, an avatar driven by synthesized motion, and an avatar driven by biological motion. We hypothesize, first, that in both the deaf signing and hearing non-signing groups, the human signer video and the biological-motion avatar will elicit significantly better HCI user ratings than the synthesized-motion avatar; and second, that there will be significant differences in brain activity for the synthesized-motion avatar, because the deaf signing group will be able to draw on its ASL experience.
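
To make the planned comparison concrete, the following minimal Python sketch shows one way the within-subject HCI ratings for the three stimulus types could be compared; the data file and column names (human_video, biomotion_avatar, synth_avatar) are hypothetical placeholders, not part of the study protocol.

import pandas as pd
from scipy.stats import friedmanchisquare, wilcoxon

# One row per participant; each column holds that participant's mean HCI rating
# for one stimulus type (all names are illustrative assumptions).
ratings = pd.read_csv("avatar_ratings.csv")

# Omnibus test across the three within-subject stimulus types.
stat, p = friedmanchisquare(ratings["human_video"],
                            ratings["biomotion_avatar"],
                            ratings["synth_avatar"])
print(f"Friedman test: chi2 = {stat:.2f}, p = {p:.4f}")

# Pairwise follow-up for the hypothesized advantage of biological motion.
stat, p = wilcoxon(ratings["biomotion_avatar"], ratings["synth_avatar"])
print(f"Biological vs. synthesized avatar: W = {stat:.2f}, p = {p:.4f}")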

Principal investigators

Priorities addressed

Funding sources


Developmental Neuroplasticity and Timing of First Language Exposure in Infants

ID: 3608
Status: Ongoing
Start date: February 2018
End Date: August 2020

Description

This research project seeks to understand the mechanisms that underlie learning (i.e., language acquisition) in the developing brain in order to improve understanding of typical and atypical cognition. Much controversy exists among both scientists and speech, language, and hearing professionals regarding the optimal age (if any) at which to expose young children to a visual signed language. This study promises to have high impact on the broader society, as its findings will help remove barriers to the successful use of hearing-enhancement technologies by identifying the optimal developmental timing of language exposure in conjunction with cochlear implantation. We use functional near-infrared spectroscopy (fNIRS) and behavioral techniques that are compatible with young children, and particularly with recipients of cochlear implants, to capture the modulation of the language neural networks as a function of different language exposure experiences. Congenitally deaf infants with cochlear implants provide scientists with an extraordinary natural experiment in which exposure to auditory-based and visual-based language permits investigation of controlled timing of linguistic exposure. Thus, in this first targeted study of brain tissue development in young cochlear-implanted infants, we will better understand the neural network that underlies language acquisition and processing in terms of its neurobiological maturational sensitivity as well as its neuroplasticity and resilience to language modality.

Principal investigators

Additional investigators

Priorities addressed


Do expert signers recruit signed phonology processes while solving single digit multiplication problems?

ID: 3984
Status: Ongoing
Start date: August 2020
End Date: August 2021

Description

The project aims to investigate the role of ASL phonology, and its underlying neural substrates, in solving single-digit multiplication problems. In spoken languages, phonology and the left-lateralized language areas are recruited when single-digit multiplication facts are retrieved verbally. The role of ASL phonology in arithmetic generally, and in the retrieval of multiplication facts specifically, is unknown, despite an abundant literature addressing reading in ASL users. In this study, we will recruit deaf participants with severe to profound hearing loss who were exposed to ASL before age 2 and have had substantial exposure throughout their educational upbringing. Participants will have no history of neurological or developmental disorders and no known learning disability. Results will characterize the typical network involved in multiplication for ASL signers and outline a model for then investigating the impact of late ASL exposure or math learning disability in ASL signers.

Principal investigators

Priorities addressed

Funding sources


Impact of Language Experience on Early Numerical Cognition

ID: 3742
Status: Ongoing
Start date: July 2019
End Date: December 2022

Description

The objective of the study is to evaluate longitudinally the impact of language modality and early language experience on the core numerical representation and on the acquisition of the concept of exact number. To do this, 180 children aged 3 to 5 will be followed for up to two years. Leveraging the natural variability occurring within the deaf community, 60 children will be native American Sign Language (ASL) users, 60 will have been exposed to a visual language after 24 months of age (e.g., deaf children with late cochlear implantation and no in-home visual language), and the remaining 60 will be English-speaking children with no hearing loss and no delay in language exposure. Children will be evaluated at roughly 8-month intervals, between two and four times, on basic number skills until they reach a proficient understanding of the exact number concept. They will also be assessed for language skills and general IQ. Parents will complete a comprehensive survey on their child's language use and in-home language. This paradigm will allow us to determine the impact of language modality and proficiency on the developmental trajectory of the core numerical representation. It will also allow us to determine whether the stages for reaching a full understanding of the exact number concept can be delayed or facilitated depending on language modality. Could the use of fingers in ASL to represent numbers facilitate early number concept acquisition? Does a delay in language exposure impact both the core number system and the acquisition of formal number concepts? Are the different stages impermeable to early language experience? What role does language play in the relation between the core numerical representation and the acquisition of the exact number concept? These long-standing questions in the field of numerical cognition can be uniquely answered through the perspective of a visual language and of the timing of language exposure.
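
As an illustration of how such group-specific developmental trajectories could be modeled, the following minimal Python sketch fits a linear mixed-effects growth model; the data file and column names (child_id, age_months, language_group, give_n_level) are hypothetical and do not represent the study's actual analysis plan.

import pandas as pd
import statsmodels.formula.api as smf

# Long-format data: one row per child per testing wave (all names are assumptions).
df = pd.read_csv("number_skills_longitudinal.csv")

# Fixed effects: age, language group, and their interaction (group-specific growth slopes);
# random intercept and random age slope for each child.
model = smf.mixedlm(
    "give_n_level ~ age_months * language_group",
    data=df,
    groups=df["child_id"],
    re_formula="~age_months",
)
result = model.fit()
print(result.summary())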

Principal investigators

Additional investigators

Priorities addressed

Funding sources


Investigation of young ASL signing children's counting skills through online testing

ID: 3985
Status: Ongoing
Start date: July 2020
End Date: March 2021

Description

We are testing the use of online conferencing to evaluate the development of counting skills in 3- to 5-year-old signing children.

Principal investigators

Additional investigators

Priorities addressed

Funding sources


Neural Bases of Tactile and Visual Language Processing

ID: 3609
Status: Ongoing
Start date: April 2017
End Date: December 2020

Description

The proposed experiments in this project build toward addressing questions about neuroplasticity and resilience in the human cortex. To understand the neuroplasticity and resilience of the neural systems that underlie human communication, it is vital for a program of study to include populations that vary in (1) timing of first and second language exposure, (2) modality of language (i.e., tactile, auditory, visual), and (3) sensory experience (DeafBlind, hearing, and deaf populations). The project proposed here focuses specifically on a DeafBlind population that uses a tactile language (i.e., ProTactile ASL, PTASL). We know that the neural networks for human language processing are constrained yet flexible, permitting our species to learn and use a wide range of language structures and languages encoded in multiple modalities (visual, tactile, and auditory). By including DeafBlind PTASL signers in the corpus of cognitive neuroscience literature, we advance understanding of the mechanisms that make this possible and, vitally, illuminate possible overarching principles that guide human neural reorganization and resilience. Furthermore, the proposed experiments begin to address key questions of strong relevance to society (particularly DeafBlind populations) surrounding debates about whether observed neural reorganization represents "maladaptive plasticity" or "functional resilience." By clarifying the scientific principles that underlie neuroplasticity findings and their interpretation, policies revolving around learning (e.g., optimizing language acquisition, sensory intervention for infants, reading practices) can be greatly optimized, and the community may benefit indirectly from this proposed research project.

Principal investigators

Additional investigators

Priorities addressed

Funding sources


Neural Correlates of Biological Motion Perception in Sign Language Users

ID: 3708
Status: Ongoing
Start date: September 2019
End Date: September 2020

Description

Although widely studied in typically developing populations, the neural basis of biological motion perception has not yet been studied in a group that uses action as its primary mode of communication: sign language users. We hypothesized that the continuous perception of the biological motions used in sign language may mean that native signers show an increased ability to extract relevant action information. With this EEG study we test whether Deaf signers' (N = 19) sensorimotor systems are differentially sensitive to biological motion presented in two conditions (scrambled vs. unscrambled), compared with hearing non-signers. We predicted greater central alpha event-related desynchronization (ERD) for the unscrambled condition, due to greater demands on sensorimotor cortices when understanding movement. Everyday actions (e.g., jumping jacks, jump rope) were presented using point-light displays (PLDs). Time-frequency activity in the alpha and beta ranges was computed for each condition at frontal electrodes and at central sites overlying the sensorimotor cortex. Paired comparisons showed significantly greater ERD at central electrode sites in response to scrambled PLDs than to unscrambled PLDs (p < .05, bootstrapped). This finding suggests that deaf signers may recruit sensorimotor systems more strongly in response to unintelligible actions than to coherent actions, contrary to our prediction. Frontal electrodes showed the same pattern of ERD (p < .05, bootstrapped), suggesting that executive functions are involved in parsing scrambled PLDs. The results from Deaf native signers were statistically compared with the EEG responses of hearing non-signers. This work provides the first investigation of sensorimotor EEG in Deaf signers during PLD observation.
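
For readers unfamiliar with the ERD measure, the following minimal MNE-Python sketch shows one way central alpha/beta ERD could be computed for the two PLD conditions; the epochs file, channel names, frequency bands, and baseline window are illustrative assumptions rather than the study's actual pipeline.

import numpy as np
import mne

# Preprocessed, epoched EEG with event labels "scrambled" and "unscrambled"
# (file name and labels are hypothetical).
epochs = mne.read_epochs("sub-01_pld-epo.fif", preload=True)

central_picks = ["C3", "Cz", "C4"]      # sensorimotor sites (assumed montage)
freqs = np.arange(8.0, 31.0, 1.0)       # alpha (8-12 Hz) and beta (13-30 Hz)
n_cycles = freqs / 2.0                  # common Morlet time-frequency trade-off

erd = {}
for cond in ("scrambled", "unscrambled"):
    power = mne.time_frequency.tfr_morlet(
        epochs[cond].pick(central_picks),
        freqs=freqs, n_cycles=n_cycles, return_itc=False, average=True,
    )
    # Express power as proportional change from a pre-stimulus baseline;
    # negative values indicate desynchronization (ERD).
    power.apply_baseline(baseline=(-0.5, 0.0), mode="percent")
    erd[cond] = power

alpha_scr = erd["scrambled"].copy().crop(fmin=8, fmax=12)
alpha_uns = erd["unscrambled"].copy().crop(fmin=8, fmax=12)
print("Mean central alpha ERD (scrambled):  ", alpha_scr.data.mean())
print("Mean central alpha ERD (unscrambled):", alpha_uns.data.mean())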

Principal investigators

Additional investigators

Priorities addressed

Funding sources

Products

Kubicek, E. & Quandt, L. C. (2019). Neural correlates of biological motion perception in sign language users. Data Blitz talk at the Cognitive Neuroscience Society in San Francisco, CA.

Kubicek, E. & Quandt, L. C. (2019). Neural correlates of biological motion perception in sign language users. Presented at the annual meeting of the Cognitive Neuroscience Society in San Francisco, CA.

Quandt, L. C., Kubicek, E., & Lamberton, J. (2020). Superior discrimination of complex biological motions in native ASL signers. Presented at the annual meeting of the Cognitive Neuroscience Society.


Neural Correlates of Observing and Producing Sign Language

ID: 3709
Status: Ongoing
Start date: September 2019
End Date: October 2020

Description

This is a large cognitive neuroscience EEG project in which more than 60 participants are enrolled in a multi-part study examining how signers and non-signers process written English, perceive ASL, and imitate ASL signs.

Principal investigators

Additional investigators

Priorities addressed

Funding sources

Products

Berger, L. & Quandt, L. C. (2018). Sensorimotor EEG indicates deaf signers simulate tactile properties of ASL signs when reading English. Presented at the annual meeting of the Society for Neurobiology of Language in Quebec City, Canada.

Kubicek, E. & Quandt, L. C. (2018). Deaf signers' sensorimotor system activity during perception of one- and two-handed signs. Presented at the annual meeting of the Cognitive Neuroscience Society in Boston, MA.

Kubicek, E. & Quandt, L. C. (2018). Deaf signers' sensorimotor system sensitive to motoric and linguistic parameters of sign language. Presented at SACNAS 2018: The National Diversity in STEM Conference in San Antonio, TX.

Kubicek, E. & Quandt, L. C. (2019). Sensorimotor system engagement during ASL sign perception: An EEG study in deaf signers and hearing non-signers. Cortex, 119, 457-469. Preprint: https://doi.org/10.1101/558833

Quandt, L. C. & Kubicek, E. (2018). Sensorimotor characteristics of sign translations modulate EEG when deaf signers read English. Brain and Language, 187, 9-17.

Quandt, L. C. & Kubicek, E. M. (2017). Motor system contributions to cross-linguistic translation when deaf signers read English. Society for Neuroscience Annual Meeting, Washington, D.C.

Quandt, L. C. & Willis, A. S. (2019). Sensorimotor EEG activity during sign production in deaf signers and hearing non-signers. Presented at the annual meeting of the Society for Neurobiology of Language in Helsinki, Finland.

Willis, A. & Quandt, L. C. (2019). Sign language experience increases motor resonance during imitation of signs. Presented at the annual meeting of the Cognitive Neuroscience Society in San Francisco, CA.


Neural investigation on the impact of a visual language on arithmetic processing: an fMRI approach

ID: 3744
Status: Ongoing
Start date: July 2019

Description

This project investigates the neural networks, brain structures, and cognitive processes involved in arithmetic processing in native ASL signers compared with hearing English speakers. Brain activation will be recorded while adults perform single-digit subtraction and multiplication problems. Different brain areas will be independently localized to identify which cognitive components are involved, and to what extent, depending on language modality. We will adopt a numerical-processing localizer, a verbal rhyming localizer (ASL or English), and a hand-movement localizer. Within the areas identified by the localizers, we expect to find similar numerical quantity processing across language-modality groups. We expect language-based activation for multiplication if both groups rely on verbal retrieval memory, regardless of modality. We expect increased motor-related activation in the ASL signing group, given the observation in previous studies that ASL signers activate motor and supplementary motor areas when processing linguistic information, regardless of whether it is presented in written English or ASL.
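
As a rough illustration of the localizer-plus-contrast logic (not the project's actual pipeline), the following Python sketch fits a first-level GLM with nilearn and computes a multiplication-versus-subtraction contrast; all file names, event labels, and acquisition parameters are hypothetical.

from nilearn.glm.first_level import FirstLevelModel

# Hypothetical single-run, single-subject model (TR, smoothing, and HRF are assumptions).
model = FirstLevelModel(t_r=2.0, smoothing_fwhm=5.0, hrf_model="spm")

# The events file is assumed to list onset, duration, and trial_type columns with
# labels such as "multiplication", "subtraction", "rhyming", and "hand_movement".
model = model.fit("sub-01_task-arith_bold.nii.gz",
                  events="sub-01_task-arith_events.tsv")

# Contrast isolating retrieval-based multiplication relative to procedural subtraction.
z_map = model.compute_contrast("multiplication - subtraction")
z_map.to_filename("sub-01_mult-minus-sub_zmap.nii.gz")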

Principal investigators

Additional investigators

Priorities addressed

Funding sources


Signing Avatars & Immersive Learning (SAIL)

ID: 3707
Status: Ongoing
Start date: August 2018

Description

The aim of the proposed work is to develop and test a system in which signing avatars (computer-animated virtual humans/characters built from motion-capture recordings) help deaf or hearing individuals learn ASL in an immersive virtual environment. The system will be called Signing Avatars & Immersive Learning (SAIL). Interactive speaking avatars have become valuable learning tools, whereas the potential uses of signing avatars have not been adequately explored. Because of the spatial and movement characteristics of natural sign languages, this project leverages the cognitive neuroscience of action perception to test the SAIL system. We will use motion-capture recordings of deaf native signers, signing in ASL, to create signing avatars. The avatars will be placed in a virtual reality landscape accessed via head-mounted goggles. Users will enter the virtual reality environment by wearing the goggles, and their own movements will be captured via a gesture-recognition system (e.g., smart gloves). When using SAIL, users will see a signing avatar from a third-person perspective, and they will also see a virtual version of their own arms from a first-person perspective, matched onto their actual movements in the real world. Using the gesture-recognition system, users will imitate signs and learn through interactive lessons given by the avatars. SAIL helps users visualize and embody a spatial and visual language, creating an embodied, immersive learning environment that may revolutionize ASL learning. SAIL will also provide us with the opportunity to understand the cognitive processes of visual perception of ASL in a controlled 3D digital environment. Following the development of SAIL, we propose an electroencephalography (EEG) experiment to examine how the sensorimotor systems of the brain are engaged by the embodied experiences SAIL provides. The action observation network of the human brain is active during the observation of others' movements; the extent of this activity while viewing another person signing will provide insight into how the observer's own sensorimotor system processes the observed signs within SAIL.
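
To make the interaction loop concrete, here is a purely illustrative Python sketch of the SAIL flow described above (demonstrate a sign, capture the learner's imitation, score it, give feedback); every class, function, and file name is a hypothetical stand-in, since the actual system would be built on a VR engine, motion-capture assets, and gesture-recognition hardware.

from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class SignLesson:
    gloss: str        # e.g., "MOTHER" (illustrative)
    mocap_clip: str   # identifier of the motion-capture clip driving the avatar

def run_sail_session(lessons: List[SignLesson],
                     play_avatar: Callable[[str], None],
                     capture_imitation: Callable[[], object],
                     score_imitation: Callable[[object, str], float],
                     threshold: float = 0.8) -> Dict[str, bool]:
    """Demonstrate each sign on the avatar, capture the imitation, and give feedback."""
    results = {}
    for lesson in lessons:
        play_avatar(lesson.mocap_clip)            # third-person avatar demonstration
        attempt = capture_imitation()             # first-person glove/hand tracking
        score = score_imitation(attempt, lesson.mocap_clip)
        results[lesson.gloss] = score >= threshold
    return results

# Toy usage with stand-in callables in place of the VR, tracking, and scoring components.
if __name__ == "__main__":
    lessons = [SignLesson("MOTHER", "mocap/mother.bvh"),
               SignLesson("BOOK", "mocap/book.bvh")]
    print(run_sail_session(lessons,
                           play_avatar=lambda clip: None,
                           capture_imitation=lambda: "glove-frames",
                           score_imitation=lambda attempt, clip: 0.9))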

Principal investigators

Additional investigators

Priorities addressed

Funding sources

Products

Quandt, L. C. (2019). Embodying sign language: Using avatars, VR, and EEG to design novel learning tools. Center for Adaptive Systems of Brain-Body Interactions Seminar Series, George Mason University, Fairfax, VA.

Quandt, L. C., Malzkuhn, M. (2019). Participant: NSF STEM for All Video Showcase. "SAIL: Signing Avatars & Immersive Learning" dissemination video.

Quandt, L.C. (2019). Signing avatars and embodied learning in virtual reality. NSF AccessCyberlearning 2.0 Capacity Building Institute, University of Washington, Seattle, WA.


The impact of language experience on the neural activations of arithmetical processing

ID: 3606
Status: Ongoing
Start date: March 2018
End Date: October 2021

Description

The aim is to use EEG recordings to investigate the similarities and differences in the neural correlates of single-digit arithmetic in native ASL users and native English speakers.

Principal investigators

Additional investigators

Priorities addressed


The Role of Auditory Experience in the Neurocognitive Systems for Everyday and Effortful Listening

ID: 3587
Status: Ongoing
Start date: January 2018
End Date: December 2019

Description

Current models of auditory cognition suggest that cognitive resources for processing degraded acoustic information are limited, creating a trade-off between effort and comprehension. Indeed, everyday listening frequently occurs under a wide range of inescapable, suboptimal, and adverse conditions, challenges that are exacerbated by reduced hearing acuity and the use of imperfect hearing amplification and prosthetic devices. In a cognitive neuroscience experiment using optical neuroimaging, we assess (A) the effects of early-life sensitive windows on the neuroplasticity and stability of language processing networks in response to early-life, chronic exposure to acoustically degraded speech, and (B) the strength of the relationship between self-reported global health, subjective mental effort ratings, and neural activation patterns across different listening conditions. Advancing these scientific questions allows us to better understand the complex nature of neuroplasticity and early-life sensitive windows for language processing, and ultimately informs us about the cognitive mechanisms that shape spoken language outcomes for hearing aid and cochlear implant users. This work has implications for translational impact across several domains, such as educational practice and policy, approaches to aural (re)habilitation in clinical practice, and assessment of clinical health outcomes. Ultimately, this work will advance scientific and societal questions regarding the role of deafness, mediated by hearing technologies, in cognitive functions such as language processing and comprehension, effort, stress, and fatigue. These advances could improve overall health and quality-of-life outcomes for people with hearing loss.
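
For context on the optical neuroimaging workflow, the following minimal MNE-Python sketch converts a raw fNIRS recording to hemoglobin concentration and epochs it by listening condition; the file path and annotation labels ("clear_speech", "degraded_speech") are assumptions for illustration, not the study's actual code.

import mne

# Load a NIRx recording (directory path is a hypothetical placeholder).
raw = mne.io.read_raw_nirx("sub-01_listening_nirx", preload=True)

# Convert raw intensity to optical density, then to HbO/HbR concentration.
raw_od = mne.preprocessing.nirs.optical_density(raw)
raw_hb = mne.preprocessing.nirs.beer_lambert_law(raw_od)

# Condition markers are assumed to be stored as annotations in the recording.
events, event_id = mne.events_from_annotations(raw_hb)
epochs = mne.Epochs(raw_hb, events, event_id=event_id,
                    tmin=-2.0, tmax=20.0, baseline=(-2.0, 0.0), preload=True)

# Average hemodynamic response for the degraded-speech condition.
print(epochs["degraded_speech"].average())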

Principal investigators

Additional investigators

Priorities addressed

Funding sources

Products

White, B. E. (2018, May). The role of auditory experience on the neurobiological systems for effortful listening. Presented at the Neuroimaging with fNIRS: Basic to Advanced Concepts workshop hosted by NIRx Medical Technologies, Gallaudet University, National Science Foundation and Gallaudet University Science of Learning Center on Visual Language and Visual Learning, and the Gallaudet University Ph.D. in Educational Neuroscience (PEN) Program, Washington, DC.

White, B. E. (2020, February). Listening Effort and Neurocognitive Plasticity in Hearing Aid and Cochlear Implant Users. Presentation at the Department of Linguistics and Cognitive Science, University of Delaware, Newark, DE.

White, B. E., & Langdon, C. (2018, August). Hierarchical processing of degraded speech: A functional near-infrared spectroscopy study. Poster presented at the annual meeting of the Society for the Neurobiology of Language, Québec City, Québec, Canada.

White, B. E., & Langdon, C. (2018, January). Auditory experience and the neurobiological systems for effortful listening: A combined optical neuroimaging and thermal imaging study. Presented at the Mid-Atlantic Symposium on Hearing, College Park, MD.


Scholarship and creative activity

2018

Berteletti, I. (2018, June). Educational Neuroscience, what is it and what it's not. Presented at the University of Trento, Rovereto, Italy.

Parks, A., White, B. E., Lancaster, L., and Bakke, M. (2018, April). The test-retest reliability of the Early Speech Perception Test in adults with severe to profound hearing levels. Poster presentation at the Department of Hearing, Speech, and Language Sciences, Gallaudet University, Washington, DC.

Parks, A., White, B. E., Lancaster, L., and Bakke, M. (2018, February). The role of pure-tone average and auditory linguistic experience on word recognition and pattern perception ability in adults with severe to profound hearing levels. Poster presentation at the Department of Hearing, Speech, and Language Sciences, Gallaudet University, Washington, DC.

White, B. E. (2018, April). Building the visual vocabulary: A resource guide on vocabulary development in young deaf and hard-of-hearing children. Available at https://www.visvoc.bradleywhite.net

White, B. E. (2018, April). Language development timeline (0-5 years old): A resource guide on vocabulary development in young deaf and hard-of-hearing children. Available at https://www.visvoc.bradleywhite.net/assets/files/white16-languagedevelopmenttimeline.pdf

White, B. E. (2018, April). Resting state functional connectivity: Methodological and statistical approaches for functional near-infrared spectroscopy. Presentation at the Language and Educational Neuroscience Laboratory, Washington, DC.

White, B. E. (2018, April). Tips for facilitating vocabulary development: A resource guide on vocabulary development in young deaf and hard-of-hearing children. Available at https://www.visvoc.bradleywhite.net/assets/files/white16-tipsforfacilitatingvocabularydevelopment.pdf