VL2's Dr. Lorna Quandt and Melissa Malzkuhn Win $850,000 NSF Funding Award

A study of ASL training using Virtual Reality

The National Science Foundation has awarded $850,000 over three years to VL2’s Dr. Lorna Quandt and Melissa Malzkuhn for a project entitled “New Dimensions of ASL Learning: Implementing and Testing Signing Avatars & Immersive Learning.”

Known as SAIL 2, this research project will build upon a previous three-year, $300,000 project funded by the NSF. Said Dr. Quandt, “This project is a major advancement of our existing work in this area, and the ultimate goal of this project is to create a powerful, multidimensional ASL-learning game in virtual reality. Also, this funding will provide many enriching opportunities for students and staff at Gallaudet University. We are eager to start work on this project.”

Dr. Quandt, an assistant professor in the Ph.D. in Educational Neuroscience (PEN) program and the director of the Action & Brain Lab, is the Principal Investigator, while Ms. Malzkuhn, creative director of the Motion Light Lab, is the Co-Principal Investigator.

Congratulations to Dr. Quandt and Ms. Malzkuhn!

For more, see the University Communications story: "Quandt, Malzkuhn secure NSF grant for signing avatars research."


Project Abstract

The highly spatial, three-dimensional nature of American Sign Language (ASL) has created a serious barrier to technology-supported ASL instruction. What if ASL learners could access high-quality ASL instruction from native sign language instructors through a virtual reality-based, game-like environment? This project builds on prior work from the NSF-funded Signing Avatars & Immersive Learning (SAIL) project, which yielded a working prototype of an immersive sign language learning environment in virtual reality. The current project expands past the prototype stage into a full-fledged ASL learning experience. In the new version of SAIL, called SAIL 2, the research team is developing a more complete system in which users enter virtual reality and interact with signing avatars (computer-animated virtual humans built from motion capture recordings) who teach users ASL vocabulary. Access to signed language is key to healthy development for many deaf individuals, yet access to high-quality ASL instruction is often limited by time and resources. SAIL 2 sets a foundation for greater access to learning ASL, which has the potential to improve the lives of deaf children and adults. The project focuses on developing and testing this entirely novel ASL learning tool and on fostering the inclusion of underrepresented minorities in STEM. This work has the potential to substantially advance the fields of virtual reality, ASL instruction, and embodied learning.

Immersive virtual reality is particularly well suited for highly spatial signed languages. The SAIL 2 project leverages head-mounted virtual reality and high-quality signing avatars to create a gamified ASL-learning system. SAIL 2 will be the only ASL learning system in virtual reality that does not require the user to wear specialized gloves or other peripheral devices. The project develops a functioning version of the comprehensive SAIL 2 system, with user testing during the design process guiding the details of development. Key features of the system include sign recognition through hand-tracking cameras, corrective feedback, and a gamified experience. Following the design and development of SAIL 2, the research team conducts behavioral research to evaluate the learning outcomes of SAIL 2. Evaluation of specific learning outcomes includes both understanding of ASL vocabulary and accuracy of sign production. Because of the embodied nature of signed language, mechanistic measures of the neural substrates of learning, including engagement of the sensorimotor cortices, are obtained through electroencephalography (EEG). The patterns of neural oscillatory activity provide insight into short-term changes in brain activity associated with using SAIL 2. The cognitive neuroscience experiment builds on previous research identifying the neural processes supporting sign language perception, and overall this project extends technological advances in high-fidelity motion capture recording, avatar creation, and virtual reality.
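
The abstract describes the EEG analyses only at a high level; the oscillatory measure it refers to is commonly quantified as band-limited power. As a minimal, hypothetical sketch (not the project's actual analysis pipeline), the Python snippet below estimates power in the sensorimotor mu band (8-13 Hz) from an EEG signal using Welch's method. The sampling rate, electrode site, and synthetic data are illustrative assumptions.

```python
import numpy as np
from scipy.signal import welch

FS = 250  # assumed EEG sampling rate in Hz (illustrative)

def band_power(signal: np.ndarray, fs: int, lo: float, hi: float) -> float:
    """Integrate the Welch power spectral density over [lo, hi] Hz."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= lo) & (freqs <= hi)
    df = freqs[1] - freqs[0]            # frequency resolution of the PSD
    return float(psd[mask].sum() * df)  # rectangle-rule integral over the band

# Synthetic stand-in for a recording over a sensorimotor site (e.g., electrode C3):
# a 10 Hz "mu-like" rhythm plus noise.
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / FS)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

mu_power = band_power(eeg, FS, 8, 13)
print(f"Mu-band (8-13 Hz) power: {mu_power:.3f}")
```

Comparing such band power between task and baseline periods (mu suppression) is one conventional way to index sensorimotor engagement; the measures and analyses the SAIL 2 team actually uses will be defined in their published work.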

Read about the first SAIL research project: “Signing Avatars & Immersive Learning: Development and Testing of a Novel Embodied Learning Environment.”