Virtual-reality games to assess and habilitate auditory spatial awareness

Period of Performance: 07/15/2017 - 06/30/2018


Phase I STTR

Recipient Firm

VisiSonics Corporation
Principal Investigator


Project Summary: Virtual-reality games to assess and habilitate auditory spatial awareness

Auditory spatial awareness (ASA) is the critically important skill of using sound to understand extrapersonal space and its content. Both explicit (sound localization) and implicit (environmental awareness) aspects of ASA contribute to everyday listening, effort, fatigue, and spatial attention. Because hearing alone provides spatial information in all directions, ASA is critical for awareness outside the visual field and for the experience of sensory "immersion" in both natural and virtual scenes. Self-report studies suggest that hearing impairment, aging, and device use can negatively impact ASA, but the current lack of tools for objective psychophysical assessment in realistic spatial scenes remains a significant barrier to progress in this area.

This Phase I STTR proposal tests the feasibility of using virtual-reality (VR) technology to quantify ASA in simulated but perceptually immersive auditory scenes. Software for 3D audio rendering over earphones and over loudspeaker arrays will be modified to work with established psychophysical paradigms in order to assess listeners' sensitivity to room acoustics and to changes in dynamic auditory scenes. Psychophysical measures will be presented in the context of a simple VR game using a high-frame-rate head-mounted display that tracks head movements in real time. Behavioral testing will involve both earphone and free-field presentation. Earphone testing will present spatial sound in rendered 3D space using VisiSonics' RealSpace 3D software; a commercial advantage of this earphone-based approach is its low cost of implementation in standard research and clinical facilities. In contrast, free-field testing will use a high-resolution loudspeaker array to provide naturally accurate spatial cues for each listener. This approach will eventually facilitate testing with hearing aids and cochlear implants that may not be compatible with earphone presentation.

If successful, the technology developed via this proposal will ultimately provide quantitative psychophysical measures (e.g., thresholds) of ASA for comparison across diverse pediatric and adult patient populations, across signal-processing algorithms for hearing aids, cochlear implants, and 3D audio technology, and across time points in longitudinal studies of ASA habilitation in neurological and audiological patients. Work in Phase II will develop a commercial prototype suitable for such applications and assess performance across diverse populations of listeners. Commercial impact is anticipated in clinical settings, in the design and evaluation of clinical devices, and in perception research for basic science, entertainment, and gaming itself.
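The quantitative threshold measures mentioned above are typically obtained with adaptive psychophysical procedures. As an illustration only (not a description of the proposal's actual software), the sketch below implements a standard 2-down/1-up staircase (Levitt, 1971), which converges on the stimulus level yielding roughly 70.7% correct responses. The simulated listener, its psychometric-function parameters, and all function names here are hypothetical placeholders.

```python
import math
import random

def simulated_listener(angle_deg, threshold_deg=10.0, slope=1.0):
    """Hypothetical listener model: probability of correctly detecting a
    spatial displacement of `angle_deg` in a two-alternative task, using a
    logistic psychometric function floored at chance (0.5)."""
    p = 1.0 / (1.0 + math.exp(-slope * (angle_deg - threshold_deg)))
    return 0.5 + 0.5 * p

def two_down_one_up(start_deg=30.0, step_deg=2.0, n_reversals=8, seed=0):
    """2-down/1-up adaptive staircase: two consecutive correct responses
    make the task harder (smaller displacement); one error makes it easier.
    The threshold estimate is the mean of the later reversal levels."""
    rng = random.Random(seed)
    level = start_deg
    correct_streak = 0
    last_direction = 0          # -1 = descending, +1 = ascending
    reversals = []
    while len(reversals) < n_reversals:
        correct = rng.random() < simulated_listener(level)
        if correct:
            correct_streak += 1
            if correct_streak == 2:
                correct_streak = 0
                if last_direction == +1:      # direction change -> reversal
                    reversals.append(level)
                last_direction = -1
                level = max(0.5, level - step_deg)
        else:
            correct_streak = 0
            if last_direction == -1:          # direction change -> reversal
                reversals.append(level)
            last_direction = +1
            level += step_deg
    # Discard the first two reversals (approach phase) before averaging.
    tail = reversals[2:]
    return sum(tail) / len(tail)
```

In practice, a procedure like this would drive the displacement of a virtual sound source rendered by the 3D audio engine on each trial, with the game interface collecting the listener's two-alternative response.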