The ability to understand a talker in a noisy environment, such as a crowded restaurant, is one of the most impressive perceptual feats of the human brain. It is especially impressive that, using input from the two ears, the brain can leverage spatial acoustic information to parse an auditory scene. However, listeners with hearing loss, including cochlear implant (CI) users, struggle with this task. The overarching goal of my research is to design and test strategies for improving outcomes in clinical populations who struggle with spatial hearing. My dissertation project, currently underway, includes simultaneous functional near-infrared spectroscopy (fNIRS) and electroencephalography (EEG) recordings in both normal-hearing listeners and bilateral cochlear implant users. Using this combined neuroimaging approach, I ask how listening task performance, electrophysiological responses, and hemodynamic responses are affected by spatial acoustic information.
More broadly, my research program focuses on understanding the neural processes underlying auditory attention. This understanding is crucial for advancing the treatment of hearing loss, as well as of neurological and developmental disorders such as ADHD and Autism Spectrum Disorder. In the future, I plan to expand my methodological toolkit to include new neuroimaging techniques, pursue new clinical applications, and address more fundamental questions in hearing science.
