Project Details
Description
Comprehension of speech in background noise can be challenging, especially for listeners with hearing impairment and for those relying on a hearing aid. However, seeing a speaker's lip movements can increase speech comprehension, enabled by the integration of auditory and visual information. Slow-frequency oscillations in the brain, phase-aligned to the speech envelope, appear to be relevant for speech comprehension. Strikingly, recent research has found cortical synchronization to unheard auditory speech when only videos of lip movements are presented. Beyond such articulatory information, listeners have also been shown to benefit from a visually presented speech envelope. The advantage of presenting the auditory speech envelope visually is that the information from two modalities is reduced to purely auditory information. In this MEG study, we will be the first to present a circle corresponding to the auditory speech envelope instead of lip movements while recording the underlying brain activity. For the presentation of this circle, we hypothesize enhanced comprehension at the neural level, reflected in stronger cortical speech synchronization under challenging listening conditions. To better understand these processes, the timing and the location of the circle will be varied. Thereby, we will gain insights into the neural basis of audiovisual integration and take first steps towards a visually enhanced hearing aid.
| Status | Finished |
|---|---|
| Effective start/end date | 1/12/21 → 31/10/25 |
-
Ocular speech tracking persists in blindness, but its dynamics and oculo-cerebral connectivity depend on visual status
Benz, K. R., Reitinger, L., Schmidt, F., Bottari, D., Hauswald, K. A., Collignon, O. & Weisz, N., 21 Oct 2025, bioRxiv, 30 p. Research output: Working paper/Preprint › Preprint
-
Influence of visual analogue of speech envelope, formants, and word onsets on word recognition is not pronounced
Benz, K. R., Hauswald, A. & Weisz, N., May 2025, In: Hearing Research. 460, 7 p., 109237. Research output: Contribution to journal › Article › peer-review
Open Access -
Eye Movements in Silent Visual Speech Track Unheard Acoustic Signals and Relate to Hearing Experience
Benz, K. R., Hauswald, A., Suess, N., Gehmacher, Q., Demarchi, G., Schmidt, F., Herzog, G., Rösch, S. & Weisz, N., 14 Apr 2025, In: eNeuro. 12, 4, 12 p., ENEURO.0055-25.2025. Research output: Contribution to journal › Article › peer-review
Open Access
Activities
- 2 Poster presentations
-
Eye Movements in Silent Visual Speech track Unheard Acoustic Signals and Relate to Hearing Experience
Benz, K. R. (Speaker)
11 Jul 2024. Activity: Talk or presentation › Poster presentation › science to science / art to art
-
Speech tracking in eye movements
Benz, K. R. (Presenter)
14 Jul 2023. Activity: Talk or presentation › Poster presentation › science to science / art to art