Project details
Description
Comprehension of speech in background noise can be challenging, especially for people with hearing impairment who rely on a hearing aid. However, seeing a speaker’s lip movements can increase speech comprehension, an effect enabled by the integration of auditory and visual information. During speech comprehension, slow oscillations in the brain, phase-aligned to the speech envelope, appear to be relevant for understanding. Strikingly, recent research has found cortical synchronization to unheard auditory speech when only videos of lip movements are presented. Beyond this articulatory information, it has also been shown that listeners can benefit from a visually presented speech envelope. The advantage of presenting the auditory speech envelope visually is that the information carried by the two modalities is reduced to purely auditory information. In this MEG study, we will be the first to present a circle corresponding to the auditory speech envelope instead of lip movements while recording the underlying brain activity. For the presentation of this circle, we hypothesize enhanced comprehension at the neural level, reflected in stronger cortical speech synchronization under challenging listening conditions. To better understand these processes, the timing and the location of the circle will be varied. Thereby, we will gain insights into the neuronal basis of audiovisual integration and take first steps towards a visually enhanced hearing aid.
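The core of the visual stimulus described above is a circle whose size follows the auditory speech envelope. The study itself does not publish its stimulus code here; as a minimal illustrative sketch (all function names and parameters, such as the 8 Hz cutoff and the pixel radii, are assumptions, not the project’s actual settings), the envelope can be extracted via the Hilbert transform, low-pass filtered to the slow modulations relevant for cortical tracking, and mapped onto a circle radius:

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

def speech_envelope(audio, fs, cutoff=8.0):
    """Amplitude envelope: magnitude of the analytic signal,
    low-pass filtered to the slow (< ~8 Hz) modulations that
    cortical speech-tracking studies typically focus on."""
    env = np.abs(hilbert(audio))
    b, a = butter(2, cutoff / (fs / 2), btype="low")
    return filtfilt(b, a, env)

def envelope_to_radius(env, r_min=20.0, r_max=100.0):
    """Map the normalized envelope onto circle radii (arbitrary
    pixel units): a louder moment yields a larger circle."""
    env = (env - env.min()) / (env.max() - env.min() + 1e-12)
    return r_min + env * (r_max - r_min)

# Demo with a synthetic amplitude-modulated tone (4 Hz modulation,
# roughly the syllable rate of natural speech).
fs = 1000
t = np.arange(0, 2.0, 1 / fs)
audio = (1 + np.sin(2 * np.pi * 4 * t)) * np.sin(2 * np.pi * 200 * t)
radii = envelope_to_radius(speech_envelope(audio, fs))
```

Each frame of the visual stimulus would then draw a circle with the radius corresponding to the current envelope sample; varying the timing of this mapping (e.g. delaying `radii` relative to the audio) corresponds to the timing manipulation mentioned above.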
| Status | Completed |
|---|---|
| Actual start/end | 1/12/21 → 31/10/25 |
Publications
- Ocular speech tracking persists in blindness, but its dynamics and oculo-cerebral connectivity depend on visual status. Benz, K. R., Reitinger, L., Schmidt, F., Bottari, D., Hauswald, K. A., Collignon, O. & Weisz, N., 21 Oct. 2025, bioRxiv, 30 pp. Publication: Working paper/Preprint › Preprint
- Influence of visual analogue of speech envelope, formants, and word onsets on word recognition is not pronounced. Benz, K. R., Hauswald, A. & Weisz, N., May 2025, in: Hearing Research. 460, 7 pp., 109237. Publication: Contribution to journal › Article › Peer-reviewed. Open Access
- Eye Movements in Silent Visual Speech Track Unheard Acoustic Signals and Relate to Hearing Experience. Benz, K. R., Hauswald, A., Suess, N., Gehmacher, Q., Demarchi, G., Schmidt, F., Herzog, G., Rösch, S. & Weisz, N., 14 Apr. 2025, in: eNeuro. 12, 4, 12 pp., ENEURO.0055-25.2025. Publication: Contribution to journal › Article › Peer-reviewed. Open Access
Activities
- 2 poster presentations
- Eye Movements in Silent Visual Speech Track Unheard Acoustic Signals and Relate to Hearing Experience. Benz, K. R. (Speaker), 11 July 2024. Activity: Guest lecture or talk › Poster presentation › science to science / art to art
- Speech tracking in eye movements. Benz, K. R. (Presenter), 14 July 2023. Activity: Guest lecture or talk › Poster presentation › science to science / art to art