13th Speech in Noise Workshop, 20-21 January 2022, Virtual Conference

P02 Audiotactile speech perception and neural mechanisms

Pierre H. Guilleminot
Imperial College London, UK

Tobias Reichenbach
FAU Erlangen-Nuremberg, Germany


Background: Speech involves a hierarchical structure of information, ranging from phonemes to syllables, words, and sentences. These different units of information need to be segmented in order to be processed by the brain. The segmentation presumably relies on oscillations in the delta and theta frequency ranges (1-4 Hz and 4-8 Hz) in the auditory cortex, which track incoming speech at the rhythms of syllables and words. The tracking in the theta range plays a functional role in speech processing, as its modulation using transcranial current stimulation has been found to affect speech comprehension. Because these cortical oscillations can also be influenced by somatosensory stimulation, we wondered whether such stimulation could impact speech comprehension as well.

Methods: We delivered sparse vibrotactile pulses to the hand of subjects while they listened to speech in background noise. The pulses were aligned to the centres of syllables and shifted in time to study the effect of different lags between the two sensory streams. We assessed the participants' comprehension of the speech signal. Furthermore, we studied the neural encoding of speech and of the vibrotactile pulses through electroencephalography (EEG).
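As an illustration of the stimulus timing described above, the following minimal Python sketch shows how a vibrotactile pulse train could be derived from syllable centres and shifted by a fixed lag. The syllable times, the function name pulse_onsets, and the candidate lags are hypothetical placeholders, not the study's actual pipeline.

    import numpy as np

    # Hypothetical syllable perceptual centres (in seconds); in practice these
    # would come from an automatic syllable-segmentation step on the speech signal.
    syllable_centres = np.array([0.21, 0.47, 0.78, 1.05, 1.33])

    def pulse_onsets(centres, lag_s):
        """Shift the vibrotactile pulse train relative to the syllable centres by a fixed lag."""
        return centres + lag_s

    # Example: candidate lags spanning roughly plus/minus one tenth of a second
    for lag in (-0.1, 0.0, 0.1):
        print(lag, pulse_onsets(syllable_centres, lag))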

Results: We found that tactile pulses presented at the rate of syllables can modulate and even improve speech-in-noise comprehension. The enhancement occurred when the pulses were aligned with the perceptual centres of the syllables, without temporal delay. The neural responses to both speech and the vibrotactile pulses were modulated by the delay between the two sensory streams, displayed audiotactile integration, and reflected the behavioural modulation of speech comprehension. Finally, we demonstrated that the subjects' comfort in understanding speech could be predicted from the electrophysiological markers of multisensory integration.
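The abstract does not specify how the neural encoding was estimated; a common approach for relating stimulus features (speech envelope, pulse train) to EEG is a lagged linear model fitted with ridge regression, sketched below under that assumption. All signal names, the sampling rate, and the regularisation value are illustrative placeholders.

    import numpy as np

    rng = np.random.default_rng(0)
    fs = 64                       # illustrative EEG sampling rate in Hz
    n = fs * 60                   # one minute of data
    envelope = rng.random(n)      # placeholder for the speech envelope
    pulses = np.zeros(n)
    pulses[::fs // 4] = 1.0       # stand-in for a ~4 Hz vibrotactile pulse train
    eeg = rng.standard_normal(n)  # placeholder single-channel EEG

    def lagged_design(x, max_lag):
        """Stack time-lagged copies of a stimulus feature as regression columns."""
        return np.column_stack([np.roll(x, k) for k in range(max_lag)])

    # Design matrix with lags for both features; ridge-regularised least squares.
    X = np.hstack([lagged_design(envelope, 16), lagged_design(pulses, 16)])
    lam = 1.0                     # would be cross-validated in practice
    w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ eeg)
    # w holds the estimated response kernels for the envelope and the pulse train.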

Conclusions: Our results provide evidence for the role of cortical oscillations and vibrotactile information in speech processing. The observed enhancement of speech comprehension suggests that such vibrotactile stimulation might be employed in auditory prostheses to help people with hearing impairment better understand others in noisy environments.
