Auditory cortex encodes lipreading information through spatially distributed activity

Affiliations
  • 1Department of Psychology, University of Michigan, Ann Arbor, MI 48109, USA.
  • 2Department of Neurology, University of Michigan, Ann Arbor, MI 48109, USA.
  • 3Henry Ford Hospital, Detroit, MI 48202, USA; Department of Neurology, Wayne State University School of Medicine, Detroit, MI 48201, USA.
  • 4Department of Psychology, University of Michigan, Ann Arbor, MI 48109, USA. Electronic address: djbrang@umich.edu.

Abstract

Watching a speaker’s face improves speech perception accuracy. This benefit is enabled, in part, by implicit lipreading abilities present in the general population. While it is established that lipreading can alter the perception of a heard word, it is unknown how these visual signals are represented in the auditory system or how they interact with auditory speech representations. One influential, but untested, hypothesis is that visual speech modulates the population-coded representations of phonetic and phonemic features in the auditory system. This model is largely supported by data showing that silent lipreading evokes activity in the auditory cortex, but these activations could alternatively reflect general effects of arousal or attention or the encoding of non-linguistic features such as visual timing information. This gap limits our understanding of how vision supports speech perception. To test the hypothesis that the auditory system encodes visual speech information, we acquired functional magnetic resonance imaging (fMRI) data from healthy adults and intracranial recordings from electrodes implanted in patients with epilepsy during auditory and visual speech perception tasks. Across both datasets, linear classifiers successfully decoded the identity of silently lipread words using the spatial pattern of auditory cortex responses. Examining the time course of classification using intracranial recordings, lipread words were classified at earlier time points relative to heard words, suggesting a predictive mechanism for facilitating speech. These results support a model in which the auditory system combines the joint neural distributions evoked by heard and lipread words to generate a more precise estimate of what was said.
