Anatomically distinct cortical tracking of music and speech by slow (1-8Hz) and fast (70-120Hz) oscillatory activity
Summary
This summary is machine-generated. Brain activity tracks music and speech envelopes differently across frequency bands. Cortical tracking in the slow frequency band (SFB) shows widespread, rapid responses, while the high frequency band (HFB) is more specialized for speech.
Area Of Science
- Neuroscience
- Auditory Processing
- Signal Processing
Background
- Neural populations in auditory regions track acoustic signal envelopes.
- Previous studies explored oscillatory activity but not the interplay among stimulus type, frequency band, and brain anatomy.
Purpose Of The Study
- Investigate how stimulus type (music vs. speech) and frequency band influence cortical tracking of acoustic signals.
- Examine the anatomical and temporal dynamics of neural tracking in the left cerebral hemisphere.
Main Methods
- Reanalyzed intracranial electrocorticography (ECoG) recordings from 30 subjects.
- Used cross-correlation, density-based clustering, and linear mixed-effects modeling.
- Analyzed neural responses to music and speech stimuli during passive movie viewing.
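The cross-correlation step above can be sketched as follows. This is a minimal illustration on simulated data, not the study's actual pipeline: the sampling rate, the simulated 60 ms lag, the noise level, and all function and variable names are assumptions for the example.

```python
import numpy as np
from scipy.signal import correlate

fs = 500  # sampling rate in Hz (assumed for this sketch)
rng = np.random.default_rng(0)

# Simulate a slowly varying acoustic envelope and a neural response
# that tracks it at a 60 ms lag, plus additive noise.
n = 10 * fs
envelope = np.abs(np.convolve(rng.standard_normal(n), np.ones(50) / 50, "same"))
lag_samples = int(0.06 * fs)
neural = np.roll(envelope, lag_samples) + 0.05 * rng.standard_normal(n)

def tracking_lag(stimulus, response, fs, max_lag_s=0.5):
    """Lag (s) of peak cross-correlation; positive = response lags stimulus."""
    s = (stimulus - stimulus.mean()) / stimulus.std()
    r = (response - response.mean()) / response.std()
    xc = correlate(r, s, mode="full")
    lags = np.arange(-len(s) + 1, len(s))
    keep = np.abs(lags) <= int(max_lag_s * fs)  # restrict to plausible lags
    return lags[keep][np.argmax(xc[keep])] / fs

print(tracking_lag(envelope, neural, fs))  # ≈ 0.06 s, the simulated lag
```

In a study like this one, the same peak-lag estimate would be computed per electrode and frequency band (e.g. SFB power vs. HFB power), which is what allows lag gradients across anatomy to be mapped.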
Main Results
- Widespread left-hemisphere tracking of both music and speech in the slow frequency band (SFB, 1-8Hz) with minimal lag.
- Enhanced tracking in the high frequency band (HFB, 70-120Hz) for speech, concentrated in language areas.
- HFB tracking of speech showed a frontal-to-temporal lag gradient that was absent for music.
Conclusions
- Cortical tracking of naturalistic auditory stimuli is shaped by an interaction between brain region and frequency band.
- Distinct neural dynamics in SFB and HFB reflect specialized processing of music and speech.
- Findings reveal frequency-specific and stimulus-dependent organization of auditory cortical tracking.

