ENS, Salle Séminaire du DEC, 29 rue d'Ulm, 75005 Paris
Temporal integration is fundamental to sensory processing. In the auditory domain, the brain must integrate across hierarchically organized structures (e.g., phonemes, syllables, and words) spanning tens to hundreds of milliseconds to derive meaning from sound. Yet surprisingly little is known about the specific timescales over which different regions of the human auditory cortex integrate information, in part because of their complex, nonlinear tuning for natural sounds. In this talk, I will describe a new method for estimating neural integration windows using natural stimuli (the temporal context invariance paradigm). The method is conceptually simple and general, and thus applicable to virtually any brain region, sensory domain, or temporally precise recording modality. By applying it to intracranial recordings from human neurosurgical patients, we have found that the human auditory cortex integrates hierarchically across diverse timescales spanning ~50 to 400 ms, with substantially longer integration windows in non-primary regions bilaterally. Moreover, we have found that neural populations with short and long integration windows exhibit distinct functional properties: short-integration electrodes (less than ~200 ms) show prominent spectrotemporal modulation selectivity, while long-integration electrodes (greater than ~200 ms) show prominent category selectivity. Together, these findings reveal how multiscale integration organizes auditory computation in the human brain.
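The core logic of the temporal context invariance idea can be illustrated with a toy simulation. The sketch below is an assumption-laden illustration, not the authors' actual analysis pipeline: it models a "neuron" as a moving-average filter with a fixed integration window, presents the same stimulus segments in two different random orders (contexts), and correlates the segment-aligned responses across contexts. When segments are much longer than the neuron's integration window, responses are context-invariant and the cross-context correlation is high; when segments are shorter than the window, context leaks in and the correlation drops. All function names (`simulate_response`, `cross_context_correlation`) and parameter choices here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_response(stimulus, window):
    # Toy neuron: response at each time point is a moving average
    # over the past `window` samples (its integration window).
    kernel = np.ones(window) / window
    return np.convolve(stimulus, kernel, mode="same")

def cross_context_correlation(window, seg_len, n_segs=200):
    # Build two stimulus sequences that share the same segments
    # but present them in different random orders (two "contexts").
    segs = [rng.standard_normal(seg_len) for _ in range(n_segs)]
    order_a = rng.permutation(n_segs)
    order_b = rng.permutation(n_segs)
    seq_a = np.concatenate([segs[i] for i in order_a])
    seq_b = np.concatenate([segs[i] for i in order_b])

    resp_a = simulate_response(seq_a, window)
    resp_b = simulate_response(seq_b, window)

    # Re-align the responses segment by segment, so the same
    # segment's response in context A lines up with context B.
    a_by_seg = {i: resp_a[p * seg_len:(p + 1) * seg_len]
                for p, i in enumerate(order_a)}
    b_by_seg = {i: resp_b[p * seg_len:(p + 1) * seg_len]
                for p, i in enumerate(order_b)}
    a = np.concatenate([a_by_seg[i] for i in range(n_segs)])
    b = np.concatenate([b_by_seg[i] for i in range(n_segs)])

    # Context invariance: correlation of responses to identical
    # segments embedded in different surrounding contexts.
    return np.corrcoef(a, b)[0, 1]

# Segments much longer than the window -> near context-invariant.
c_long = cross_context_correlation(window=8, seg_len=128)
# Segments as short as the window -> context contaminates the response.
c_short = cross_context_correlation(window=8, seg_len=8)
```

In the actual paradigm, sweeping segment duration and finding the shortest duration at which responses become context-invariant yields an estimate of the integration window, without assuming any particular (possibly nonlinear) stimulus-response model.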