Cross-species comparison of rhythm perception in ferret

Auditory rhythm perception is essential to human speech and music cognition. A recent and influential theory, the vocal learning hypothesis, proposes that vocal learning is an evolutionary and mechanistic prerequisite for flexible auditory rhythm perception. This hypothesis predicts that, compared to vocal learning species, vocal nonlearners are incapable of predictive, flexible perception of rhythm in auditory stimuli. However, prior work suggests that ferrets, which are vocal nonlearners, can distinguish stimuli based on tempo or rhythmicity.

How does the brain construct auditory space?

Unlike in vision or touch, the position of a sound source is not represented at the cochlea and must instead be computed from sound localisation cues. These cues include differences in the timing and intensity of sound at the two ears, and monaural spectral cues that result from the direction-dependent interaction of sound with the pinna. Auditory cortex is required for accurate sound localisation behaviour, yet sound localisation cues are extracted by dedicated centres in the brainstem. What role, then, does auditory cortex play in representing sounds in space?
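
One of the binaural cues mentioned above, the interaural time difference (ITD), can be illustrated with a classic spherical-head approximation (Woodworth's formula). This is a minimal sketch for intuition only; the function name and the default head radius and speed of sound are illustrative assumptions, not values from the abstract.

```python
import math

def itd_woodworth(azimuth_deg, head_radius_m=0.0875, speed_of_sound=343.0):
    """Approximate interaural time difference (in seconds) for a source at a
    given azimuth, using Woodworth's spherical-head formula:
        ITD ~= (r / c) * (theta + sin(theta))
    Head radius and speed of sound are illustrative defaults."""
    theta = math.radians(azimuth_deg)
    return (head_radius_m / speed_of_sound) * (theta + math.sin(theta))

# A source straight ahead (0 degrees) arrives simultaneously at both ears;
# a fully lateral source leads at the near ear by a few hundred microseconds.
```

The small magnitude of these delays (sub-millisecond) is part of why their extraction requires the specialised brainstem circuitry the abstract refers to.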

Temporal processing in the central auditory system

The sensation of time is necessary for learning and for behaviors such as communication, sensory-motor processing, and memorization. As no dedicated sensory system for time exists, its perception must be intimately connected to (the experience of) sensory features of an event, such as the duration of a sound. One way to study time perception is therefore to understand how sound duration is computed, or how a sound's beginning (onset) and end (offset) are coded and perceived.
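
The idea that duration can be derived from onset and offset markers can be sketched in a toy form: threshold a sound's amplitude envelope and read off where it first and last exceeds the threshold. The function name, threshold, and toy envelope below are hypothetical illustrations, not the talk's method.

```python
import numpy as np

def onset_offset(envelope, threshold=0.1):
    """Return (onset, offset) sample indices where a sound's amplitude
    envelope first and last exceeds a threshold, or None if it never does.
    A toy stand-in for how onset and offset responses could jointly mark
    a sound's duration."""
    above = np.flatnonzero(envelope > threshold)
    if above.size == 0:
        return None
    return int(above[0]), int(above[-1])

env = np.zeros(100)
env[20:60] = 1.0  # a 40-sample sound embedded in silence
```

Given the onset and offset indices, duration falls out as their difference, which is the sense in which duration coding and onset/offset coding are two faces of the same problem.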

Changing behavioral state and the impact of correlated variability on neural population coding in auditory cortex

Correlated variability within neural populations, sometimes called noise correlation, substantially impacts the accuracy with which information about sensory stimuli can be extracted from neural activity. Previous studies have shown that changes in behavioral state, reflecting phenomena such as attention and/or arousal, can alter correlated variability. However, the degree to which these changes affect neural encoding of sensory information remains poorly understood, particularly in the auditory system.
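
Why noise correlations limit decoding can be shown with a two-neuron simulation: when similarly tuned neurons share noise, pooling their responses cannot average that noise away. This is a minimal sketch under assumed tuning and noise parameters, not an analysis from the work described.

```python
import numpy as np

rng = np.random.default_rng(0)

def decode_accuracy(corr, n_trials=20000):
    """Fraction correct for a summed (pooled) readout of two similarly tuned
    neurons discriminating two stimuli, as a function of the noise correlation
    between them. All numbers (tuning, noise level) are illustrative."""
    mu_a, mu_b = np.array([1.0, 1.0]), np.array([-1.0, -1.0])  # mean responses
    cov = np.array([[1.0, corr], [corr, 1.0]])                 # shared noise
    r_a = rng.multivariate_normal(mu_a, cov, n_trials)
    r_b = rng.multivariate_normal(mu_b, cov, n_trials)
    # Classify each trial by the sign of the summed population response
    correct = (r_a.sum(axis=1) > 0).mean() + (r_b.sum(axis=1) < 0).mean()
    return correct / 2

# Positive correlations inflate the variance of the pooled signal,
# so accuracy drops as corr increases.
```

In this toy setting, a state-dependent drop in correlated variability directly improves the pooled readout, which is one way behavioral state could shape the fidelity of sensory encoding.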

The meaning of sounds: Acoustic to semantic transformations in human auditory cortex

A bird chirping, a glass breaking, an ambulance passing by. Listening to sounds helps us recognize events and objects even when they are out of sight, for example in the dark or behind a wall. In this talk, I will discuss how the human brain transforms acoustic waveforms into meaningful representations of their sources, attempting to link theories, models and data from cognitive psychology, neuroscience and artificial intelligence research.

Perception and neural coding of pitch through the lifespan: Peripheral and cortical considerations

Pitch is a primary perceptual attribute of our auditory world, playing a critical role in music, speech, and the organization of the auditory scene into perceptual objects. It has long been thought that stimulus timing information, conveyed by the auditory nerve, underlies and limits our exquisite sensitivity to differences in frequency, and our ability to detect very small fluctuations or modulations in frequency.
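
The idea that timing information can support frequency sensitivity has a simple computational analogue: estimating a tone's fundamental from the periodicity of its waveform via autocorrelation. The sketch below is an illustration of that principle only; the function name, search range, and test tone are assumptions, not the talk's model.

```python
import numpy as np

def pitch_autocorr(signal, fs, fmin=80.0, fmax=500.0):
    """Estimate pitch (Hz) from a signal's timing structure: find the lag,
    within a plausible pitch range, at which the autocorrelation peaks.
    A toy analogue of inferring frequency from phase-locked timing."""
    ac = np.correlate(signal, signal, mode='full')[len(signal) - 1:]
    lo, hi = int(fs / fmax), int(fs / fmin)
    lag = lo + np.argmax(ac[lo:hi])
    return fs / lag

fs = 8000
t = np.arange(0, 0.1, 1 / fs)
tone = np.sin(2 * np.pi * 200 * t)  # a 200 Hz pure tone
```

Because the estimate rests entirely on the temporal fine structure of the signal, any degradation of timing information (as with aging or hearing loss) would limit the precision of such a code.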

Encoding & decoding language representations in human cortex

The meaning, or semantic content, of natural speech is represented in highly specific patterns of brain activity across a large portion of the human cortex. Using recently developed machine learning methods and very large fMRI datasets collected from single subjects, we can construct models that predict brain responses with high accuracy. Interrogating these models enables us to map language selectivity with unprecedented precision, and potentially uncover organizing principles.
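
The general approach of predicting brain responses from stimulus features is often implemented as a voxelwise linear encoding model fit by ridge regression. The sketch below shows that skeleton on toy data; the function name, dimensions, and regularization value are illustrative assumptions, not the speaker's exact pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_encoding_model(features, responses, alpha=1.0):
    """Voxelwise linear encoding model: ridge regression from stimulus
    features (e.g. semantic embeddings of heard words) to fMRI responses.
    Returns one weight vector per voxel (one column of W per column of Y).
    A minimal sketch of the general approach on toy data."""
    X, Y = features, responses
    n_feat = X.shape[1]
    # Closed-form ridge solution: (X'X + alpha I)^-1 X'Y
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_feat), X.T @ Y)

# Toy data: 200 time points, 10 stimulus features, 3 "voxels"
X = rng.standard_normal((200, 10))
W_true = rng.standard_normal((10, 3))
Y = X @ W_true + 0.1 * rng.standard_normal((200, 3))
W_hat = fit_encoding_model(X, Y, alpha=0.1)
```

Interrogating the fitted weights (which features drive which voxels) is what allows such models to map selectivity across cortex, as described above.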