The rational use of cognitive resources

Psychologists and computer scientists have very different views of the mind. Psychologists tell us that humans are error-prone, using simple heuristics that result in systematic biases. Computer scientists view human intelligence as aspirational, trying to capture it in artificial intelligence systems. How can we reconcile these two perspectives? In this talk, I will argue that we can do so by reconsidering how we think about rational action.
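
One concrete way to frame this reconciliation is resource-rational analysis: an agent is rational not when it computes the perfect answer, but when it trades decision quality against the cost of computation. The sketch below is a toy illustration of that idea, assuming a binary choice made by majority vote over noisy mental samples with a fixed per-sample cost; all parameter values are hypothetical.

```python
import numpy as np

def expected_utility(k, p_correct_single=0.7, reward=1.0, cost_per_sample=0.05):
    """Expected utility of deciding after k mental samples (toy parameters).

    The agent takes a majority vote over k noisy samples; more samples
    raise accuracy but each incurs a fixed time cost.
    """
    # Probability that a majority of k samples points the right way,
    # estimated by Monte Carlo for simplicity.
    rng = np.random.default_rng(0)
    votes = rng.random((100_000, k)) < p_correct_single
    p_correct = np.mean(votes.sum(axis=1) * 2 > k)  # strict majority (k odd)
    return reward * p_correct - cost_per_sample * k

# A resource-rational agent picks the sample count with the best trade-off;
# when sampling is costly, the optimum is often surprisingly small.
best_k = max(range(1, 30, 2), key=expected_utility)
print(best_k)
```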

Curious, cooperative, and communicative: How we learn from others and help others learn

Humans are not the only species that learns from others, but only humans learn and communicate in rich, diverse social contexts, and build repertoires of abstract, structured knowledge. What makes human social learning so distinctive, powerful, and smart? In this talk, I argue that social learning is inferential at its core (inferential social learning); rather than copying what others do or trusting what others say, humans learn from others by drawing rich inferences from others’ behaviors, and help others learn by generating evidence tailored to others’ goals and knowledge states.
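
As a toy illustration of what drawing rich inferences from others’ behaviors can mean computationally, the sketch below has a learner invert a simple model of a softmax-rational demonstrator rather than copy the demonstrator’s choice; the hypotheses, rewards, and rationality parameter are all hypothetical.

```python
import numpy as np

# Toy model of inferential social learning: rather than copying the
# demonstrator's choice, the learner inverts a model of the demonstrator
# to infer which option is actually better (all numbers hypothetical).

def likelihood_of_choice(choice, rewards, rationality=3.0):
    """P(choice | rewards): a softmax-rational demonstrator prefers
    higher-reward options, more sharply as `rationality` grows."""
    logits = rationality * np.asarray(rewards, dtype=float)
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return probs[choice]

# Two hypotheses about the hidden rewards of options A and B.
hypotheses = {"A_better": [1.0, 0.0], "B_better": [0.0, 1.0]}
prior = {"A_better": 0.5, "B_better": 0.5}

# The learner observes the demonstrator pick option A (index 0)
# and updates beliefs with Bayes' rule.
observed_choice = 0
posterior = {h: prior[h] * likelihood_of_choice(observed_choice, r)
             for h, r in hypotheses.items()}
z = sum(posterior.values())
posterior = {h: p / z for h, p in posterior.items()}
print(posterior)  # belief shifts toward "A_better" without blind copying
```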

The organization of neural integration windows in the human auditory cortex

Temporal integration is fundamental to sensory processing. In the auditory domain, the brain must integrate across hierarchically organized structures (e.g., phonemes, syllables, and words) spanning tens to hundreds of milliseconds to derive meaning from sound. Yet surprisingly little is known about the specific timescales over which different regions of the human auditory cortex integrate information, in part due to their complex, nonlinear tuning for natural sounds.
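
To make the notion of an integration window concrete, here is a minimal sketch in which a model response at each timescale is simply the sound envelope smoothed by a window of that duration; the boxcar window shape and the specific durations are illustrative choices, not the method used in the talk.

```python
import numpy as np

# Illustrative model of temporal integration windows: a "response" at each
# timescale is the sound envelope smoothed by a window of that duration.

fs = 1000  # Hz (1 ms resolution)
t = np.arange(0, 1.0, 1 / fs)
# Sparse binary envelope standing in for acoustic events.
envelope = (np.random.default_rng(0).random(t.size) < 0.01).astype(float)

def integrate(envelope, window_ms, fs=1000):
    """Smooth the envelope with a normalized boxcar of the given duration."""
    n = max(1, int(window_ms * fs / 1000))
    window = np.ones(n) / n
    return np.convolve(envelope, window, mode="same")

# Short windows (phoneme-scale) track fast structure; long windows
# (word-scale) pool over hundreds of milliseconds.
responses = {ms: integrate(envelope, ms) for ms in (20, 100, 400)}
```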

Cross-species comparison of rhythm perception in ferrets

Auditory rhythm perception is essential to human speech and music cognition. A recent and influential theory, the vocal learning hypothesis, proposes that vocal learning is an evolutionary and mechanistic prerequisite for flexible auditory rhythmic pattern perception. This hypothesis predicts that, compared to vocal learning species, vocal nonlearners are incapable of predictive and flexible rhythm perception of auditory stimuli. However, prior work suggests that ferrets, which are vocal nonlearners, can distinguish stimuli based on tempo or rhythmicity.
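
As a rough illustration of the kind of stimulus contrast such a discrimination test involves, the sketch below generates isochronous click trains at two tempos plus a rate-matched, temporally jittered train; all parameters are invented for illustration and are not the stimuli used with the ferrets.

```python
import numpy as np

# Illustrative stimuli for a tempo/rhythmicity discrimination test:
# isochronous click trains at two tempos, plus a jittered (arrhythmic)
# train matched in average rate. Parameters are toy values.

rng = np.random.default_rng(0)

def click_times(rate_hz, duration_s=2.0, jitter_frac=0.0):
    """Click onsets at a given rate; jitter_frac > 0 breaks isochrony."""
    period = 1.0 / rate_hz
    times = np.arange(0.0, duration_s, period)
    times += rng.uniform(-jitter_frac, jitter_frac, times.size) * period
    return np.clip(np.sort(times), 0.0, duration_s)

slow_regular = click_times(2.0)                   # 2 Hz, isochronous
fast_regular = click_times(6.0)                   # 6 Hz, isochronous
irregular    = click_times(4.0, jitter_frac=0.4)  # rate-matched, arrhythmic
```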

How does the brain construct auditory space?

Unlike in vision or touch, the position of a sound source is not represented at the cochlea and must instead be computed from sound localisation cues. These cues include differences in the timing and intensity of sound between the two ears, and monaural spectral cues that result from the direction-dependent interaction of sound with the pinna. Auditory cortex is required for accurate sound localisation behaviour, yet sound localisation cues are extracted by dedicated centres in the brainstem. What role, then, does auditory cortex play in representing sounds in space?
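
For concreteness, the interaural time difference for a rigid spherical head is often approximated with Woodworth's classic formula, ITD(theta) = (r/c)(theta + sin theta); the sketch below uses standard textbook values for head radius and the speed of sound.

```python
import numpy as np

# Interaural time difference (ITD) for a rigid spherical head,
# via Woodworth's approximation.

HEAD_RADIUS_M = 0.0875   # ~8.75 cm, a standard textbook value
SPEED_OF_SOUND = 343.0   # m/s at 20 C

def itd_seconds(azimuth_deg):
    """ITD for a source at the given azimuth (0 = straight ahead,
    90 = directly opposite one ear), frontal hemifield."""
    theta = np.radians(azimuth_deg)
    return (HEAD_RADIUS_M / SPEED_OF_SOUND) * (theta + np.sin(theta))

for az in (0, 30, 60, 90):
    print(f"{az:>2} deg -> {itd_seconds(az) * 1e6:.0f} us")
# ITDs top out near ~650 us, one reason the brainstem needs
# sub-millisecond precision to extract this cue.
```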

Temporal processing in the central auditory system

The sensation of time is necessary for learning and for behaviors such as communication, sensory-motor processing, and memorization. As no dedicated sensory system for time exists, its perception must have an intimate connection to (the experience of) sensory features of an event, such as the duration of a sound. One way to study time perception is therefore to understand how sound duration is computed, or how a sound’s beginning (onset) and end (offset) are coded and perceived.
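
As a minimal illustration of deriving duration from onset and offset events, the sketch below thresholds a toy sound envelope and reads duration off the two threshold crossings; the envelope and threshold are hypothetical.

```python
import numpy as np

# Toy illustration of onset/offset coding: detect where a sound's
# envelope crosses a threshold and derive duration from the two events.

fs = 1000  # Hz
t = np.arange(0, 1.0, 1 / fs)
envelope = ((t > 0.2) & (t < 0.65)).astype(float)  # a ~450 ms sound

def onset_offset(envelope, threshold=0.5):
    """Indices of the first upward and first downward threshold crossing."""
    above = envelope > threshold
    crossings = np.flatnonzero(np.diff(above.astype(int)))
    onset, offset = crossings[0] + 1, crossings[1] + 1
    return onset, offset

on, off = onset_offset(envelope)
print(f"onset {on / fs:.3f} s, offset {off / fs:.3f} s, "
      f"duration {(off - on) / fs:.3f} s")  # duration ~ 0.45 s
```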

Changing behavioral state and the impact of correlated variability on neural population coding in auditory cortex

Correlated variability within neural populations, sometimes called noise correlation, substantially impacts the accuracy with which information about sensory stimuli can be extracted from neural activity. Previous studies have shown that changes in behavioral state, reflecting phenomena such as attention and/or arousal, can change correlated variability. However, the degree to which these changes impact neural encoding of sensory information remains poorly understood, particularly in the auditory system.
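
A minimal sketch of how correlated variability is typically measured: correlate two neurons' trial-to-trial fluctuations around their mean response to each stimulus. The data below are simulated with a shared noise source, so the expected noise correlation is about 0.5.

```python
import numpy as np

# Measuring "noise correlation": correlate two neurons' residuals after
# removing the stimulus-driven component of their responses. Simulated data.

rng = np.random.default_rng(0)
n_stimuli, n_trials = 5, 200
signal = rng.uniform(5, 20, size=(n_stimuli, 1))    # mean rate per stimulus
shared = rng.normal(size=(n_stimuli, n_trials))     # shared "state" noise
counts_a = signal + shared + rng.normal(size=(n_stimuli, n_trials))
counts_b = signal + shared + rng.normal(size=(n_stimuli, n_trials))

# Remove the stimulus-driven component, then correlate the residuals.
resid_a = counts_a - counts_a.mean(axis=1, keepdims=True)
resid_b = counts_b - counts_b.mean(axis=1, keepdims=True)
noise_corr = np.corrcoef(resid_a.ravel(), resid_b.ravel())[0, 1]
print(f"noise correlation ~ {noise_corr:.2f}")  # ~0.5 given the shared noise
```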

The meaning of sounds: Acoustic to semantic transformations in human auditory cortex

A bird chirping, a glass breaking, an ambulance passing by. Listening to sounds helps us recognize events and objects even when they are out of sight, for example in the dark or behind a wall. In this talk, I will discuss how the human brain transforms acoustic waveforms into meaningful representations of their sources, attempting to link theories, models, and data from cognitive psychology, neuroscience, and artificial intelligence research.
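
One standard way to link model representations to brain data in this literature is representational similarity analysis (RSA), which is not necessarily the method used in the talk; the sketch below compares hypothetical acoustic and semantic feature spaces to a toy "brain" region built from the semantic features.

```python
import numpy as np

# Representational similarity analysis (RSA): compare how similarly a
# feature space and a brain region "arrange" the same sounds. Features
# here are random stand-ins for acoustic vs. semantic embeddings.

rng = np.random.default_rng(0)
n_sounds = 40
acoustic = rng.normal(size=(n_sounds, 64))   # e.g., spectrogram features
semantic = rng.normal(size=(n_sounds, 16))   # e.g., label embeddings
brain = semantic @ rng.normal(size=(16, 100))  # toy "semantic-like" region

def rdm(features):
    """Representational dissimilarity matrix: 1 - correlation between items."""
    return 1.0 - np.corrcoef(features)

def rsa(feat_a, feat_b):
    """Correlation between the upper triangles of two RDMs."""
    iu = np.triu_indices(n_sounds, k=1)
    return np.corrcoef(rdm(feat_a)[iu], rdm(feat_b)[iu])[0, 1]

print(f"acoustic vs brain: {rsa(acoustic, brain):.2f}")  # near 0
print(f"semantic vs brain: {rsa(semantic, brain):.2f}")  # high
```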