Talks by Barbara Shinn-Cunningham and Benjamin Blankertz

Practical information
07 November 2017
4:00pm - 6:00pm

Centre Culturel Irlandais, 5 rue des Irlandais, 75005 Paris


Benjamin Blankertz (Technische Universität Berlin):

Reclaiming the Free Will: A Real-Time Duel between a Human and  a Brain-Computer Interface

As a novel example of Brain-Computer Interface (BCI) technology, I will show how it can be employed to answer questions in a different field of research, here in cognitive neuroscience. There is an ongoing debate about the finding that spontaneous movements are preceded by predictive EEG signals, in particular because some studies suggest that those signals start prior to the conscious decision to move. We used BCI technology to investigate this phenomenon, using real-time prediction of movement decisions to intervene in the experimental flow. Our findings suggest that voluntary control over choice-predictive brain signals is limited, but that movements can be canceled up to a point of no return, which was found to lie on average around 200 ms before EMG onset; even after that point, movement completion can still be avoided. This result has important implications for potential applications of BCI technology and contributes to ongoing discussions in cognitive neuroscience.

Barbara Shinn-Cunningham (Boston University):

Controlling attention: audition vs. vision

In many social settings, there are multiple, competing sounds vying for attention. The ability to separate sound streams coming from different sources and to focus on whichever source you want to understand is critical for communication in such environments. This talk reviews behavioral and neuroimaging studies that explore how listeners control auditory attention. Results show that when listeners decide to focus attention on a sound stream from a particular direction or from a particular talker, there is preparatory activity in various brain networks. Once the sound stimuli begin to play, the cortical representation of the competing sound streams is modulated, such that responses to an attended stream of sound are strong relative to streams that are being ignored. Importantly, attention focused on a sound from a particular direction vs. on sound with particular non-spatial features engages very different brain networks, even though the behavioral tasks seem similar. The network engaged by spatial auditory attention includes prefrontal and parietal areas, while non-spatial attention engages regions associated with high-level auditory processing. By contrasting fMRI activity during comparable auditory and visual selective attention tasks, we find that the cortical networks engaged by spatial auditory attention map directly onto regions that are commonly assumed to comprise a visuo-spatial attention network. Together, these results support the view that auditory inputs are naturally processed differently from visual inputs, but that the brain has the capacity to recruit different brain networks to process the same inputs, based on task demands.