Neural dynamics and computations constraining speech and music processing
Benjamin Morillon
D-CAP team leader – Institut de Neurosciences des Systèmes (INS), Marseille, France

Abstract
I will present recent work investigating the neural dynamics that underlie speech and music perception, with a focus on temporal scales and adaptive processes. My approach integrates detailed acoustic analyses, human intracranial EEG and MEG recordings, and computational modeling. I will discuss how auditory cortical activity reflects the temporal structure of speech and music, and how these neural dynamics support predictive processing. Particular attention will be given to the role of the motor system in generating temporal predictions during both speech comprehension and music listening. Finally, I will show how recurrent neural network models can help identify what specific information the brain estimates in order to understand and anticipate such complex auditory sequences.

Invited by David Robbe
Monday, 15 December 2025 at 11 a.m. – INMED conference room
