The ability to synchronize movements with auditory rhythms is fundamental to human music-making and social coordination, yet its developmental trajectory suggests that synchronization is not innate but learned. What drives this learning process? In this study, we investigate whether reinforcement learning can serve as a model for how biological systems acquire sensorimotor synchronization. We train a recurrent neural network to synchronize with a metronomic beat using different reinforcement schemes, evaluating which reward structures best promote robust, human-like synchronization.
Our findings reveal that rewarding early taps more than late taps, along with incentivizing tempo matching, leads to synchronization behaviour that mirrors key aspects of human tapping, including asymmetric error correction and the emergence of negative mean asynchrony. Moreover, the neural dynamics of the trained network exhibit patterns observed in primates engaged in rhythmic tasks, as well as features suggesting human-like perception of beats in groups of two. These results suggest that intrinsic reinforcement for early action and interval imitation may scaffold the development of sensorimotor synchronization in humans. Our work provides a computational framework for exploring rhythm learning and raises intriguing questions about the role of reinforcement in shaping the neural mechanisms underlying entrainment.
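To make the reward scheme described above concrete, the sketch below gives a minimal, hypothetical illustration of an asymmetric tap reward combined with a tempo-matching term. It is not the study's actual implementation; the function name, parameter values, and the Gaussian/logistic forms are assumptions chosen only to show the idea of weighting early taps more than late taps and rewarding interval reproduction.

```python
import numpy as np

def tap_reward(asynchrony_ms, produced_ioi_ms, target_ioi_ms,
               sigma_ms=50.0, early_bonus=1.5, tempo_weight=0.5):
    """Hypothetical per-tap reward (illustrative names and constants).

    asynchrony_ms   : tap time minus beat time (negative = early tap)
    produced_ioi_ms : interval between the agent's last two taps
    target_ioi_ms   : metronome inter-onset interval
    """
    # Gaussian accuracy term: taps closer to the beat earn more reward.
    accuracy = np.exp(-(asynchrony_ms ** 2) / (2 * sigma_ms ** 2))

    # Asymmetry: early taps are weighted more heavily than late taps.
    if asynchrony_ms < 0:
        accuracy *= early_bonus

    # Tempo-matching term: reward reproducing the metronome's interval.
    tempo_error = abs(produced_ioi_ms - target_ioi_ms) / target_ioi_ms
    tempo = max(0.0, 1.0 - tempo_error)

    return accuracy + tempo_weight * tempo
```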
Understanding the neural mechanisms underlying visual temporal resolution is crucial for elucidating how the brain processes time. Although a basic perceptual feature, temporal acuity is a building block for higher-level cognitive functions, influencing how we perceive and interpret the world. Our research reveals that temporal acuity varies significantly across individuals and correlates with age and the propensity for anomalous perceptual experiences, suggesting broader implications for cognitive and perceptual traits.
To investigate the neural basis of these differences, we conducted an EEG study where participants performed the two-flash fusion task (2FF). This task measures temporal acuity by determining the minimum interval at which two flashes are perceived as distinct. The results indicate that 2FF thresholds are associated with the frequency of ongoing alpha oscillations, while the slope of the 2FF psychometric function—reflecting perceptual uncertainty—correlates with the aperiodic component of the EEG signal. We propose that alpha oscillations regulate sensory processing by creating periodic windows of excitability, effectively determining the brain’s perceptual sampling rate. In contrast, the aperiodic component of EEG reflects broadband neural excitation, which may disrupt this rhythmic inhibition, leading to greater perceptual uncertainty. Together, our results highlight how temporal resolution in vision is shaped by the balance and timing of neural excitation and inhibition. Given the foundational role of temporal resolution in perception, these findings have significant implications for understanding how low-level sensory processes can cascade into higher-level cognitive functions.
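As an illustration of how the two quantities analyzed here, the 2FF threshold and the slope of the psychometric function, can be estimated, the sketch below fits a logistic psychometric function to hypothetical two-flash data. The data values, initial guesses, and the logistic form are assumptions for demonstration only, not the study's analysis pipeline.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic_psychometric(isi_ms, threshold_ms, slope):
    """Probability of reporting 'two flashes' as a function of the
    inter-stimulus interval (ISI). Illustrative functional form."""
    return 1.0 / (1.0 + np.exp(-slope * (isi_ms - threshold_ms)))

# Hypothetical data: ISIs tested (ms) and proportion of 'two flash' reports.
isi = np.array([10, 20, 30, 40, 50, 60, 70, 80], dtype=float)
p_two = np.array([0.05, 0.10, 0.30, 0.55, 0.75, 0.90, 0.95, 0.98])

# Fit the threshold (ISI at 50% 'two' responses) and the slope
# (steepness, inversely related to perceptual uncertainty).
(threshold, slope), _ = curve_fit(logistic_psychometric, isi, p_two,
                                  p0=[45.0, 0.1])
print(f"2FF threshold ≈ {threshold:.1f} ms, slope ≈ {slope:.3f}")
```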