Implementing Virtual Partners for Sensorimotor Synchronization Research

A workshop with Bavo Van Kerrebroeck, a post-doctoral researcher in the Input Devices and Music Interaction Lab and Sequence Production Lab of McGill University. The session will demonstrate a pipeline for sensorimotor synchronization research with embodied virtual agents, from motion capture to display in virtual and augmented reality head-mounted displays, and will close with an open discussion on data logging and multi-user setups.

Registration

Registration for this event is free and can be done via this link.

Abstract

Successful music making requires the communication of expressive intentions and a healthy balance between anticipation and adaptation in fine-grained spatiotemporal and embodied dynamics. As such, it represents fertile ground for empirically testing and validating behavioural models of coordination that explain synchronization and entrainment. While music research with adaptive auditory metronomes has offered fundamental insights into the (neuro)physiological processes underlying sensorimotor synchronization, introducing more embodied interactions that allow for expressive cues or posture mirroring could shed further light on the spontaneous processes and planned strategies in successful music making. Extended reality technology is naturally equipped for this task, as its digital nature allows for flexible control of stimuli, replication of empirical tests, and integration of real-time, human-animated stimuli. Accordingly, research integrating these so-called embodied virtual agents in virtual partner paradigms has already yielded key insights into the synchronization mechanisms of coordinating humans.

This session will demonstrate a technological pipeline for coordination and sensorimotor synchronization research using embodied virtual agents in a virtual partner paradigm. It will introduce the stages of capturing, processing, and displaying multimodal stimuli as performed in an earlier study (Van Kerrebroeck et al., 2021). Specifically, motion capture data will be briefly introduced and imported into the Unity game engine, where they will animate a virtual human whose synchronization is controlled by the Kuramoto model. Auditory stimuli will be integrated using Ableton Live and Pure Data, with a note on latency and data synchronization. Finally, stimuli will be displayed in virtual and augmented reality head-mounted displays, followed by an open discussion on data logging and multi-user setups.
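
To make the adaptation mechanism concrete, below is a minimal sketch of the two-oscillator Kuramoto update that lets a virtual partner pull its phase toward a human co-performer. It assumes Python with the python-osc package; the OSC address /partner/phase, port 9000, the 90 Hz update rate, and the simulated human tempo are illustrative assumptions, not the study's actual Unity implementation.

```python
# Illustrative sketch: a virtual partner's phase adapting to a human's
# phase via the two-oscillator Kuramoto equation,
#     dθp/dt = ωp + K · sin(θh − θp),
# integrated with a simple Euler step and streamed out over OSC
# (e.g. to Unity or Pure Data). All names and values are assumptions.

import math
import time

from pythonosc.udp_client import SimpleUDPClient  # pip install python-osc

OMEGA_P = 2 * math.pi * 2.0  # partner's natural frequency: 2 Hz (120 BPM)
K = 4.0                      # coupling strength toward the human's phase
DT = 1 / 90                  # one update per frame at a 90 Hz HMD refresh

def kuramoto_step(theta_p: float, theta_h: float, dt: float = DT) -> float:
    """One Euler step of the partner's phase, wrapped to [0, 2*pi)."""
    dtheta = OMEGA_P + K * math.sin(theta_h - theta_p)
    return (theta_p + dtheta * dt) % (2 * math.pi)

if __name__ == "__main__":
    client = SimpleUDPClient("127.0.0.1", 9000)  # hypothetical Unity/Pd listener
    theta_partner = 0.0
    omega_human = 2 * math.pi * 1.9  # simulated human at a slightly slower tempo
    t = 0.0
    while t < 10.0:  # ten seconds of simulated coordination
        theta_human = (omega_human * t) % (2 * math.pi)
        theta_partner = kuramoto_step(theta_partner, theta_human)
        # In the real pipeline this phase would drive the virtual human's
        # animation and the auditory stimulus; here it is simply sent on.
        client.send_message("/partner/phase", theta_partner)
        time.sleep(DT)
        t += DT
```

The coupling constant K sets the partner's position on the anticipation-adaptation continuum the abstract describes: a large K yields a compliant follower that absorbs the human's timing variability, while a small K yields a partner that insists on its own tempo.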

Van Kerrebroeck, B., Caruso, G., & Maes, P.-J. (2021). A methodological framework for assessing social presence in music interactions in virtual reality. Frontiers in Psychology, 12, 663725.

Bio

Bavo Van Kerrebroeck

Bavo Van Kerrebroeck is a post-doctoral researcher in the Input Devices and Music Interaction Lab (IDMIL) and Sequence Production Lab of McGill University. His research spans embodied music cognition, extended reality, and human-computer interaction. He obtained a master’s in engineering and computer music at KU Leuven and Sorbonne University (IRCAM) and completed his PhD at Ghent University (IPEM). He currently works on the development of musical agents to investigate the emergent qualities of collective music making.