Workshop on motion capture of string quartet performances

This workshop is organized by CIRMMT Research Axis 2 (Musical gestures, devices and motion capture).  It will take place on October 12, 2011, from 2:00pm-5:00pm in Clara Lichtenstein Recital Hall (C-209).

Registration

Registration is mandatory and is on a first-come, first-served basis: Workshop on motion capture of string quartet performances - Registration form


Description

This workshop will consist of three talks and a discussion of methods for motion capture of multiple musicians:

  • Tools for mocap at CIRMMT - an overview
  • Towards a computational analysis of inter-dependence in string quartet performance
  • RepoVizz: a multimodal on-line database and browsing tool for music performance research

---------------------

"Towards a computational analysis of inter-dependence in string quartet performance" 

Despite recent advances in the acquisition, analysis, and simulation of data on diverse facets of instrumental playing, music performance remains one of the most challenging topics in computer research on music. Within data-driven music performance research, the study of the inter-dependence phenomena that take place during instrumental group performance presents a unique opportunity to explore the communication processes underlying a goal-oriented, collaborative task such as ensemble music playing. In this direction, a joint initiative coordinated by MTG-UPF, CIRMMT-McGill, BRAMS, and CCRMA-Stanford will lead to a series of multimodal recordings (multichannel audio, video, and motion capture) of string quartet performances in an experimental scenario. Data acquisition and processing will examine how musicians plan and control elemental performance resources (timing, dynamics, intonation, etc.) in different contexts, by means of a number of basic experiments based on standard quartet exercises and musical pieces. A public data repository will be constructed and made available to the scientific community.

"RepoVizz: a multimodal on-line database and browsing tool for music performance research" 

RepoVizz is a data repository and visualization tool for the structured storage and user-friendly browsing of multimodal music performance recordings. Its primary purpose is to give the scientific community on-line access to a multimodal music performance database shared among researchers. RepoVizz is designed to hold synchronized streams of heterogeneous data (audio, video, motion capture, physiological measures, extracted descriptors, etc.), annotations, and musical scores. Data are structured by customizable XML skeleton files that enable meaningful retrieval. Skeleton files are first created during data gathering and provide a means to (re-)organize acquired data into any desired hierarchical structure, as they hold only pointers to stored data files. Once a data-set is created and uploaded to the server, each skeleton file defines a view. Multitrack data visualization is done remotely via an HTML5-based environment that supports web-driven editing (adding annotations, extracting descriptors) and downloading of data-sets. A preliminary instance of RepoVizz has been created within the EU FET-Open Project SIEMPRE, devoted to studying social interaction in ensemble performance.
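To make the skeleton-file idea concrete, here is a minimal sketch in Python of an XML hierarchy whose leaves hold only pointers to externally stored data files, from which a particular stream type can then be retrieved. The element names (`session`, `performer`, `stream`) and file names are hypothetical illustrations, not the actual RepoVizz schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical skeleton: a hierarchy whose leaf nodes only point to
# externally stored data files, rather than containing the data itself.
root = ET.Element("session", name="quartet_exercise_01")
for instrument in ("violin1", "violin2", "viola", "cello"):
    performer = ET.SubElement(root, "performer", id=instrument)
    ET.SubElement(performer, "stream", type="audio", file=f"{instrument}.wav")
    ET.SubElement(performer, "stream", type="mocap", file=f"{instrument}.c3d")

# Meaningful retrieval: collect every motion-capture file the skeleton references.
mocap_files = [s.get("file") for s in root.iter("stream")
               if s.get("type") == "mocap"]
print(mocap_files)  # ['violin1.c3d', 'violin2.c3d', 'viola.c3d', 'cello.c3d']
```

Because the skeleton stores only pointers, the same recorded data could be re-organized into a different hierarchy (e.g. grouped by stream type rather than by performer) simply by writing a new skeleton file, without touching the stored files.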