Exploring machine learning, artificial creativity and human musicality

Following Marc Chemillier’s Distinguished Lecture, this workshop will be organized in two parts.

The first part will provide an opportunity for CIRMMT members to learn more about the Djazz software and to test it: performers and improvisers are welcome!

CIRMMT members who would like to present their own research in the field are encouraged to bring their own system: please contact us in advance to reserve a time slot during the workshop (limited number of slots).

The second part of the workshop will be dedicated to a discussion on the integration of AI tools into musical creativity and their relationship to various musical and cultural traditions. This session will begin with a presentation by ethnomusicologist Rujing Huang and machine learning scientist Anna Huang, who both serve as co-organizers of the AI Song Contest.

Workshop schedule

  • 2:30-4:15pm Demonstrations and improvisations with Djazz and other computer programs dedicated to improvisation with live performers
  • 4:15-4:30pm Coffee-break
  • 4:30-5:30pm Presentation by Rujing Huang and Anna Huang and discussion (moderated by Fabrice Marandola)

Participants

An Inter/Transdisciplinary Dialogue: From AI Song Contest, Collaborative Songwriting, to Critical Technology Studies

Rujing Huang and Anna Huang

The AI Song Contest (AISC), launched in 2020, is an international competition exploring the use of artificial intelligence (AI) in the process of songwriting. By analyzing the song entries and accompanying "process documents" from the contest, we examine how AISC — in enabling human-AI partnership and direct musician-scientist collaborations — has effected new modes of songwriting both as inquiry and as action. We uncover layers of tension that arise in the cultural, technical, creative, and ethical spheres when AI becomes involved in songwriting.

A collaboration between an ethnomusicologist and a machine learning (ML) scientist, both of whom serve as AISC co-organizers, this paper attempts an inter/transdisciplinary dialogue between engineering and the humanities that we argue is increasingly vital as technologies such as AI intensify their impact on the music ecosystem(s). We address emerging issues in the past editions of AISC, such as:

  1. How affordances provided by AI tools vary when used to support narratives in different musical and cultural traditions, thus impacting the labor needed to attain expressiveness and virtuosity;
  2. How different ML models evoke different expectations and mental models, forms of human-AI alignment, songwriting strategies, and the nature of human-human teamwork, and;
  3. How AISC entries contribute to such discourses as timbre and vocality, tuning and temperament, theories of listening, virtuosity, and the artificiality/authenticity dichotomy.

This paper speaks to fields including ethnomusicology, machine learning, music information retrieval, popular music studies, the philosophy of technology, and songwriting (as creative practice, as subject of critical inquiry, and as pedagogy).

eTu{d,b}e with spatialised improvising agents

Kasey Pocius, Maxwell Gentilli-Morin, Tommy Davis

The eTube is an infra-instrument that combines a saxophone mouthpiece, a PVC tube, and a custom controller interface that facilitates interaction between a performer and musical agents. eTu{d,b}e is our improvisation framework, which adopts and adapts existing musical agents created by other developers and through which we explore human-computer interaction, spatialisation, and machine agency in improvised performance.

Martin Daigle

The Corpora project features Bob Pritchard’s KICKASS software for connecting Kinect cameras with Max/MSP. This software was used by the Inland Ocean Cooperative to allow dancers to control the real-time audio processing of improvising musicians.

Examples of the setup can be found in the following videos: Video 1 & Video 2.