Christophe Lengelé

Postdoctoral researcher

Université de Montréal

Electroacoustic composition, Faculté de musique
https://github.com/Xon77/Live4Life
Electronic music | Spatial improvisation | Open source | SuperCollider
Christophe Lengelé holds a doctorate in composition and sound creation from Université de Montréal. Since 2011, he has been developing a sound creation and spatialisation tool that aims to facilitate the improvisation of electronic music on multiple speakers. His spatial research, which questions ways of associating rhythmic and spatial parameters, is based on the concept of free and open works, both in their form and in the release of their source code (https://github.com/Xon77/Live4Life).
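As an illustration of what associating rhythmic and spatial parameters can mean in practice, here is a minimal SuperCollider sketch (not code from Live4Life, and assuming a hypothetical 8-speaker ring) in which patterns of the same length drive both the durations of the events and their positions on the ring:

    (
    // Percussive voice spatialised over a ring of 8 speakers (an assumed setup).
    SynthDef(\tick, { |out = 0, pos = 0, freq = 880, amp = 0.2|
        var sig = SinOsc.ar(freq) * EnvGen.kr(Env.perc(0.001, 0.15), doneAction: 2);
        // PanAz places the signal on the ring; pos cycles over the 8 channels with period 2.0
        Out.ar(out, PanAz.ar(8, sig, pos, amp));
    }).add;
    )

    (
    Pbind(
        \instrument, \tick,
        \dur, Pseq([0.25, 0.25, 0.5, 0.125, 0.125], inf), // rhythmic cell
        \pos, Pseq([0, 0.4, 0.8, 1.2, 1.6], inf),         // one step around the ring per event
        \freq, Pwhite(400, 1200, inf)
    ).play;
    )

Because both Pseq patterns have the same length, each beat of the cell is always heard from the same direction; shifting one pattern against the other is one simple way of decoupling and recombining rhythm and space.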

He is currently doing a postdoctoral fellowship at UQAM in the School of Visual and Media Arts, from September 2022 to August 2024, under the supervision of Philippe-Aubert Gauthier. This fellowship is made possible by FRQSC funding for postdoctoral research.

During this fellowship, he will carry out a research-creation project in spatial improvisation (sonic and multi-sensory) built around three axes:

1. The creation of performative installations around the theme of loneliness:
The public will be able to experiment in real time with controlling the parameters of multiple sound objects in space via several physical and tactile interfaces. The sound performance tool will thus be evaluated by the public, both from a perceptual point of view and in terms of control and ergonomics.

2. The establishment of training workshops in immersive sound and spatialisation:
Participants will be able to use and experiment with free tools to create and improvise with space.

3. The creation of spatial improvisations in a multi-sensory context, combining and alternating music, video/light, and dance:
Audiovisual collaborations, which he is currently seeking, will be set up both with video artists, to generate visuals from the sound data produced, and with dancers, to ultimately assess the interplay between the performer's gestures, the dancing bodies, and the video environment.
Technically, the audiovisual mapping could be developed with open source tools such as Processing, openFrameworks, or Hydra, or even with commercial tools like TouchDesigner and Resolume, as long as the creation process and the code are published on GitHub and open to everyone.
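As a rough sketch of one way such a sound-to-visual mapping could be wired up (an assumption about the pipeline, not the project's actual code), SuperCollider can forward the parameters of every sound event as OSC messages; a Processing, openFrameworks, or TouchDesigner patch listening on the same port (12000 here, an arbitrary choice) can then drive the visuals from exactly the data that produces the sound:

    (
    // Hypothetical address of the visual program; the port number is an arbitrary choice.
    var visuals = NetAddr("127.0.0.1", 12000);
    Pbind(
        \instrument, \tick, // the SynthDef from the sketch above
        \dur, Pseq([0.25, 0.25, 0.5], inf),
        \pos, Pwhite(0.0, 2.0, inf),
        \freq, Pexprand(200, 2000, inf)
    ).collect({ |event|
        // Forward each event's parameters so sound and image share one data stream.
        visuals.sendMsg("/event", event[\freq], event[\pos], event[\dur]);
        event
    }).play;
    )

Keeping the mapping at the level of plain OSC messages is what keeps the choice of visual tool open: the same stream can feed a free tool or a commercial one without changing the sound code.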