Transnodal, for cello and live electronics
Transnodal is a 16-minute composition for cello and live electronics and the output of the research project "Enhancing Interactivity Through Machine Learning: A New Composition for Cello and Live Electronics". The piece premiered in Vigo, Spain, on the 14th of November 2024.
This project aims to harness the capabilities of machine learning to conceive and implement cohesive and symbiotic modes of interaction between a human performer, a musical instrument, and a computer program. A common paradigm in mixed music (i.e., works for instruments and electronics) involves a computer music designer performing or controlling the electronic part. This might include triggering events (e.g., sound files or electronic treatments), continuously changing certain parameters (e.g., using faders and knobs to modify the behaviour of the electronic part), or a combination of both. In this scenario, synchronisation challenges between the performer and the computer music designer often arise, affecting musical expression and interaction.
In Transnodal, the electronic part is automatic and operates without the need for a computer music designer (except for minor level adjustments, which depend in part on the performance space). This removes issues associated with synchronisation, allowing the performer to have full control over the electronics, tempo, and musical expression. This paradigm draws on the idea of score-following methods, with the distinction that here the model is based purely on sound features rather than symbolic representations.
The live electronics are programmed in a Max/MSP patch that includes two levels of instrumental technique recognition using machine learning: an instantaneous gesture recognition engine trained on ten different cello sounds, and a classification algorithm to recognise percussive sounds. The patch adapts in real time to the results of instrumental recognition, modifying its internal configuration by changing the routing and parameters of the electronic processes. This creates a cohesive interaction between the cellist and the electronics.
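The recognition-driven routing described above can be sketched in miniature. The following is not the authors' Max/MSP patch but a hedged Python illustration of the general idea: a nearest-centroid classifier maps a feature vector extracted from the cello sound to a technique label, which in turn selects a routing preset for the electronic processes. All technique names, feature values, and preset contents here are hypothetical.

```python
# Illustrative sketch only: a minimal nearest-centroid classifier
# that maps an incoming feature vector (e.g. spectral features of a
# cello sound) to a playing-technique label, which then selects a
# routing preset for the electronics. The class names, feature
# values, and presets are all hypothetical placeholders.

import math

# Hypothetical training centroids: a mean feature vector per technique.
CENTROIDS = {
    "arco":      [0.30, 0.10, 0.60],
    "pizzicato": [0.55, 0.40, 0.35],
    "col_legno": [0.70, 0.65, 0.20],
}

# Hypothetical routing presets: which electronic processes are active.
ROUTING = {
    "arco":      {"harmonizer": True,  "granular": False},
    "pizzicato": {"harmonizer": False, "granular": True},
    "col_legno": {"harmonizer": False, "granular": False},
}

def classify(features):
    """Return the technique whose centroid is nearest to `features`."""
    def dist(centroid):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(features, centroid)))
    return min(CENTROIDS, key=lambda label: dist(CENTROIDS[label]))

def route(features):
    """Classify one analysis frame and return its routing preset."""
    return ROUTING[classify(features)]

if __name__ == "__main__":
    frame = [0.52, 0.42, 0.33]   # one hypothetical analysis frame
    print(classify(frame))        # nearest centroid: "pizzicato"
    print(route(frame))
```

In a real-time setting, the analogue of `classify` runs continuously on incoming audio features, and the routing change corresponds to reconfiguring signal paths inside the patch rather than returning a dictionary.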
11 December 2024
Galaz, Pablo and Piel, Thomas
(2024)
Transnodal, for cello and live electronics.
More information
Published date: 11 December 2024
Venue - Dates:
Simbiose, SVT Espacio de Arte, Vigo, Spain, 2024-11-14
Identifiers
Local EPrints ID: 501762
URI: http://eprints.soton.ac.uk/id/eprint/501762
PURE UUID: 24e9607a-4bfd-44fa-9ed5-dd6038e7dd6c
Catalogue record
Date deposited: 09 Jun 2025 18:06
Last modified: 09 Jun 2025 18:06
Contributors
Author:
Pablo Galaz
Performer:
Thomas Piel