Human-like inclinations: composing a multimedia live installation with AI
Galaz, Pablo, Melen, Christopher, Lee, Tiarna, Hui Wai Nok, Angela, Kulina, Natálie, Danesi, Marco and Gorini, Paolo (2024) Human-like inclinations: composing a multimedia live installation with AI.
Abstract
Human-like is a 30-minute interdisciplinary work for performer, violin, bass clarinet, Seaboard, live electronics, and live video. Situated at the intersection of spatialised concert, live installation, and experimental musical theatre, the piece explores metaphors associated with machine learning’s training process through musical composition and performance.
The work investigates a central question: what would it mean for a machine to understand what it is like to be the kind of creature we are? It constructs a figurative sonic and visual representation of a training process, in which the imaginary “dataset” consists of features associated with human nature. The compositional sound world integrates acoustic instruments, heartbeats (derived from real medical datasets), and electronic glitches evoking the electrical apparatus of a computer (many of them captured from real computers with electromagnetic microphones, by analogy with medical auscultation).
The piece features four musicians spatially distributed in a cross formation around the audience. The solo performer stands before a smartphone with face-tracking capabilities, triggering and manipulating samples through facial gestures such as blinking and smiling. The piece unfolds through repetition and variation, reflecting the epochs of a metaphorical training process. As the work progresses, new sounds and gestures emerge: the performer evolves from using only her eyes to incorporating mouth and arm movements, culminating in a moment where the machine appears to have achieved a human-like presence.
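The record does not document how the face-tracking interaction was implemented. Purely as an illustration, the sketch below shows one way blink-triggered sample playback of this kind might be wired up, assuming MediaPipe Face Mesh for landmark tracking and OSC messages sent to an external sample player; the libraries, landmark indices, threshold, and OSC address are all assumptions for the example, not details taken from the work.

```python
# Hypothetical sketch: blink-triggered sample playback via smartphone-style face tracking.
# Assumes MediaPipe Face Mesh for landmarks and python-osc to message a sample player
# (e.g. a Max/MSP or SuperCollider patch). None of this is confirmed by the record.
import cv2
import mediapipe as mp
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)   # address/port of the sample player (assumed)
face_mesh = mp.solutions.face_mesh.FaceMesh(refine_landmarks=True)

# MediaPipe Face Mesh indices for the left eye: upper lid, lower lid, outer and inner corners.
UPPER, LOWER, OUTER, INNER = 159, 145, 33, 133
BLINK_THRESHOLD = 0.2   # openness ratio below which the eye is treated as closed (assumed)

cap = cv2.VideoCapture(0)
eye_was_open = True
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        lm = results.multi_face_landmarks[0].landmark
        # Vertical lid distance over horizontal eye width: small when the eye closes.
        openness = abs(lm[UPPER].y - lm[LOWER].y) / (abs(lm[OUTER].x - lm[INNER].x) + 1e-6)
        if eye_was_open and openness < BLINK_THRESHOLD:
            client.send_message("/blink/trigger", 1)   # fire a sample on the blink onset
            eye_was_open = False
        elif openness >= BLINK_THRESHOLD:
            eye_was_open = True
cap.release()
```

The simple open/closed state flag acts as hysteresis, so a held blink fires a single trigger rather than retriggering on every frame.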
The reliance on repetition reflects both conceptual ideas and practical constraints: the musicians perform in darkness, spaced far apart, and the performer keeps her eyes closed for long stretches, which demands extensive memorisation and complicates synchronisation; repetition helps to address both. Staging decisions and musical structure are thus intricately connected.
Human-like contributes to the discourse on human-machine interaction, interrogating the extent to which AI can approximate human behaviour. It engages critically with current debates on AI’s capabilities, inviting reflection on whether machines can truly capture human essence and, more crucially, on the role such technologies should play in our lives.
More information
Published date: 2 June 2024
Venue - Dates:
AI Arts Festival 2024, The ARC and Theatre Royal Winchester, Winchester, United Kingdom, 2024-06-02 - 2024-06-02
Keywords:
music composition, AI, human-machine interaction, Live Electronics, installation, performance
Identifiers
Local EPrints ID: 491001
URI: http://eprints.soton.ac.uk/id/eprint/491001
PURE UUID: cd65b936-978e-44b6-8556-d8dd545194d0
Catalogue record
Date deposited: 11 Jun 2024 16:36
Last modified: 24 Jun 2025 16:51
Contributors
Composer: Pablo Galaz
Programmer: Christopher Melen
Other: Tiarna Lee
Performer: Angela Hui Wai Nok
Performer: Natálie Kulina
Performer: Marco Danesi
Performer: Paolo Gorini