Score
184 pages
A3-portrait
Program notes
The composition IDEA (2018) was commissioned by the Time of Music festival and the Institut de Recherche et Coordination Acoustique/Musique (IRCAM). The commission was partly funded by the ULYSSES project of the European Commission (Creative Europe Programme).
IDEA is in three movements that are played without pause: I Neural Networks, II Synaptic Signals, III Reborn Resolutions. The duration of the work is approximately 27 minutes, and it is composed for an ensemble of 11 musicians and 8-channel electronics. The 8-channel surround sound encircling the audience consists of both sound files and live electronics that process the instruments in real time.
During the performance, the electronics are implemented using the Max/MSP, Antescofo, Ircam-Spat and Sublime Text software and the Mubu For Max toolbox. Max (cycling74.com/products/max) is a visual programming language for music and multimedia. Antescofo (forumnet.ircam.fr/product/antescofo-en/) helps synchronize electronics with musicians in live performance. Sublime Text (www.sublimetext.com) is a source code editor. Ircam-Spat (forumnet.ircam.fr/product/spat-en/) is a software suite for real-time spatialization of sound signals, intended for musical creation, postproduction and live performance. Mubu For Max (forumnet.ircam.fr/product/mubu-en/) is a toolbox for multimodal analysis of sound and motion, interactive sound synthesis and machine learning.
As an integral part of this commission, there were six work periods at IRCAM's studios in Paris, for a total of nine weeks. The studio work was done in collaboration with Serge Lemouton, a computer music designer (Réalisateur en Informatique Musicale, RIM). The computer music design was produced in the IRCAM-Centre Pompidou studios. The work was composed both in IRCAM's studios in Paris and in a personal workspace in Helsinki. For this project I purchased a sound card, four studio speakers and a subwoofer in order to have a surround-sound system in my personal workspace.
IDEA (2018) is my third composition containing electronics. The oldest of the three is Maailmamaa (in English: Worldland, 2010, 40 min) for mixed choir and 2-channel tape, commissioned by the Helsinki Chamber Choir. The second piece with electronics is Logo (2013, 6 min) for violin and 9-channel electronics. From the point of view of electronics, these compositions share certain approaches. IRCAM, for example, has been involved in each of the three works in one way or another: during the academic year 2009-10, partly while composing Maailmamaa (2010), I lived in the Cité Internationale des Arts in Paris and attended electronic-music courses at IRCAM taught by Mikhail Malt, Alexis Baskind and Eric Daubresse. Logo (2013) was composed during Cursus, a year-long practical training course in composition and computer music at IRCAM in 2012-13, taught by Gregoire Lorieux (tutor), Éric Daubresse, Jean Lochard, Mikhail Malt, Alexander Mihalic and Mauro Lanza. It is thus probably natural that, from the electronics point of view, some stylistic similarities can be found in these three compositions (Maailmamaa, Logo and IDEA): for instance, approximately half of the Max modules of IDEA are partly based on those familiar from Logo.
Generally speaking, the aim has been to build IDEA's live-electronic Max modules as if they were instruments offering potential for nuanced musical expression. One of the ideals has been that each Max module, in its own genre, would enable a wide sound spectrum; the extremes of a module's sound spectrum may therefore be very contrasting. In IDEA, the module presets are defined in advance for each situation. For instance, if the situation so requires, when a particular Max module is turned on, its presets may have been defined as a series of events in which the live electronics change over a certain period, for example progressively, from one type of music to another.
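As a rough illustration of such a time-varying preset series (this is not the actual Max implementation; the module, its parameters and the timings are hypothetical), a preset could be ramped from one state to another over a given duration:

```python
# Minimal sketch, assuming a hypothetical harmonizer module whose preset
# is a dict of numeric parameters; in IDEA the real modules live in Max/MSP.
from dataclasses import dataclass

@dataclass
class PresetRamp:
    start: dict      # preset at the beginning of the event series
    end: dict        # preset at the end of the event series
    duration: float  # length of the transition in seconds

    def at(self, t: float) -> dict:
        """Linearly interpolate every parameter at time t (seconds)."""
        x = min(max(t / self.duration, 0.0), 1.0)
        return {k: (1 - x) * self.start[k] + x * self.end[k] for k in self.start}

# Hypothetical example: a harmonizer drifting from a narrow, dry transposition
# to a wide, wet cluster over 20 seconds.
ramp = PresetRamp(
    start={"transposition_cents": 50.0, "feedback": 0.1, "wet": 0.2},
    end={"transposition_cents": 700.0, "feedback": 0.6, "wet": 0.9},
    duration=20.0,
)
print(ramp.at(10.0))  # preset halfway through the transition
```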
In the context of electronic concert music, triggering sound files and live-electronic events usually brings along the problematics of timing: how does the computer know where in the piece the musicians are at any given timepoint? By what method are the acoustic and electronic worlds synchronized during the performance? Many different methods have been developed to determine 'what time it is in the composition'. The most common of these are probably keystrokes on the computer keyboard, pressing keys on a MIDI keyboard, or pressing a MIDI foot pedal at certain points in the work.
In order to choose a method, it is naturally good to consider many different things, including how many times during a performance the computer needs to know where in the composition the musicians are. For example, during a performance of Logo (for violin and electronics), the computer needs to know this timepoint 39 times. In Logo, the timing problematics of the electronics were solved so that the violinist presses a MIDI foot pedal altogether 39 times, at certain points, while playing. In IDEA, by contrast, the computer needs to know this timepoint approximately 1300 times. Since the number is so large, the question naturally arises whether the method should somehow be automated.
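For comparison, a manual solution along the lines of Logo's foot pedal can be thought of as a simple cue list that advances by one step on every pedal press. The following is only a hypothetical sketch in Python; in practice such cues are handled inside the Max/Antescofo environment, and the cue contents here are invented:

```python
# Minimal sketch of manual cue triggering: each pedal press advances to
# the next cue and fires its actions. All cue contents are hypothetical.
cues = [
    {"cue": 1, "actions": ["play soundfile_01.wav", "enable spectral delay"]},
    {"cue": 2, "actions": ["play soundfile_02.wav"]},
    # ... in Logo there are 39 such cues; in IDEA roughly 1300 trigger points.
]

position = 0  # index of the next cue to fire

def on_pedal_press():
    """Called whenever the performer presses the MIDI foot pedal."""
    global position
    if position < len(cues):
        for action in cues[position]["actions"]:
            print("trigger:", action)
        position += 1

on_pedal_press()  # fires cue 1
on_pedal_press()  # fires cue 2
```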
In spring 2017, I visited the Haus der Musik music museum in Vienna, where a conducting simulation called Virtual Conductor was on exhibition. The simulation included a baton with which the museum visitor could conduct, at his or her preferred tempo, the virtual Vienna Philharmonic Orchestra shown on screen. The repertoire included beloved classics such as the Viennese waltz The Blue Danube. This museum experience supported the idea of trying to use motion detection of a conductor, even with changing time signatures, and in a live-concert situation.
Detection and analysis of the conductor's movements during the performance is central to the temporal realization (synchronization) of the electronics in IDEA. A sensor (R-IoT) attached to the conductor's (right) hand (or baton) detects the conductor's movements. The information is sent to a computer (Max, Antescofo, Mubu For Max), which deduces where in the piece the musicians are at any given timepoint by comparing the conductor's live movements to corresponding movements stored in advance. If it works properly, this method is therefore a kind of automated solution to the timing problematics of the electronics of IDEA. Although this method was chosen, the synchronization between the electronics and the ensemble could in theory be implemented by another method, because the method itself is not supposed to affect the sound of the composition.
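To illustrate only the underlying principle of aligning a live gesture with a pre-stored one, here is a simplified sketch using dynamic time warping. It is not IRCAM's Gesture Follower (which performs probabilistic real-time alignment inside Mubu); the sensor data and the frame-to-beat map are invented for the example:

```python
# Minimal sketch: estimate where a live conducting gesture lies within a
# pre-recorded reference gesture, then map that position to a beat number.
# Hypothetical illustration only, not the Gesture Follower implementation.
import numpy as np

def dtw_position(reference: np.ndarray, live: np.ndarray) -> int:
    """Return the index in `reference` that best aligns with the last frame
    of `live`. Both arrays have shape (frames, features), e.g. 3-axis
    accelerometer data from the hand sensor."""
    n, m = len(reference), len(live)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(reference[i - 1] - live[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # reference advances
                                 cost[i, j - 1],      # live advances
                                 cost[i - 1, j - 1])  # both advance
    # The live gesture is still unfolding, so only its last frame must be
    # matched; the best-matching reference frame marks the current position.
    return int(np.argmin(cost[1:, m]))

# Hypothetical mapping from reference-frame index to beat number,
# e.g. one entry per recorded frame at 100 Hz (10 beats, 1 s each).
frame_to_beat = np.repeat(np.arange(1, 11), 100)

reference = np.random.randn(1000, 3)                       # pre-stored gesture
live = reference[:350] + 0.05 * np.random.randn(350, 3)    # partial live take

pos = dtw_position(reference, live)
print("estimated beat:", frame_to_beat[pos])
```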
In January 2018, the first motion-capture tests related to this project took place, involving three conducting students from Alain Altinoglu's conducting class at the Paris Conservatory (CNSMDP). The first tests were held at IRCAM in a studio environment without an orchestra. Later in January 2018 the tests continued, this time at the Paris Conservatory in front of an orchestra. During these tests, Serge Lemouton stored the students' movements on a computer.
After the preliminary tests were over, conductor Christian Karlsen visited IRCAM during the studio periods for a total of four days in May-June 2018. He conducted through IDEA a few times in the studio environment, without an ensemble, and Serge Lemouton stored Karlsen's right-hand movements on a computer. These recordings were used during the premiere of the work on July 5, 2018: the stored data were compared with the motion-detection data from Karlsen's live conducting, in order to make it possible for the computer to know where in the composition the musicians were at any given beat. In IDEA, the durations of beats vary relatively greatly; there is a rather large selection of various micro and additive time signatures, as well as temporal accelerations and decelerations.
The motion-detection system is to some extent experimental and has its limits. There is, for example, the risk that the conductor's interpretation in a live performance, in front of an ensemble, would be very different from his or her interpretation in a studio environment. In that case the motion-detection system could have difficulty recognizing where in the composition the musicians are. In the long run, however, the system will probably become more comprehensive, so that it can adapt to ever more varied interpretations. In the even longer run, a kind of final goal is that the conductor's movements would not need to be stored in advance at all. That is to say, hopefully some day it would no longer be necessary to compare the live motion-capture data to pre-stored data; instead, the computer would be able to understand instantly the grammar of the personal movements of (basically) any conductor. Such a day may come as artificial intelligence continues to evolve. After all, speech recognition, for example, has developed profoundly over the last few decades.
At IRCAM, motion-capture technology (in the field of music) has been developed since the early 2000s. Since 2008, Frederic Bevilacqua has been the head researcher of the Sound Music Movement Interaction team (ismm.ircam.fr). He visited the studio every once in a while during IDEA's IRCAM work periods. He created the software called Gesture Follower (first version in 2005, http://ismm.ircam.fr/gesture-follower/), which was later implemented in Max's Mubu toolbox (http://forumnet.ircam.fr/product/mubu-en/) by Riccardo Borghesi, Diemo Schwarz and Norbert Schnell. Emmanuel Flety has developed a physical motion-capture sensor called R-IoT. Computer music designer Serge Lemouton had worked with motion-capture technology on multiple projects prior to IDEA, but this was the first time he worked with motion detection of a conductor.
In the case of IDEA, the responsibilities for the electronics were divided as follows: Serge Lemouton was responsible for the motion-capture technology as well as for the overall architecture of the Max concert patch. Lemouton and I were jointly responsible for building the live-electronic Max modules. Once the live-electronic Max modules were completed, it was the composer's responsibility to create the live-electronic events, in other words to determine when and how the Max modules are used. The sound files were also my responsibility.
For the motion capture, Serge Lemouton used the Max and Antescofo software and the Mubu For Max toolbox. In collaboration with Lemouton, I used Max to prepare the live-electronic modules. During a performance of IDEA, the instruments are processed in real time by, for example, the following Max modules: frequency-shifting delay, cluster harmonizer delay, frequency-modulation harmonizer delay, spectral delay, playback-speed control, reverb and spatialization. For the realization of the sound files I used the following software: Sound Studio, AudioSculpt, Pro Tools, GRM Tools, Spear, Ircam-Spat, Max/MSP, OpenMusic, CSound, SuperVP Trax and cataRT. Self-recorded everyday sounds and IRCAM sound libraries were used as raw material for the sound files. Altogether, there are over 1000 sound files and more than 500 live-electronic events that are triggered during the course of the work in certain measures, on particular beats.
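To give a sense of how such a large number of cues can be organized (this is only a hypothetical Python illustration; in IDEA itself the cues are handled by Antescofo and the Max concert patch), each event can be keyed by its measure and beat and fired once the followed score position reaches that point:

```python
# Minimal sketch: events keyed by (measure, beat) and fired as the estimated
# score position advances. All cue contents are hypothetical.
from collections import defaultdict

events = defaultdict(list)
events[(12, 1.0)].append("play soundfile_037.wav")
events[(12, 2.5)].append("spectral delay: preset 4")
events[(13, 1.0)].append("spatialization: rotate sources clockwise")

fired = set()

def on_position(measure: int, beat: float):
    """Called by the score follower whenever a new beat is reached."""
    key = (measure, beat)
    if key in events and key not in fired:
        for action in events[key]:
            print(f"m.{measure} beat {beat}: {action}")
        fired.add(key)

on_position(12, 1.0)   # fires the first cue
on_position(12, 2.5)   # fires the second cue
```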
It was a unique experience and a great privilege to work for nine weeks in IRCAM's studios with computer music designer Serge Lemouton. He worked wisely, endeavoring to create a calm and concentrated atmosphere in the studio throughout the project. Lemouton has over 25 years of experience as a computer music designer at IRCAM and has carried out nearly 100 collaborative projects with various composers over the years. During those years, he has naturally encountered a wide range of composers, many different personalities, who may well have had quite different sound ideals and compositional goals.
IDEA was premiered by the International Contemporary Ensemble, conducted by Christian Karlsen, at the Time of Music Festival in Viitasaari Church on July 5, 2018. The French premiere of this co-commissioned composition was performed by Ensemble Court-Circuit, conducted by Jean Deroyer, at the Présences festival in Paris, at the Maison de la Radio, on February 13, 2019. In both premieres the sound design was by Luca Bagnoli (IRCAM) and the computer music design by Serge Lemouton (IRCAM). This project was quite international, involving institutions and people from multiple countries. It was very inspiring to have this magnificent opportunity to compose IDEA!
Sampo Haapamäki
Read the program notes in Finnish here: http://www.sampohaapamaki.com/html/IDEA.html
fl+pic+afl, cl+bcl, trp, trb, perc, hp, 2vln, vla, vlc, db, electronics
Electro-acoustic Works, Works for Orchestra or Large Ensemble
International Contemporary Ensemble, Christian Karlsen, conductor, Serge Lemouton, computer music design, Luca Bagnoli, sound engineer, Time of Music festival, Viitasaari, July 5, 2018
1. Neural networks, 2. Synaptic signals, 3. Reborn resolutions
Commissioned by Time of Music festival and IRCAM-Centre Pompidou (Serge Lemouton, computer music designer, Computer Music Design produced in the IRCAM - Centre Pompidou studios)
MF33071
184 pages
A3-portrait
401 pages
11"x14"-portrait
184 pages
A4-portrait