eTu{d,b}e: DEVELOPING AND PERFORMING SPATIALIZATION MODELS FOR IMPROVISING MUSICAL AGENTS

Authors

  • Kasey Pocius CIRMMT/IDMIL, Schulich School of Music, McGill University, Canada
  • Tommy Davis CIRMMT/IDMIL, Schulich School of Music, McGill University, Canada
  • Vincent Cusson CIRMMT/IDMIL, Schulich School of Music, McGill University, Canada

DOI:

https://doi.org/10.17501/23572744.2023.10102

Keywords:

Musical agents, augmented instrument, improvised performance, spatial audio

Abstract

The eTube is a simple acoustic instrument outfitted with a microphone and a two-button controller, which we program to facilitate interaction between an improviser and improvising musical agents. Through an iterative research-creation process, we develop and perform various etudes with the eTube and musical agents, generating new knowledge through musical creation. This process has led to two interactive spatialization systems for the eTube.

We begin by describing the eTu{d,b}e framework, which encompasses the eTube instrument and a series of improvised etudes based on human-computer musical interactions. We present an overview of the instrument and of the existing systems as they stood when Pocius began the spatialization portion of this project. We then outline two interactive spatialization systems designed for improvised performance with the eTube and musical agents, starting with an overview of the updates made to the eTu{d,b}e framework and followed by a description of each system. Finally, six performances are presented as case studies that highlight the advancements and challenges of the project as new approaches were developed.


References

Albrecht, Robert, Jussi Jaatinen, and Tapio Lokki. (2017). Electronic hearing protection for musicians. In Proceedings of the Sound and Music Computing Conference, 306–13.

Aska, Alyssa, and Martin Ritter. (2016). Approaches to real time ambisonic spatialization and sound diffusion using motion capture. In Proceedings of the International Computer Music Conference.

Bauer, Valentin, Dimitri Soudoplatoff, Leonard Menon, and Amandine Pras. (2022). Binaural headphone monitoring to enhance musicians’ immersion in performance. In Advances in Fundamental and Applied Research on Spatial Audio, edited by Brian F. G. Katz and Piotr Majdak, 193–219. Rijeka: IntechOpen.

Bowers, John, and Phil Archer. (2005). Not hyper, not meta, not cyber but infra-instruments. In Proceedings of the International Conference on New Interfaces for Musical Expression, 5–10.

Bown, Oliver, Arne Eigenfeldt, Aengus Martin, Benjamin Carey, and Philippe Pasquier. (2013). The Musical Metacreation weekend: Challenges arising from the live presentation of musically Metacreative systems. In Proceedings of the Artificial Intelligence and Interactive Digital Entertainment Conference, 27–34.

Braasch, Jonas. (2019). Hyper-specializing in saxophone using acoustical insight and Deep Listening skills. Edited by Rolf Bader, Marc Leman, and Rolf-Inge Godoy. Vol. 6. Current Research in Systematic Musicology. Cham, Switzerland: Springer.

Brümmer, Ludger, Götz Dipper, David Wagner, Holger Stenschke, and Jochen Arne Otto. (2014). New developments for spatial music in the context of the ZKM Klangdom: A review of technologies and recent productions. Divergence Press 3 (1).

Carpentier, Thibaut, Markus Noisternig, and Olivier Warusfel. (2015). Twenty years of Ircam Spat: Looking back, looking forward. In Proceedings of the International Computer Music Conference, 270–77.

Cook, Perry. (2001). Principles for designing computer music controllers. In Proceedings of the International Conference on New Interfaces for Musical Expression, 1–4.

Cusson, Vincent, and Tommy Davis. (2022). eTu{d,b}e: A preliminary conduit. In Proceedings of the International Conference on New Interfaces for Musical Expression.

Davis, Tommy, Kasey Pocius, Vincent Cusson, Philippe Pasquier, and Marcelo M. Wanderley. (2023). eTu{d,b}e: Case studies in playing with musical agents. In Proceedings of the International Conference on New Interfaces for Musical Expression.

Federman, Jeremy, and Todd Ricketts. (2008). Preferred and minimum acceptable listening levels for musicians while using floor and in-ear monitors. Journal of Speech, Language, and Hearing Research 51 (1): 147–59.

Hunt, Andy, Marcelo M. Wanderley, and Matthew Paradis. (2003). The importance of parameter mapping in electronic instrument design. Journal of New Music Research 32 (4): 429–40.

Kafejian, Sergio. (2023). The use of an interactive system in different improvisational contexts. Pre-print, received via personal communication with Tommy Davis.

Meneses, Eduardo. (2022). Iterative design in DMIs and AMIs: Expanding and embedding a high-level gesture vocabulary for T-Stick and GuitarAMI. PhD thesis, McGill University (Canada).

Miranda, Eduardo Reck, and Marcelo M. Wanderley. (2006). New Digital Musical Instruments: Control and interaction beyond the keyboard. Vol. 21. A-R Editions, Inc.

Nika, Jérôme, Ken Déguernel, Axel Chemla-Romeu-Santos, Emmanuel Vincent, and Gérard Assayag. (2017). DYCI2 agents: Merging the ‘free,’ ‘reactive,’ and ‘scenario-based’ music generation paradigms. In Proceedings of the International Computer Music Conference.

Nika, Jérôme, Augustin Muller, Joakim Borg, Gérard Assayag, and Matthew Ostrowski. (2022). Dicy2 for Max. Ircam UMR STMS 9912. https://hal.science/hal-03892611.

Østern, Tone Pernille, Sofia Jusslin, Kristian Nødtvedt Knudsen, Pauliina Maapalo, and Ingrid Bjørkøy. (2023). A performative paradigm for post-qualitative inquiry. Qualitative Research 23 (2): 272–89.

Pocius, Kasey. (2023). Expanding spatialization tools for various DMIs. Master’s thesis, McGill University.

Rowe, Robert. (2004). Machine musicianship. MIT Press.

Tatar, Kivanç, and Philippe Pasquier. (2017). MASOM: A musical agent architecture based on Self-Organizing Maps, affective computing, and variable Markov models. In Proceedings of the International Workshop on Musical Metacreation, 1–8.

Tatar, Kivanç, and Philippe Pasquier. (2019). Musical agents: A typology and state of the art towards Musical Metacreation. Journal of New Music Research 48 (1): 56–105.

Thelle, Notto J. W., and Philippe Pasquier. (2021). Spire Muse: A virtual musical partner for creative brainstorming. In Proceedings of the International Conference on New Interfaces for Musical Expression.

Thelle, Notto J. W., and Bernt Isak Wærstad. (2023). Co-Creative Spaces: The machine as a collaborator. In Proceedings of the International Conference on New Interfaces for Musical Expression.

Tremblay, Pierre Alexandre. (2017). Tuning to trust: System calibration as creative enabler. In Proceedings of the International Computer Music Conference, 179–84.

Tremblay, Pierre Alexandre, Nicolas Boucher, and Sylvain Pohu. (2007). Real-time processing on the road: A guided tour of [Iks]’s Abstr/Cncr Setup. In Proceedings of the International Computer Music Conference.

Tremblay, Pierre Alexandre, Gerard Roma, and Owen Green. (2021). Enabling programmatic data mining as musicking: The Fluid Corpus Manipulation Toolkit. Computer Music Journal 45 (2): 9–23.


Published

2024-01-23

How to Cite

Pocius, K., Davis, T., & Cusson, V. (2024). eTu{d,b}e: DEVELOPING AND PERFORMING SPATIALIZATION MODELS FOR IMPROVISING MUSICAL AGENTS. Proceeding of the International Conference on Arts and Humanities, 10(1), 20–37. https://doi.org/10.17501/23572744.2023.10102