From the same archive

Mettre en temps une structure musicale : l'activité de composition de Voi(rex) par Philippe Leroux - Nicolas Donin, Jacques Theureau

April 14, 2005 01 h 01 min

Mettre en temps une structure musicale : l'activité de composition de Voi(rex) par Philippe Leroux - Nicolas Donin, Jacques Theureau

April 14, 2005 24 min

L'estimation de fréquences fondamentales multiples

May 12, 2005 52 min

La harpe électroacoustique

February 4, 2005 01 h 18 min

Utilisation de Modalys pour le projet VoxStruments, lutherie numérique intuitive et expressive - Nicholas Ellis, Joël Bensoam

October 17, 2007 49 min

Présentation des travaux de l'équipe PdS dans le cadre du projet européen CLOSED : "Closing the Loop of Sound Evaluation and Design" - Olivier Houix

June 27, 2007 01 h 12 min

Sparse overcomplete methods, matching pursuit and basis pursuit - Bob L. Sturm

July 11, 2007 48 min

Transformations de type et de nature de la voix - Snorre Farner, Axel Roebel, Xavier Rodet

September 12, 2007 01 h 07 min

Segmentations et reconnaissances automatiques de phonèmes de la voix, temps différé, temps réel - Pierre Lanchantin, Julien Bloit, Xavier Rodet

September 19, 2007 01 h 13 min

Synthèse de la parole à partir du texte et construction d'une base de données d'unités de la voix - Christophe Veaux, Grégory Beller, Xavier Rodet

September 26, 2007 01 h 00 min

Projet ECOUTE - Jerome Barthelemy, Nicolas Donin, Geoffroy Peeters, Samuel Goldszmidt

October 3, 2007 01 h 12 min

Projet MusicDiscover - David Fenech Saint Genieys

October 10, 2007 01 h 10 min

Projet CASPAR - Jerome Barthelemy, Alain Bonardi

October 24, 2007 50 min

Projet CONSONNES 1ère partie - René Caussé, Vincent Freour, David Roze

November 21, 2007 57 min

GesTCom: A sensor-based environment for the analysis, processing and real-time control of complex piano notation through multimodal recordings


In my PhD thesis I proposed a novel paradigm of pianists' interaction with complex music notation, termed embodied navigation. Its novelty lies in rethinking the classic notion of interpretation as interaction, and performance itself as a dynamic system. The primacy of the performer's embodied experience and the inherent plasticity of music notation are the paradigm's central features: embodiment is shown to constantly shape the comprehension of notation and to transform it in real time.

GesTCom (Gesture Cutting through Textual Complexity) has been developed at IRCAM since 2014 in collaboration with the Interaction-Son-Musique-Mouvement team. It materializes the embodied-navigation paradigm in a dedicated interactive system: a modular, sensor-based environment for the analysis, processing and real-time control of complex piano notation through multimodal recordings. In terms of hardware, it comprises systems for the capture of movement, audio, video, MIDI and capacitive data from sensors on the piano keys. In terms of software, it offers modules for the capture, analysis and control of the multimodal data, and modules for the augmentation and interactive control of music notation. Each of these systems functions both stand-alone and as part of the general methodology of embodied navigation.
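Since the abstract describes the capture architecture only in general terms, here is a minimal Python sketch of what such a multimodal capture module could look like: MIDI events from the piano and motion-sensor data arriving over OSC are stamped on a shared clock so the streams can later be aligned for analysis. The mido and python-osc libraries, the OSC port 9000 and the session.json log file are illustrative assumptions, not details of the actual GesTCom implementation.

import json
import threading
import time

import mido                                    # MIDI I/O (needs a backend such as python-rtmidi)
from pythonosc.dispatcher import Dispatcher    # OSC message routing
from pythonosc.osc_server import BlockingOSCUDPServer

T0 = time.monotonic()    # shared clock: every stream is stamped relative to this origin
events = []              # chronological log of {timestamp, stream, payload} records
lock = threading.Lock()

def log_event(stream, payload):
    # Stamp an event on the shared clock; the lock keeps the two
    # capture threads from interleaving their writes.
    with lock:
        events.append({"t": time.monotonic() - T0, "stream": stream, "data": payload})

def on_motion(address, *args):
    # Hypothetical handler for motion data sent over OSC by a body-worn sensor.
    log_event("motion", {"address": address, "values": list(args)})

def midi_loop(port_name):
    # Iterating over a mido input port blocks until each MIDI message arrives.
    with mido.open_input(port_name) as port:
        for msg in port:
            log_event("midi", msg.dict())

if __name__ == "__main__":
    # MIDI capture runs in a background thread; OSC capture occupies the main thread.
    midi_port = mido.get_input_names()[0]      # assumes at least one MIDI input exists
    threading.Thread(target=midi_loop, args=(midi_port,), daemon=True).start()

    dispatcher = Dispatcher()
    dispatcher.set_default_handler(on_motion)  # route every incoming OSC message to the motion logger
    server = BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher)
    try:
        server.serve_forever()
    except KeyboardInterrupt:
        with open("session.json", "w") as f:
            json.dump(events, f, indent=2)     # one time-aligned multimodal log for later analysis

Aligned this way, a keystroke and the gesture that produced it share a single timeline, which is the precondition for the kind of analysis and notation augmentation the abstract describes.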

After an overview of the theoretical framework in embodied cognition and human-machine interaction, the talk will focus on GesTCom. I will present its technical features; its initial goals regarding representation and interaction; its current applications, including performance analysis, embodied interactive learning, contemporary composition, free improvisation, piano pedagogy and score following; and its future directions, including dissemination in selected communities of performers, web-based collaborative learning, further refinement through the integration of machine learning and MIR techniques, application in studies of sensorimotor learning and prediction, and the creation of interactive systems that learn along with the performer in a human, embodied way.

speakers

Pavlos Antoniadis

information

Type: Seminar / Lecture
Location: Ircam, Salle Igor-Stravinsky (Paris)
Duration: 55 min
Date: October 15, 2018

Pavlos Antoniadis presents his thesis, defended in June 2018.

