
A long-term project on Interactive Music and Improvised Composition. This work investigates compositional and performative strategies for establishing a musical collaboration between improviser and electronics.
The system relies on a set of musical interactions based on the multimodal analysis of the instrumentalist's behaviour: observation of embodied motion qualities (upper-body motion tracking - EyesWeb) and sonic parameters (audio feature analysis - MaxMSP). Expressive cues are computed by comparing the multimodal data. The analysed musical information organises and shapes the sonic output of the system, influencing various decision-making processes.
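The sketch below is a minimal, purely illustrative outline of this idea, not the actual InMuSIC mapping: the feature names, the congruence measure, and the behaviour weights are assumptions, standing in for the analysis done in EyesWeb and MaxMSP.

```python
import random
from dataclasses import dataclass

# Hypothetical feature bundles; the real system derives these from
# EyesWeb (upper-body motion tracking) and MaxMSP (audio analysis).
@dataclass
class MotionFeatures:
    quantity_of_motion: float   # 0..1, overall bodily activity
    smoothness: float           # 0..1, fluidity of the gesture

@dataclass
class AudioFeatures:
    loudness: float             # 0..1, normalised level
    spectral_flux: float        # 0..1, rate of spectral change

def expressive_cue(motion: MotionFeatures, audio: AudioFeatures) -> float:
    """Compare the two modalities: an illustrative congruence measure
    between gestural activity and sonic activity."""
    gesture_energy = 0.5 * (motion.quantity_of_motion + (1.0 - motion.smoothness))
    sonic_energy = 0.5 * (audio.loudness + audio.spectral_flux)
    return 1.0 - abs(gesture_energy - sonic_energy)  # 1 = congruent, 0 = divergent

def choose_behaviour(cue: float) -> str:
    """Let the cue bias a weighted choice among possible sonic behaviours."""
    weights = {
        "mirror":   cue,        # follow the player when modalities agree
        "contrast": 1.0 - cue,  # diverge when gesture and sound disagree
        "silence":  0.2,        # keep a small chance of withdrawing
    }
    r = random.uniform(0.0, sum(weights.values()))
    for behaviour, w in weights.items():
        r -= w
        if r <= 0.0:
            return behaviour
    return "mirror"

if __name__ == "__main__":
    m = MotionFeatures(quantity_of_motion=0.8, smoothness=0.3)
    a = AudioFeatures(loudness=0.7, spectral_flux=0.6)
    cue = expressive_cue(m, a)
    print(f"cue={cue:.2f} -> {choose_behaviour(cue)}")
```

Here a single scalar cue steers the decision-making; the system described above compares several such cues over time to shape its output.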
The mappings between the features analysed and the music generated were developed and refined through years of practice. The musical script embedded in InMuSIC can then be considered an auto-ethnographic inquiry into instrumental practice and the design of sonic interactions.
Documentation on previous iterations of the system:
This project started at STEIM and at the Institute of Sonology during the Master by Research "Instruments & Interfaces". The movement analysis implemented in InMuSIC is based on the EyesWeb platform. Various free improvisers tried the system, and the musical results of these encounters opened up many interesting conversations on the system's potential to be used with a broader range of instruments.
Listen to some excerpts from recordings of a few brilliant musicians and improvisers trying the system for the first time.
Semay Wu - Cello
Pete Furniss - Clarinet
Tomer Baruch - Rhodes