DYCI2 Project

Topics: Human-machine co-improvisation. Automatic chord extraction, and discovery and inference of harmonic progressions in real-time audio signals.

ANR Project 2017-2020

The collaborative research and development project DYCI2, Creative Dynamics of Improvised Interaction (Ircam, Inria Nancy, EHESS, UCSD, Univ. La Rochelle), focuses on conceiving, adapting, and bringing into play efficient models of artificial listening, learning, interaction, and generation of musical content. It aims to develop creative and autonomous digital musical agents able to take part in various human projects in an interactive and artistically credible way, and ultimately to contribute to the perceptual and communication skills of embedded artificial intelligence.

The areas concerned are live performance, production, pedagogy, and active listening. The project's main research issue is conceiving multi-agent architectures and models of knowledge and decision in order to explore scenarios of musical co-improvisation involving human and digital agents.

The objective is to merge the usually mutually exclusive "free", "reactive", and "scenario-based" paradigms in interactive music generation, in order to adapt to a wide range of musical contexts involving hybrid temporality and multimodal interactions.
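To make the contrast between these paradigms concrete, here is a minimal illustrative sketch, not the DYCI2 library's actual API: all names and the data model (a "memory" of labeled events, a "scenario" as a target label sequence such as a chord progression) are assumptions introduced for this example. Scenario-based generation follows the predefined label sequence; the reactive part revises the not-yet-played part of the plan when a live event contradicts it.

```python
import random

# Assumed data model (illustration only, not the DYCI2 API):
# - memory: list of (label, content) events learned from a musician
# - scenario: target label sequence, e.g. a chord progression

def generate(memory, scenario, rng=None):
    """Scenario-based generation: for each target label, pick a
    memory event whose label matches, or None if none is available."""
    rng = rng or random.Random(0)
    out = []
    for label in scenario:
        candidates = [content for (lab, content) in memory if lab == label]
        out.append(rng.choice(candidates) if candidates else None)
    return out

def react(scenario, position, heard_label):
    """Reactive revision: rewrite the unplayed suffix of the scenario
    starting at `position` to follow the label just heard live."""
    return scenario[:position] + [heard_label] * (len(scenario) - position)

memory = [("C", "c3"), ("F", "f3"), ("G", "g3"), ("C", "c4")]
scenario = ["C", "F", "G", "C"]
print(generate(memory, scenario))   # one matching content per scenario step
print(react(scenario, 2, "Am"))     # plan revised from step 2 onward
```

A merged agent would loop these two steps: generate ahead along the scenario, and call the reactive revision whenever listening detects a departure from the plan, which is one way to read "hybrid temporality" above.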

YouTube playlist of the latest DYCI2 experiments