Jérôme Nika is a researcher at Ircam whose work focuses on modeling, learning, and navigating an “artificial musical memory” in creative contexts. In opposition to a “replacement approach” in which AI would substitute for humans, this research aims to design novel creative practices involving a certain level of symbolic abstraction, such as “interpreting / improvising the intentions” and “composing the narration”. The latest environments resulting from his research are Dicy2 for Max and Dicy2 for Ableton Live.


The design of AI-empowered musical practices involves research on the integration of temporal specifications (scenarios) into music generation processes; on the dialectic between reactivity and planning in human-computer creative interaction; and on the real-time detection and inference of underlying structure in an audio stream. This research is conducted in constant, long-term interaction with expert musicians and encompasses the entire process, from modeling and development through artistic validation to implementation in ambitious artistic productions.

Jérôme Nika’s PhD work Guiding Human-Computer music improvisation (Young Researcher Prize in Science and Music, 2015; Young Researcher Prize awarded by the French Association of Computer Music, 2016) focused on the introduction of authoring, composition, and control in human-computer music co-improvisation.

He then developed the Dicy2 for Max and Dicy2 for Ableton Live generative musical agents, which combine machine learning models and generative processes with reactive listening modules. This library offers a collection of “agents/instruments” embedding a continuum of strategies, ranging from pure autonomy to meta-composition, thanks to an abstract “scenario” structure.