Demonstration
Tue Dec 04 07:45 AM -- 04:30 PM (PST) @ Room 510 ABCD #D8
A model-agnostic web interface for interactive music composition by inpainting
Gaëtan Hadjeres · Théis Bazin · Ashis Pati

We present a web-based interface that allows users to compose symbolic music interactively using generative models for music. We strongly believe that such models only reveal their potential when used by artists and creators. While generative models for music have been around for a while, the design of interactive interfaces aimed at music creators is only just emerging. We contribute to this emerging area by providing a general web interface for many music generation models so that researchers in the domain can easily test and promote their work. We hope that the present work will contribute to making A.I.-assisted composition accessible to a wider audience, from non-musicians to professional musicians. This work is a concrete application of music inpainting as a creative tool and could additionally be of interest to researchers for testing and evaluating their models. We show how this system (generative model + interface) can be used with different inpainting algorithms in an actual music production environment. The key elements of novelty are: (a) an easy-to-use and intuitive interface for users, (b) an easy-to-plug interface allowing researchers to explore the potential of their music generation algorithms, (c) a web-based and model-agnostic framework, (d) integration of existing music inpainting algorithms, (e) a novel inpainting algorithm for folk music, (f) novel paradigms for A.I.-assisted music composition and live performance, and (g) integration in professional music production environments.
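
To give a sense of what an "easy-to-plug", model-agnostic back-end could look like, here is a minimal sketch of an HTTP inpainting service in Python using Flask. The class, endpoint, and field names (InpaintingModel, /inpaint, "sheet", "region") are illustrative assumptions and are not taken from the authors' implementation.

    # Minimal sketch of a model-agnostic inpainting back-end.
    # Hypothetical names; any generative model able to regenerate a selected
    # region of a symbolic score could be plugged in behind one endpoint.
    from abc import ABC, abstractmethod
    from flask import Flask, request, jsonify


    class InpaintingModel(ABC):
        """Interface a researcher's model would implement to be plugged in."""

        @abstractmethod
        def inpaint(self, sheet: dict, region: dict) -> dict:
            """Regenerate the notes inside `region`, keeping the rest of `sheet`."""


    class DummyModel(InpaintingModel):
        """Placeholder model: returns the sheet unchanged (stand-in for a real generator)."""

        def inpaint(self, sheet: dict, region: dict) -> dict:
            return sheet


    def make_app(model: InpaintingModel) -> Flask:
        """Wrap any InpaintingModel in a small web service the interface can call."""
        app = Flask(__name__)

        @app.route("/inpaint", methods=["POST"])
        def inpaint():
            payload = request.get_json()
            new_sheet = model.inpaint(payload["sheet"], payload["region"])
            return jsonify({"sheet": new_sheet})

        return app


    if __name__ == "__main__":
        # Swapping DummyModel for a trained generative model is all a researcher
        # would need to do to expose it to the web interface.
        make_app(DummyModel()).run(port=5000)

Under this kind of boundary, the front-end only ever talks to the inpainting endpoint, which is what makes such a framework model-agnostic: researchers can swap generative models without touching the interface.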