A Loopy Framework and Tool for Real-time Human-AI Music Collaboration
Abstract
We propose that looping provides an especially effective framework for real-time human–AI musical collaboration. Within this setting, we introduce SmartLooper, a system designed to support improvisation through responsive and evolving loop generation. The musician first records a personal dataset of musical fragments. During performance, the musician selects a starting point, and the system traverses the dataset via a stochastic process, selecting loop segments based on distances computed in an embedding space derived from a pretrained diffusion model. This enables smooth yet varied transitions, allowing the system to continually evolve while retaining the performer's stylistic identity. The musician can then layer new lines and textures over the evolving loop, creating a fluid and co-creative improvisation.
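To make the selection step concrete, the following is a minimal sketch of one plausible realization of the stochastic traversal: distances from the current segment in a precomputed embedding space are converted into transition probabilities via a temperature-scaled softmax, so that nearby segments are favored while more distant ones remain reachable. The function name, the softmax weighting, and the `temperature` parameter are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def next_segment(current_idx, embeddings, temperature=0.1, rng=None):
    """Sample the next loop segment: segments closer to the current one
    in embedding space are more likely, keeping transitions smooth yet varied."""
    rng = rng or np.random.default_rng()
    # Euclidean distances from the current segment to every segment.
    dists = np.linalg.norm(embeddings - embeddings[current_idx], axis=1)
    dists[current_idx] = np.inf           # never repeat the same segment
    # Softmax over negative distances: nearby segments get higher weight;
    # lower temperature concentrates mass on the nearest neighbors.
    logits = -dists / temperature
    logits -= logits.max()                # numerical stability
    probs = np.exp(logits)
    probs /= probs.sum()
    return rng.choice(len(embeddings), p=probs)

# Example: a short random walk over 200 fragments with 64-dim embeddings.
emb = np.random.default_rng(0).normal(size=(200, 64))
idx = 0
for _ in range(8):
    idx = next_segment(idx, emb)
```

In this sketch the temperature controls the smoothness/variety trade-off: a low value makes the walk hug the performer's nearest material, while a higher value admits more surprising jumps.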