DAWZY: A New Addition to AI-Powered "Human-in-the-Loop" Music Co-creation
Abstract
Digital Audio Workstations (DAWs) offer fine-grained control, but mapping high-level intent (e.g., “warm the vocals”) to low-level edits breaks creative flow. Existing artificial intelligence (AI) music generators are typically one-shot, limiting opportunities for iterative development and human contribution. We present \textit{DAWZY}, an open-source assistant that turns natural-language (text/voice/hum) requests into reversible actions in REAPER, keeping the DAW as the creative hub through a minimal GUI and a voice-first interface. \textit{DAWZY} uses LLM-based code generation to reduce the time users spend learning large interfaces, replacing hundreds of buttons and drop-downs with a chat box. It also exposes three Model Context Protocol (MCP) tools for live state queries, parameter adjustment, and AI beat generation. It maintains grounding by refreshing project state before each mutation, and ensures safety and reversibility through atomic scripts and undo support. In evaluations, \textit{DAWZY} performed reliably on common production tasks and was rated positively by users across Usability, Control, Learning, Collaboration, and Enjoyment.
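To make the abstract's architecture concrete, the sketch below illustrates how three MCP tools of the kind described (state query, parameter adjustment, beat generation) could be registered, with each mutating edit wrapped in an undo block for reversibility. This is an illustration under stated assumptions, not \textit{DAWZY}'s actual implementation: the tool names, server name, and the run_reascript() bridge are hypothetical; only the MCP Python SDK's FastMCP interface and REAPER's Undo_BeginBlock/Undo_EndBlock calls are taken from their public APIs.

\begin{verbatim}
# Illustrative sketch only; tool names and the REAPER bridge are assumptions.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("dawzy")  # hypothetical server name


def run_reascript(script: str) -> str:
    """Hypothetical bridge: execute a generated ReaScript in REAPER,
    return its textual output."""
    raise NotImplementedError("placeholder for the REAPER-side bridge")


@mcp.tool()
def get_project_state() -> str:
    """Live state query: refresh track/FX/parameter state before any mutation."""
    return run_reascript("-- ReaScript that serializes the current project state")


@mcp.tool()
def set_parameter(track: int, fx: str, param: str, value: float) -> str:
    """Parameter adjustment wrapped in an undo block so the edit stays reversible."""
    script = f"""
    reaper.Undo_BeginBlock()
    -- ... locate track {track}, FX '{fx}', set '{param}' to {value} ...
    reaper.Undo_EndBlock("DAWZY: set {param}", -1)
    """
    return run_reascript(script)


@mcp.tool()
def generate_beat(style: str, bars: int) -> str:
    """AI beat generation: insert a generated beat as a new item (sketch only)."""
    return run_reascript(f"-- insert a generated {bars}-bar {style} beat on a new track")


if __name__ == "__main__":
    mcp.run()
\end{verbatim}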