Spotlight
Museformer: Transformer with Fine- and Coarse-Grained Attention for Music Generation
Botao Yu · Peiling Lu · Rui Wang · Wei Hu · Xu Tan · Wei Ye · Shikun Zhang · Tao Qin · Tie-Yan Liu

Thu Dec 08 05:00 PM -- 07:00 PM (PST)

Symbolic music generation aims to generate music scores automatically. A recent trend is to use the Transformer or its variants for music generation; this is suboptimal, however, because full attention cannot efficiently model the typically long music sequences (e.g., over 10,000 tokens), and existing models fall short in generating musical repetition structures. In this paper, we propose Museformer, a Transformer with a novel fine- and coarse-grained attention for music generation. Specifically, with the fine-grained attention, a token in a given bar directly attends to all the tokens of the bars most relevant to music structure (e.g., the previous 1st, 2nd, 4th and 8th bars, selected via similarity statistics); with the coarse-grained attention, a token attends only to a summarization of each remaining bar rather than to its individual tokens, which reduces the computational cost. The advantages are two-fold. First, Museformer captures music structure-related correlations via the fine-grained attention, and other contextual information via the coarse-grained attention. Second, it is efficient, modeling music sequences over 3x longer than its full-attention counterpart can. Both objective and subjective experimental results demonstrate its ability to generate long music sequences with high quality and better structure.
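To make the attention pattern concrete, here is a minimal illustrative sketch (not the paper's implementation) of the fine-grained part of the mask described above: each token may directly attend to tokens in its own bar and in the previous 1st, 2nd, 4th, and 8th bars, while all remaining bars would be reachable only through coarse-grained summary tokens (not constructed here). The function name and the token-to-bar input format are assumptions for illustration.

```python
# Sketch of the fine-grained attention mask from the abstract.
# Assumption: tokens are given as a flat sequence with a bar index per token.
STRUCTURE_OFFSETS = (1, 2, 4, 8)  # structure-related bar offsets named in the abstract

def fine_grained_mask(bar_ids):
    """bar_ids[i] = index of the bar containing token i.

    Returns mask[i][j] = True iff token i may directly attend to token j.
    Causality is enforced: a token never attends to later positions.
    """
    n = len(bar_ids)
    mask = [[False] * n for _ in range(n)]
    for i in range(n):
        # Bars visible to token i: its own bar plus the selected previous bars.
        visible = {bar_ids[i]} | {bar_ids[i] - d for d in STRUCTURE_OFFSETS}
        for j in range(i + 1):  # causal: only current or earlier tokens
            if bar_ids[j] in visible:
                mask[i][j] = True
    return mask
```

For example, with two tokens per bar across four bars (`bar_ids = [0, 0, 1, 1, 2, 2, 3, 3]`), a token in bar 3 can directly attend to bars 1, 2, and 3, but not to bar 0, which is three bars back and therefore not among the selected offsets; in the full model, bar 0 would be covered by the coarse-grained summary attention instead.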

Author Information

Botao Yu (Nanjing University)

Master's student at Nanjing University. My research interests are deep learning for NLP, music, and other domains, aiming to advance AI technology in various applications. I am looking for a PhD opportunity anywhere in the world.

Peiling Lu (Microsoft)
Rui Wang (Microsoft Research Asia)
Wei Hu (Nanjing University)
Xu Tan (Microsoft Research)
Wei Ye (Peking University)

Dr. Wei Ye is an associate professor at the National Engineering Research Center for Software Engineering, Peking University. In 2011, he received his doctorate from the School of Electronics Engineering and Computer Science, Peking University, working with Prof. Shikun Zhang. Wei Ye has a broad interest in real-world problems related to programming languages, natural languages, and knowledge graphs. More specifically, he is currently conducting research on information extraction, natural language generation, and deep-learning-based program analysis.

Shikun Zhang (Peking University)
Tao Qin (Microsoft Research)
Tie-Yan Liu (Microsoft Research)
