BriefGPT.xyz
Feb, 2020
Music2Dance: Music-driven Dance Generation using WaveNet
Wenlin Zhuang, Congyi Wang, Siyu Xia, Jinxiang Chai, Yangang Wang
TL;DR
This paper proposes DanceNet, a novel autoregressive generative model that takes the music's style, rhythm, and melody as control signals to generate highly realistic and diverse 3D dance motions. To improve model performance, the authors captured several synchronized music-dance pairs performed by professional dancers, and experiments show the proposed method achieves state-of-the-art results.
Abstract
In this paper, we propose a novel system, named Music2Dance, for fully automatic music-driven choreography. Our key idea is to shift WaveNet, which was originally designed for speech generation, to human motion synthesis.
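The key idea described above is an autoregressive model that, like WaveNet, predicts the next frame from previous frames, here conditioned on music features rather than generating raw audio. The following is a minimal illustrative sketch of that conditioning pattern, not the authors' DanceNet implementation; all names and the toy "model" are assumptions made for clarity.

```python
import math
import random

# Illustrative sketch (NOT the paper's code): autoregressively emit one
# pose vector per music frame, conditioning each step on the previous
# poses plus a per-frame music feature (standing in for style/rhythm/
# melody signals). A real model would replace the toy update with a
# learned WaveNet-style network of dilated causal convolutions.

def generate_dance(music_features, pose_dim=4, context=2, seed=0):
    """Return one pose (list of joint values) per music frame."""
    rng = random.Random(seed)
    poses = [[0.0] * pose_dim for _ in range(context)]  # neutral start poses
    out = []
    for feat in music_features:
        prev = poses[-context:]
        # Toy autoregressive step: average the recent poses, then
        # modulate by the music feature and a small random perturbation.
        new_pose = [
            sum(p[d] for p in prev) / context
            + 0.1 * math.sin(feat)
            + 0.01 * rng.uniform(-1.0, 1.0)
            for d in range(pose_dim)
        ]
        poses.append(new_pose)
        out.append(new_pose)
    return out

motion = generate_dance([0.0, 0.5, 1.0, 1.5], pose_dim=3)
print(len(motion), len(motion[0]))  # one pose per music frame
```

The point of the sketch is only the data flow: each output frame depends on previously generated frames plus the current music condition, which is what distinguishes music-driven generation from unconditional motion synthesis.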