arXiv Open Access 2021

Dance2Music: Automatic Dance-driven Music Generation

Gunjan Aggarwal, Devi Parikh

Abstract

Dance and music typically go hand in hand. The complexities in dance, music, and their synchronisation make them fascinating to study from a computational creativity perspective. While several works have looked at generating dance for a given piece of music, automatically generating music for a given dance remains under-explored. This capability could have several creative expression and entertainment applications. We present some early explorations in this direction: a search-based offline approach that generates music after processing the entire dance video, and an online approach that uses a deep neural network to generate music on-the-fly as the video proceeds. We compare these approaches to a strong heuristic baseline via human studies and present our findings. We have integrated our online approach in a live demo! A video of the demo can be found here: https://sites.google.com/view/dance2music/live-demo.
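To make the offline/online distinction concrete, the following is a minimal, hypothetical sketch of what an *online* dance-driven note generator could look like: it emits one note per incoming video frame with no lookahead, mapping a per-frame motion-energy value to a pitch as frames arrive. The motion feature, the pentatonic pitch mapping, and all function names here are illustrative assumptions, not the paper's actual method.

```python
# Hypothetical illustration of online (frame-by-frame, no lookahead)
# dance-driven music generation. NOT the paper's method.

C_MAJOR_PENTATONIC = [60, 62, 64, 67, 69]  # MIDI pitches C4 D4 E4 G4 A4

def motion_energy(prev_pose, pose):
    """Toy motion feature: mean absolute joint displacement between frames."""
    return sum(abs(a - b) for a, b in zip(prev_pose, pose)) / len(pose)

def frame_to_note(energy, max_energy=1.0):
    """Quantize a normalized motion-energy value into a pentatonic pitch."""
    level = min(int(energy / max_energy * len(C_MAJOR_PENTATONIC)),
                len(C_MAJOR_PENTATONIC) - 1)
    return C_MAJOR_PENTATONIC[level]

def generate_online(pose_stream):
    """Emit one note per incoming frame, using only past frames
    (unlike an offline approach, which sees the whole video first)."""
    notes = []
    prev = None
    for pose in pose_stream:
        if prev is not None:
            notes.append(frame_to_note(motion_energy(prev, pose)))
        prev = pose
    return notes

# Synthetic 2-D "pose" stream standing in for per-frame keypoints.
stream = [[0.1 * t, 0.05 * t] for t in range(6)]
print(generate_online(stream))
```

An offline, search-based variant would instead buffer the full pose sequence and optimize the whole note sequence jointly before playback.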


Authors (2)

Gunjan Aggarwal
Devi Parikh

Citation Format

Aggarwal, G., & Parikh, D. (2021). Dance2Music: Automatic Dance-driven Music Generation. https://arxiv.org/abs/2107.06252

Journal Information

Year Published: 2021
Language: English
Source Database: arXiv
Access: Open Access ✓