StyleVR: Stylizing character animations with normalizing flows

Published in IEEE Transactions on Visualization and Computer Graphics, 2023

The significance of artistry in creating animated virtual characters is widely acknowledged, and motion style is a crucial element in this process. There has been long-standing interest in stylizing character animations with style transfer methods. However, such models can only handle short-term motions and yield deterministic outputs. To address this issue, we propose a generative model based on normalizing flows for stylizing long and aperiodic animations in VR scenes. Our approach breaks the task down into two sub-problems, motion style transfer and stylized motion generation, each formulated as an instance of a conditional normalizing flow with a multi-class latent space. Specifically, we encode high-frequency style features into the latent space to produce varied results, and we control the generation process with style-content labels to enable disentangled edits of style and content. We have developed a prototype, StyleVR, in Unity, which allows casual users to apply our method in VR. Through qualitative and quantitative comparisons, we demonstrate that our system outperforms other methods in terms of style transfer as well as stochastic stylized motion generation.
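The core building block mentioned above, a conditional normalizing flow, can be illustrated with a minimal sketch. The snippet below is a hypothetical single affine coupling layer conditioned on a one-hot style-content label; the paper's actual architecture, network sizes, and conditioning scheme are not specified here, and NumPy with random weights stands in for a trained network.

```python
import numpy as np

rng = np.random.default_rng(0)

class ConditionalAffineCoupling:
    """One affine coupling layer conditioned on a style-content label.

    Hypothetical illustration only: a tiny random MLP predicts a
    per-dimension log-scale and shift for half of the input, given
    the other half and the condition vector. The transform is exactly
    invertible, which is what makes normalizing flows generative.
    """

    def __init__(self, dim, cond_dim, hidden=16):
        self.half = dim // 2
        in_dim = self.half + cond_dim
        self.W1 = rng.normal(0, 0.1, (in_dim, hidden))
        self.b1 = np.zeros(hidden)
        self.W2 = rng.normal(0, 0.1, (hidden, 2 * (dim - self.half)))
        self.b2 = np.zeros(2 * (dim - self.half))

    def _net(self, x1, cond):
        # small MLP mapping (untransformed half, label) -> (log-scale, shift)
        h = np.tanh(np.concatenate([x1, cond], axis=-1) @ self.W1 + self.b1)
        log_s, t = np.split(h @ self.W2 + self.b2, 2, axis=-1)
        return np.tanh(log_s), t  # bounded log-scale for numerical stability

    def forward(self, x, cond):
        x1, x2 = x[..., :self.half], x[..., self.half:]
        log_s, t = self._net(x1, cond)
        y2 = x2 * np.exp(log_s) + t
        log_det = log_s.sum(axis=-1)  # exact Jacobian log-determinant
        return np.concatenate([x1, y2], axis=-1), log_det

    def inverse(self, y, cond):
        y1, y2 = y[..., :self.half], y[..., self.half:]
        log_s, t = self._net(y1, cond)
        x2 = (y2 - t) * np.exp(-log_s)
        return np.concatenate([y1, x2], axis=-1)

# Invertibility check: map toy pose vectors to latent codes under a
# style label, then decode them back exactly.
layer = ConditionalAffineCoupling(dim=6, cond_dim=3)
pose = rng.normal(size=(4, 6))             # batch of 4 toy pose vectors
label = np.tile([1.0, 0.0, 0.0], (4, 1))   # one-hot style-content label
z, log_det = layer.forward(pose, label)
recon = layer.inverse(z, label)
print(np.allclose(recon, pose))            # True
```

Because each coupling layer is invertible with a cheap log-determinant, a stack of them can be trained by maximum likelihood, and sampling different latent codes under the same label yields the varied, non-deterministic outputs the abstract describes.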

Recommended citation: Ji, B., Pan, Y., Yan, Y., Chen, R., & Yang, X. (2023). StyleVR: Stylizing character animations with normalizing flows. IEEE Transactions on Visualization and Computer Graphics.
Download Paper