Real-time facial animation for 3D stylized character with emotion dynamics

Published in Proceedings of the 31st ACM International Conference on Multimedia, 2023

Our aim is to improve the efficiency and effectiveness of animation production techniques. We present two real-time solutions that drive character expressions in a geometrically consistent and perceptually valid way. Our first solution combines keyframe animation techniques with machine learning models: we propose a 3D emotion transfer network that takes a 2D human image and generates stylized 3D rig parameters. Our second solution combines blendshape-based motion capture animation techniques with machine learning models: we propose a blendshape adaptation network that generates character rig parameter motions with geometric consistency and temporal stability. We demonstrate the effectiveness of our system by comparing it to a commercial product, Faceware. Results reveal that ratings of the recognition, intensity, and attractiveness of expressions depicted for animated characters via our systems are statistically higher than those for Faceware. Our solutions can be integrated into the animation pipeline, helping animators create expressions more rapidly and precisely.

Recommended citation: Pan, Y., Zhang, R., Wang, J., Ding, Y., & Mitchell, K. (2023, October). Real-time facial animation for 3D stylized character with emotion dynamics. In Proceedings of the 31st ACM International Conference on Multimedia (pp. 6851-6859).
Download Paper