This paper presents a novel style graph model for generating natural, style-controllable human motions. Motion style is regarded as the space-time transformation that projects a single motion content into different behavioral subspaces. For any given human motion, a proposed principal rotation analysis first separates its style from its content. A candidate style set is then pruned according to the extracted style, and a style graph is constructed dynamically from the remaining styles. Finally, the style transformation selected by the user is applied to the original motion content in real time, synthesizing a style-controllable human motion. We evaluate the model by reusing a style set built from walking motions on a variety of motion contents (such as dribbling and airplane/gliding motions) with a single style graph. The synthesized motions preserve the details of the original content while exhibiting the characteristics of the user-specified styles, demonstrating that the style graph is effective for generating style-controllable human motions.
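To make the separate-then-reuse idea concrete, below is a minimal, deliberately simplified sketch: a style is extracted as a transformation between a neutral and a stylized version of the same content, and that transformation is then reapplied to different content. Everything here is an illustrative assumption, not the authors' implementation: the random toy data, the per-dimension affine fit (a crude stand-in for the paper's principal rotation analysis), and the omission of the style-pruning and graph-construction steps.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in data: each motion is a (frames, dims) array of pose
# parameters; real inputs would be joint-rotation trajectories.
frames, dims = 120, 12
neutral_walk = rng.standard_normal((frames, dims))
styled_walk = 1.4 * neutral_walk + 0.3         # same content, new "style"
dribble = rng.standard_normal((frames, dims))  # different motion content


def extract_style(neutral, styled):
    """Fit a per-dimension affine map neutral -> styled by least squares.

    This simple linear space-time transform is only a stand-in for the
    paper's principal-rotation-based style extraction.
    """
    scale = np.empty(neutral.shape[1])
    offset = np.empty(neutral.shape[1])
    for d in range(neutral.shape[1]):
        scale[d], offset[d] = np.polyfit(neutral[:, d], styled[:, d], 1)
    return scale, offset


def apply_style(content, style):
    """Reapply an extracted style transform to new motion content."""
    scale, offset = style
    return content * scale + offset


style = extract_style(neutral_walk, styled_walk)

# The transform fitted on walking is reused on different content,
# mirroring how the style graph reuses walking styles on dribbling.
stylized_dribble = apply_style(dribble, style)
print(stylized_dribble.shape)  # (120, 12)
```

In the paper itself, the candidate styles surviving the pruning step populate the nodes of the style graph, and the user's choice determines which such transformation is applied to the content in real time.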