Wednesday, November 14, 2007

Automated Anticipation?

I came across a paper entitled "Anticipation Effect Generation for Character Animation". Basically, these researchers were looking at automated ways to add anticipation to existing animation (presumably 3D animation). It's an interesting notion, and it seems to tackle the subtleties of animation from the opposite perspective from motion capture: instead of recording nuance from a live performer, this technique aims to make it easier to improve keyframe animation. Most animators would object to this approach on the grounds that anticipation is subjective and not easily derived automatically. The researchers ran into the same limitation: they could not automate the duration of the anticipation and still needed human intervention.

Here's the abstract:
"According to the principles of traditional 2D animation techniques, anticipation makes an animation convincing and expressive. In this paper, we present a method to generate anticipation effects for an existing animation. The proposed method is based on the visual characteristics of anticipation, that is, “Before we go one way, first we go the other way [1].” We first analyze the rotation of each joint and the movement of the center of mass during a given action, where the anticipation effects are added. Reversing the directions of rotation and translation, we can obtain an initially guessed anticipatory pose. By means of a nonlinear optimization technique, we can obtain a consequent anticipatory pose to place the center of mass at a proper location. Finally, we can generate the anticipation effects by compositing the anticipatory pose with a given action, while considering the continuity at junction and preserving the high frequency components of the given action. Experimental results show that the proposed method can produce the anticipatory pose successfully and quickly, and generate convincing and expressive anticipation effects."

The entire paper can be found at:
http://www.springerlink.com/content/h42451j2j38l5216/ and the PDF is at
http://www.springerlink.com/content/h42451j2j38l5216/fulltext.pdf

(You may need to be on the USC campus to access the links.)
