Face Animation in OGRE

[Preface: Ogre's Facial demo uses vertex animation (specifically, pose animation) to animate facial expressions and mouth shapes for speech. This article briefly introduces the concept of vertex animation and, using the Facial demo as an example, discusses how to implement pose animation.]

Vertex animation animates a mesh by manipulating its vertices directly; each set of motions corresponds to its own vertex data instance. Because vertex animation is so closely tied to the mesh's vertices, it is stored in the .mesh file. Vertex animation comes in two subtypes.

1. Morph Animation

Morph animation stores snapshots of absolute vertex positions in each keyframe and interpolates between them. It is useful when skeletal animation cannot represent the motion properly, for example when the structure and shape of the animated part must change fundamentally.

Because morph animation uses absolute position data, it is impossible to blend more than one morph animation on the same vertex data. If you need animation blending, use skeletal animation instead, which is also more efficient. If you enable multiple morph animations that share the same vertex data, only the last one takes effect; in other words, the "weight" option of the animation state has no meaning for morph animation.
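To make the snapshot idea concrete, here is a minimal CPU-side sketch of morph interpolation. This is not Ogre code; the `Vec3`, `Snapshot`, and `morphInterpolate` names are invented for illustration. The key point is that each keyframe stores absolute positions, and intermediate frames are produced by linear interpolation between two snapshots:

```cpp
#include <cstddef>
#include <vector>

// A snapshot stores absolute vertex positions for one keyframe.
struct Vec3 { float x, y, z; };
using Snapshot = std::vector<Vec3>;

// Linearly interpolate between two snapshots at parameter t in [0, 1].
// t = 0 yields snapshot a, t = 1 yields snapshot b.
Snapshot morphInterpolate(const Snapshot& a, const Snapshot& b, float t)
{
    Snapshot out(a.size());
    for (std::size_t i = 0; i < a.size(); ++i) {
        out[i].x = a[i].x + t * (b[i].x - a[i].x);
        out[i].y = a[i].y + t * (b[i].y - a[i].y);
        out[i].z = a[i].z + t * (b[i].z - a[i].z);
    }
    return out;
}
```

Because the positions are absolute, two such animations cannot be meaningfully summed, which is exactly why morph animations cannot be blended.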

Morph animation can be combined with skeletal animation (see Ogre Manual 8.3.3 Combining Skeletal and Vertex Animation). It can also be implemented in hardware with vertex shaders (see Ogre Manual, Morph Animation in Vertex Programs).

2. Pose Animation

Pose animation lets you blend multiple vertex poses, each at a potentially different influence level, into a final vertex state. It is typically used for facial animation: each expression is an independent pose, one expression can be blended into another, and if each pose affects only part of the face, all of them can be combined.

To build a pose animation, you reference a set of poses stored in the mesh. Each pose is represented as offsets from the source vertices, but not every vertex needs an offset: when the data is processed in software, vertices without an offset are simply skipped; when processed in hardware, they are automatically padded with zero offsets.

Once the poses are defined, you can reference them in animations. Each pose animation track targets a single set of geometry (a mesh or one of its submeshes), and each keyframe in the animation can reference one or more poses, each with its own influence value. By defining many keyframes and blending multiple poses in each, you can animate the coordinated motion of several facial regions.
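The blending described above amounts to adding each pose's sparse offsets, scaled by its influence, onto the base vertex positions. Here is a simplified sketch of that idea; it is not Ogre's implementation, and the `Vec3`, `Pose`, and `applyPoses` names are invented for illustration:

```cpp
#include <cstddef>
#include <map>
#include <utility>
#include <vector>

struct Vec3 { float x, y, z; };

// A pose is a sparse set of offsets keyed by vertex index;
// vertices absent from the map contribute nothing.
using Pose = std::map<std::size_t, Vec3>;

// Apply several (pose, influence) references to the base positions:
// final = base + sum(influence_i * offsets_i).
std::vector<Vec3> applyPoses(
    std::vector<Vec3> base,
    const std::vector<std::pair<const Pose*, float>>& refs)
{
    for (const auto& [pose, influence] : refs) {
        for (const auto& [idx, off] : *pose) {
            base[idx].x += influence * off.x;
            base[idx].y += influence * off.y;
            base[idx].z += influence * off.z;
        }
    }
    return base;
}
```

Because the contributions are relative offsets rather than absolute positions, any number of poses can be accumulated, which is what makes pose blending possible where morph blending is not.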

Be careful when applying many poses at once. When pose animation is processed in hardware (see Ogre Manual, Pose Animation in Vertex Programs), each active pose requires an extra vertex buffer to be bound for the shader; when processed in software, each additional active pose increases CPU time. Note also that during the transition between a keyframe referencing two poses and a following keyframe referencing two other poses, four poses are actually active.

Pose animation can be combined with skeletal animation; for details, see Ogre Manual 8.3.3 Combining Skeletal and Vertex Animation. It can also be hardware-accelerated with vertex shaders (see Ogre Manual, Pose Animation in Vertex Programs).

3. Pose Animation XML structure

<mesh>
    <submeshes>
        <submesh material="submesh material" ...>
            <faces count="number of faces">
                <face ... />
            </faces>
            <geometry vertexcount="number of vertices">
                <vertexbuffer ...>                 <!-- vertex buffer content -->
                    <vertex>
                        <position x=".." y=".." z=".." />  <!-- vertex position -->
                        <normal x=".." y=".." z=".." />    <!-- vertex normal -->
                        <texcoord u=".." v=".." />         <!-- texture coordinates -->
                    </vertex>
                </vertexbuffer>
            </geometry>
        </submesh>
    </submeshes>

    <submeshnames>  <!-- provides the index numbers referenced by poses and animations -->
        <submeshname name="submesh name" index="submesh index (starting from 0)" />
    </submeshnames>

    <poses>  <!-- poses are mainly used for manual control -->
        <pose target="submesh or mesh" index="" name="">
            <poseoffset index="vertex index" x=".." y=".." z=".." />  <!-- relative offset -->
        </pose>
    </poses>

    <animations>  <!-- animations defined here can be played automatically -->
        <animation name="animation name" length="animation length">
            <tracks>  <!-- animation tracks -->
                <track target="submesh or mesh" index="pose index" type="pose">
                    <keyframes>
                        <keyframe time="time">
                            <poseref poseindex="pose index" influence="influence value [0-1]" />
                        </keyframe>
                    </keyframes>
                </track>
            </tracks>
        </animation>
    </animations>
</mesh>
4. How to Implement

4.1 Support for Pose Animation by modeling tools

The head model used in the Facial demo is an XSI facial animation model licensed from Softimage. Not all export tools support pose animation; at the time of writing, SoftImage XSI 5.0 Exporter v1.2.3, oFusion Pro for 3ds Max, and the Maya Ogre exporter do.

4.2 search for key vertices

Inspecting the poseoffset elements in the XML reveals many very small offsets generated by the model export tool; in practice they can be ignored entirely. In an experiment, I reduced the original 604 poseoffset entries to 75, with hardly any visible difference from the original. This suggests that the vertices behind those 75 poseoffset entries are the key vertices of the animation.
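One way to do this kind of pruning automatically is to drop every offset whose length falls below a threshold. The sketch below illustrates the idea; the `Vec3`, `Pose`, and `pruneSmallOffsets` names (and the threshold value in the usage note) are made up for illustration, and the right threshold for a given model would have to be found by experiment, as above:

```cpp
#include <cmath>
#include <cstddef>
#include <map>

struct Vec3 { float x, y, z; };

// A pose is a sparse set of offsets keyed by vertex index.
using Pose = std::map<std::size_t, Vec3>;

// Keep only offsets whose length is at least `threshold`;
// exporters often emit many near-zero offsets that contribute
// nothing visible to the final blend.
Pose pruneSmallOffsets(const Pose& pose, float threshold)
{
    Pose out;
    for (const auto& [idx, off] : pose) {
        float len = std::sqrt(off.x * off.x + off.y * off.y + off.z * off.z);
        if (len >= threshold)
            out[idx] = off;
    }
    return out;
}
```

For example, `pruneSmallOffsets(pose, 0.001f)` would discard sub-millimeter offsets (in a model whose units are meters) while keeping the key vertices intact.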

4.3 how to read Pose Animation in a program

The following code, placed in the createScene function, creates the animation:

// Load the mesh containing the pose animation
MeshPtr mesh = MeshManager::getSingleton().load("aaa.mesh",
    ResourceGroupManager::DEFAULT_RESOURCE_GROUP_NAME);

// Create an animation named "smile" with length 0 (it will be driven manually)
Animation* anim = mesh->createAnimation("smile", 0);

// Create a vertex animation track. Handle 1 targets the first submesh's
// geometry [1]; the poses themselves are defined under the <poses> tag.
VertexAnimationTrack* track = anim->createVertexTrack(1, VAT_POSE);

// Create a manual keyframe at time 0
manualKeyFrame = track->createVertexPoseKeyFrame(0);

// Add a reference to pose 0 with an initial influence of 0
manualKeyFrame->addPoseReference(0, 0.0f);

// Create the entity (in the Facial demo, this is Dr. Bunsen's head)
Entity* head = mSceneMgr->createEntity("Head1", "aaa.mesh");

// Get the animation state named "action" from the mesh
// (defined under the <animations> tag)
actionAnimState = head->getAnimationState("action");

// Enable it so it plays automatically
actionAnimState->setEnabled(true);

// Get the manually driven "smile" animation and rewind it to time 0
manualAnimState = head->getAnimationState("smile");
manualAnimState->setTimePosition(0);
The following code, placed in the frameStarted callback that runs at the start of each rendered frame, advances the automatic animation:

// Advance the animation by the number of seconds elapsed since the last frame
actionAnimState->addTime(evt.timeSinceLastFrame);

--------------------------------------------------------------------------------

[1] The Ogre API documentation describes the first parameter of createVertexTrack as: "Handle to give the track, used for accessing the track later. Must be unique within this Animation, and is used to identify the target. For example when applied to a Mesh, the handle must reference the index of the geometry being modified; 0 for the shared geometry, and 1+ for SubMesh geometry with the same index - 1."
