Ogre Reference Manual (12): 8 Animations


8 Animations

Ogre provides a flexible animation system that supports the following types of animation:

8.1 Skeletal Animations

Skeletal animation works by animating a tree-structured hierarchy of bones inside the mesh; the vertices move along with the bones they are bound to. Skeletal animation is also called skinned animation. Skeletons are typically created in modeling tools such as Softimage|XSI, Milkshape3D, Blender, 3D Studio, and Maya; Ogre provides exporters to convert models from these tools into Ogre's own format.

Skeletal animation can be supported at different levels, and not every engine or modeling tool supports the full feature set. Ogre supports the following:

• Each mesh can be linked to a skeleton
• A skeleton can contain an unlimited number of bones
• Forward kinematics over the bone tree
• A single skeleton can hold multiple named animations
• Unlimited keyframes per animation
• Linear or spline keyframe interpolation
• Each vertex can be bound to multiple bones with assigned weights, for smooth skinning
• Multiple animations can be applied to the same mesh simultaneously, blended by weight

Bones and their animations are saved in a .skeleton file, which is generated by the Ogre exporter. When you create an Entity based on a mesh, the associated skeleton is loaded automatically. You can then use AnimationState to apply animations.

Skeletal animation can be performed in software or in a vertex program (hardware skinning). Hardware skinning is clearly preferable: it moves work from the CPU to the GPU, and it means the vertex data does not need to be re-uploaded every frame, which matters especially for large, highly detailed models. You should use hardware skinning whenever possible; this requires assigning a vertex program to the material (see Skeletal Animation in Vertex Programs). Skeletal animation can also be blended with vertex animation; see Combining Skeletal and Vertex Animation.

8.2 Animation State (AnimationState)

When an entity containing any kind of animation is created, an AnimationState object is generated for each of its animations.

You can obtain an AnimationState pointer through Entity::getAnimationState and use it to update the animation (typically in a frameStarted event). An AnimationState must be enabled with setEnabled before it takes effect, and its weight and time position can be set through the corresponding methods. You can also advance the time position incrementally with addTime, which handles looping automatically; addTime accepts negative values to play an animation in reverse.
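As an illustration of how an incremental addTime call with looping behaves, here is a minimal, self-contained sketch. SimpleAnimationState is a hypothetical stand-in, not Ogre's AnimationState class, which carries additional state (weight, enabled flag, and so on):

```cpp
#include <cassert>
#include <cmath>

// Illustrative model of an AnimationState-style time position.
// Not Ogre's implementation; Ogre::AnimationState behaves similarly
// but also tracks weight and the enabled flag.
class SimpleAnimationState {
public:
    SimpleAnimationState(float length, bool loop = true)
        : mLength(length), mLoop(loop), mTime(0.0f) {}

    // Advance (or, with a negative offset, rewind) the time position.
    // When looping is on, the position wraps around the animation length.
    void addTime(float offset) {
        mTime += offset;
        if (mLoop) {
            mTime = std::fmod(mTime, mLength);
            if (mTime < 0.0f) mTime += mLength; // negative offsets wrap too
        } else {
            if (mTime > mLength) mTime = mLength; // clamp at the ends
            if (mTime < 0.0f) mTime = 0.0f;
        }
    }

    float getTimePosition() const { return mTime; }

private:
    float mLength; // animation length in seconds
    bool mLoop;
    float mTime;   // current time position
};
```

A typical usage pattern is to call addTime with the frame's elapsed time each frame, so the animation advances at real-time speed regardless of frame rate.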

8.3 Vertex Animation

Vertex animation moves the vertices of the mesh directly. Each vertex animation track targets a single VertexData instance. Vertex animation is stored in the .mesh file, because it is tightly coupled to the vertex structure of the mesh.

Vertex animations are divided into two types:

• Morph animation: implemented by simple linear interpolation between keyframes, each of which stores a full snapshot of vertex positions. Morph animation is a direct descendant of the old-school vertex animation techniques used before skeletal animation became widespread.

• Pose animation: implemented by blending multiple unrelated poses, each stored as offsets relative to the base vertex data, combined with per-pose influence weights to produce the final result. The most obvious use of pose animation is facial animation.

Why two subtypes?

In fact, anything morph animation can do can also be done with pose animation. Morph animation is provided as a separate subtype because it is easier to define and makes lower demands on hardware shaders.

Morph animation supplies a series of snapshots of vertex data to interpolate between, which makes it simple to author. Because it is just linear interpolation between keyframes, it can be applied quickly across an entire mesh. However, this simple approach does not support blending multiple morph animations. If you need blending, use skeletal animation when animating the whole mesh, or pose animation when animating a part of the mesh that does not suit skeletal animation, such as a face. When performed in a vertex program, a morph animation needs only two vertex buffers holding absolute position data, plus an interpolation factor. Each track of a morph animation targets a unique set of vertex data.
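The interpolation step described above can be sketched in a few lines. This is an illustrative, self-contained version of what the software path does; Vec3 and morphLerp are hypothetical names, and in the hardware path the same arithmetic happens in the vertex program with the two snapshots bound as vertex buffers and t passed as a shader parameter:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

// Software morph step: linearly interpolate every vertex between two
// absolute position snapshots (the keyframes surrounding the current
// time). Illustrative sketch only, not Ogre's internal code.
std::vector<Vec3> morphLerp(const std::vector<Vec3>& a,
                            const std::vector<Vec3>& b, float t) {
    std::vector<Vec3> out(a.size());
    for (size_t i = 0; i < a.size(); ++i) {
        out[i].x = a[i].x + (b[i].x - a[i].x) * t;
        out[i].y = a[i].y + (b[i].y - a[i].y) * t;
        out[i].z = a[i].z + (b[i].z - a[i].z) * t;
    }
    return out;
}
```

Because each output position depends only on the two snapshots and t, there is no way to meaningfully combine a second morph animation's absolute positions, which is why blending is unsupported.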

Pose animation is more complex. Like morph animation, each track targets a unique set of vertex data, but each keyframe can reference one or more poses, each with its own influence factor. Poses are offsets relative to the base vertex data and can be sparse (they need not reference every vertex). Because they are offsets, poses can be blended, both within the same track and across multiple animations. These properties make pose animation especially well suited to facial animation.
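The offset-based blending works out to a weighted sum over sparse offsets. The sketch below is an illustrative model under assumed names (Pose, applyPoses); it is not Ogre's implementation, but it shows why offsets compose where absolute snapshots cannot:

```cpp
#include <cassert>
#include <cmath>
#include <map>
#include <utility>
#include <vector>

struct Vec3 { float x, y, z; };

// A sparse pose: offsets stored only for the vertices it touches.
using Pose = std::map<size_t, Vec3>; // vertex index -> offset

// Blend several poses onto the base vertex data. Vertices a pose does
// not reference keep their base position; influences simply scale the
// offsets, so any number of poses can be accumulated.
std::vector<Vec3> applyPoses(
    std::vector<Vec3> base,
    const std::vector<std::pair<const Pose*, float>>& active) {
    for (const auto& entry : active) {
        float influence = entry.second;
        for (const auto& kv : *entry.first) {
            Vec3& v = base[kv.first];
            v.x += kv.second.x * influence;
            v.y += kv.second.y * influence;
            v.z += kv.second.z * influence;
        }
    }
    return base;
}
```

Since the result is base data plus a sum of scaled offsets, blending two half-weight poses and one full-weight pose is just three accumulation passes, which is what makes the technique a good fit for facial expressions.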

For example, suppose you create a facial model and define a set of poses representing the mouth shapes of various speech sounds. You could then define an animation called "SayHello" containing a single track targeting the face's vertex data, with a series of keyframes, each referencing one or more of those poses with varying influence factors; combined over time, they form the word "Hello". Because each pose is stored only once yet can be referenced from any number of animations, this is a powerful way to build a speech system.

The drawback of pose animation is that it is harder to set up: you must define the poses and then reference them from keyframes. Also, because more buffers are involved (one for the base data plus one per active pose), you need to consider how many poses can be active simultaneously when processing the animation in a hardware vertex shader. The vertex program definition can declare the maximum number it supports via the material script keyword includes_pose_animation; see Pose Animation in Vertex Programs.

Splitting vertex animation into these two subtypes keeps the simple morph technique easy to use while still allowing the more powerful pose technique. Note that morph animation cannot be blended with other kinds of vertex animation, while pose animation can be blended with other pose animation. Also, anything morph animation can do, pose animation can do (at greater complexity), but not the other way around.

Subtypes apply per track

Note that the subtype applies at the track level, not at the animation or mesh level. Because each track maps to a single VertexData instance, if a mesh is split into multiple SubMeshes that do not share geometry, you can apply pose animation to one sub-mesh and morph animation (or no vertex animation) to another.

For example, a common setup for a complex character is to split the head into its own sub-mesh and animate it with pose animation, while the rest of the body uses skeletal animation.

Vertex buffer arrangement

When vertex animation is processed in software, the vertex positions should be placed in their own hardware buffer. This avoids re-uploading the other vertex data during updates and keeps GPU bandwidth from saturating. Ogre does this automatically when you create a .mesh file with the Ogre exporters; if you create the buffers yourself, you need to arrange the layout accordingly.

Ogre::Mesh provides functions to help with this: Ogre::VertexDeclaration::getAutoOrganisedDeclaration() converts a vertex declaration to the recommended layout, and Ogre::VertexData::reorganiseBuffers() rearranges the buffer contents to match.

8.3.1 Morph Animation

Morph animation stores absolute vertex positions at each keyframe and interpolates between them. Use it when skeletal animation is not appropriate, typically because the structure and shape of the object changes so completely over the animation that bones cannot express it.

Because absolute positions are used, it is not possible to blend multiple morph animations on the same vertex data. If you need blending, the more efficient skeletal animation is usually the better choice. If you do enable multiple morph animations on the same vertex data, only the last one applied takes effect. This also means the weight option on AnimationState is not used for morph animation.

Morph animation can be blended with skeletal animation; see Combining Skeletal and Vertex Animation. Morph animation can also be performed in a vertex shader; see Morph Animation in Vertex Programs.

8.3.2 Pose Animation

Pose animation allows multiple poses, each with its own influence level, to be blended into the final vertex state. It is typically used for facial animation, where each expression is authored as a separate pose; poses can be cross-faded by weight, or combined when they affect different parts of the face.

To achieve this, pose animation defines each pose as a set of offsets relative to the base vertex data. A pose does not need to specify an offset for every vertex. When blending in software, unreferenced vertices are skipped entirely; when blending in hardware, the automatically generated buffers contain an entry for every vertex, with zero offsets for the unreferenced ones.

Once defined, poses can be referenced from animations. Each animation track targets a single set of geometry (either the shared geometry or the geometry of one sub-mesh), and each keyframe in the track references one or more poses, each with its own influence factor. The weight of the animation as a whole scales these influence factors further. When interpolating between keyframes, a pose referenced in one keyframe but not in the adjacent one is treated as having an influence of 0 in the keyframe that omits it.
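That "missing reference counts as 0" rule is what makes poses fade in and out smoothly across keyframes. A minimal sketch of the per-pose influence interpolation, under the assumed names PoseInfluences and lerpInfluences (illustrative only, not Ogre's code):

```cpp
#include <cassert>
#include <cmath>
#include <map>
#include <set>

// Per-keyframe pose references: pose index -> influence factor.
using PoseInfluences = std::map<int, float>;

// Interpolate influences between two pose keyframes. A pose referenced
// in only one keyframe gets influence 0 in the other, so it fades
// rather than popping in or out.
PoseInfluences lerpInfluences(const PoseInfluences& k0,
                              const PoseInfluences& k1, float t) {
    std::set<int> poses; // union of poses referenced by either keyframe
    for (const auto& kv : k0) poses.insert(kv.first);
    for (const auto& kv : k1) poses.insert(kv.first);

    PoseInfluences out;
    for (int p : poses) {
        float w0 = k0.count(p) ? k0.at(p) : 0.0f; // missing -> 0
        float w1 = k1.count(p) ? k1.at(p) : 0.0f;
        out[p] = w0 + (w1 - w0) * t;
    }
    return out;
}
```

Note that partway between a keyframe using poses {A, B} and one using poses {C, D}, all four poses carry non-zero influence, which is exactly the buffer-count concern raised below.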

You should keep an eye on the number of poses enabled at the same time. When pose animation is processed in hardware, every active pose requires its own vertex buffer, and if one keyframe uses two poses while the next keyframe uses a different two, the interpolation actually involves four active poses.

Pose animation can be blended with skeletal animation (see Combining Skeletal and Vertex Animation) and can be hardware-accelerated (see Pose Animation in Vertex Programs).

8.3.3 Combining Skeletal and Vertex Animation

An entity can use skeletal and vertex animation at the same time. In that case the vertex animation is applied first, followed by the skeletal animation. This allows, for example, pose-based vertex animation on a character's face while skeletal animation drives the body's main motion.

From the user's point of view, combining them is as simple as enabling both animations at once. Useful as this is, there are a few issues to be aware of:

Combined hardware skinning

For complex characters it is a very good idea to perform hardware skinning in a vertex program; see Animation in Vertex Programs.

Your vertex program must apply the animation in the same order Ogre does: vertex animation first, then skeletal animation. Remember that morph animation requires two buffers of absolute position snapshots plus an interpolation parameter for the start/end keyframes, while pose animation requires the base vertex data, n pose offset buffers, and n weight parameters, where n is the number of active poses.

Sub-mesh splitting

If you are combining vertex and skeletal animation on only a small part of the mesh, such as the face, you should split the mesh in two: one part that is combined (the head) and one that is not. This reduces the computational load and the vertex buffer usage, because the vertex keyframe and pose buffers are smaller, and it lets you write two simpler vertex programs: one for skeletal animation only, the other for combined skeletal and vertex animation.

8.4 Scene Node Animation

Scene node animation is created from the SceneManager and moves scene nodes, and with them any attached objects, automatically. You can see it in action in the CameraTrack sample (the camera) and the Fresnel sample (the fish).

In code, scene node animation works much like skeletal animation. After creating an Animation with SceneManager::createAnimation, you create a NodeAnimationTrack for each node to be animated and add keyframes controlling position, orientation, and scale, with linear or spline interpolation. You then use AnimationState just as with skeletal animation, except that it is obtained from the SceneManager rather than from an Entity. Animations are applied automatically every frame, or you can apply them manually via SceneManager::_applySceneAnimations().
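The keyframe lookup and interpolation a node track performs can be sketched as follows. This is a self-contained illustration under assumed names (PosKeyFrame, samplePosition); Ogre's NodeAnimationTrack additionally interpolates orientation and scale and supports spline interpolation, while this sketch shows only linear position sampling:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };
struct PosKeyFrame { float time; Vec3 pos; };

// Sample a track of position keyframes (sorted by time) at time t:
// find the surrounding pair of keyframes and linearly interpolate.
// Times outside the keyframe range clamp to the end positions.
Vec3 samplePosition(const std::vector<PosKeyFrame>& keys, float t) {
    if (t <= keys.front().time) return keys.front().pos;
    if (t >= keys.back().time) return keys.back().pos;
    for (size_t i = 1; i < keys.size(); ++i) {
        if (t <= keys[i].time) {
            const PosKeyFrame& a = keys[i - 1];
            const PosKeyFrame& b = keys[i];
            float f = (t - a.time) / (b.time - a.time);
            return { a.pos.x + (b.pos.x - a.pos.x) * f,
                     a.pos.y + (b.pos.y - a.pos.y) * f,
                     a.pos.z + (b.pos.z - a.pos.z) * f };
        }
    }
    return keys.back().pos;
}
```

Driving this sampler from an animation state's time position each frame yields the camera-track style motion the samples demonstrate.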

8.5 Numeric Value Animation

In addition to the common animation types described above, you can animate any value exposed through the AnimableObject interface. (This kind of animation does not deform geometry; it simply changes a value, typically an object property.)

AnimableObject

AnimableObject is an abstract interface for maintaining and accessing AnimableValues. An AnimableObject keeps a dictionary of all its animable properties, whose names can be retrieved through getAnimableValueNames, and it creates AnimableValue objects via createAnimableValue. The AnimableValue connects the animation system to the concrete object property: a NumericAnimationTrack drives it and ultimately calls AnimableValue::applyDeltaValue.

An example is the Ogre::Light class, which exposes the AnimableObject interface (through MovableObject, which implements AnimableObject) and provides AnimableValue objects for properties such as diffuseColour and attenuation. You can create a NumericAnimationTrack and modify the light's properties through these value objects, and custom objects can support animable properties in the same way.
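The underlying pattern is an adapter that lets an animation track push deltas into an arbitrary property. The sketch below is a simplified, hypothetical model of that idea (FloatAnimableValue is not an Ogre class; only the applyDeltaValue name mirrors Ogre's interface):

```cpp
#include <cassert>
#include <cmath>

// Simplified model of the AnimableValue idea: wrap a pointer to some
// object property so a numeric animation track can modify it without
// knowing anything about the owning object.
class FloatAnimableValue {
public:
    explicit FloatAnimableValue(float* target) : mTarget(target) {}

    // Called by the animation track each frame with the change since
    // the last application (compare AnimableValue::applyDeltaValue).
    void applyDeltaValue(float delta) { *mTarget += delta; }

    float getValue() const { return *mTarget; }

private:
    float* mTarget; // the animated property, owned elsewhere
};
```

In Ogre the wrapped value can be of several types (Real, ColourValue, Vector3, and so on), which is why the real interface offers typed variants rather than a single float signature.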

AnimableValue

To implement a custom animable property, you implement the AnimableValue interface. Its methods are not pure virtual, because you only need to override the ones matching the type of value being animated. The Ogre::Light implementation is a good reference.

