Chapter 8: Rendering Objects, in Pro Ogre 3D Programming


Ogre render targets
A render target in Ogre is an abstraction of a region of AGP or video memory that stores the 2D rendering result of all or part of a scene. The most common render target is the main rendering window, the application's main window; using it requires no extra effort, since Ogre creates it for you. You can also render all or part of a scene (even a part that is not otherwise visible) to a texture, which other polygons in the scene can then use. In short, the target hardware buffer is abstracted as a render target in Ogre; physically it can be a primary rendering window, a secondary window, or a non-visual texture.
A render target is also an event source. If the program registers for these events, Ogre notifies it before and after rendering, giving it the opportunity to change settings and respond. The same notification mechanism is available before and after rendering at the viewport level. Ogre can also compute rendering statistics for a target, such as how long a frame took to render and how many triangles were rendered.
Rendering window
Windows are created and maintained by the rendering system (a standard Win32 window under Direct3D 9). Ogre lets you configure a number of rendering-window properties, including the size and title-bar text. If you want to build an application with menus and toolbars, you can have Ogre render into the client area of your own window: Ogre exposes the system-specific window handle of its rendering window, and also lets you supply the handle of a parent window. The first Ogre rendering window created is called the primary window; windows created afterwards are secondary windows and are less important than the primary one. If the program has three windows, the two secondary windows must be destroyed before the primary window for resources to be cleaned up properly.
Viewports
A rendering window contains one or more viewports. A viewport is a rectangular area in which the scene manager draws the scene content visible from a camera. A camera is referenced when a viewport is created, but this is not a static attribute of the viewport; the camera that renders it can be changed at any time. Each viewport has a z-order, and a viewport with a higher z-order is drawn on top of one with a lower z-order. Z-order 0 is always owned by a viewport that covers the entire render target. By default, Ogre clears a viewport's colour and depth buffers every frame, but you can disable these clears. A viewport is the sole point of interaction between a render target (that is, a specific rendering system, such as OpenGL) and a camera (that is, the scene manager and its content). A viewport does not need to cover the whole surface of the render target. Note also that a 3D rendering camera is not a real camera: geometry level-of-detail is computed from the camera's position, so you cannot simply narrow the camera's field of view to get the same result as moving it "closer".
Rendering to texture
The procedure is as follows: create a texture render target, configure its rendering properties, add it to the list of render targets, and assign the texture to the material that will use it. Texture targets are updated before other render-target types, which ensures that objects using rendered textures are themselves rendered correctly. A rendered texture can then be used like any other texture. For rendered textures that do not need updating every frame, you can disable automatic per-frame updates and update them manually. At the lowest level, a texture target is a hardware buffer; for performance, these buffers are treated as write-only and static. Rendering to a texture renders the geometry in the scene, which takes time to execute and therefore reduces the application's frame rate. However, many useful techniques, such as real-time shadows and reflections, cannot be achieved without rendering to textures. Ogre supports rendering to multiple textures; the only restriction is that they must all be the same size.
Rendering object class
RenderTarget is the base class of RenderWindow, MultiRenderTarget, and RenderTexture.
RenderWindow is implemented by subclasses such as D3D9RenderWindow and GLXWindow. On Windows, where either Direct3D 9 or OpenGL can be used, the implementations are D3D9RenderWindow and Win32Window respectively.
A window is created with RenderSystem::createRenderWindow():
virtual RenderWindow* Ogre::RenderSystem::createRenderWindow(
    const String& name,
    unsigned int width,
    unsigned int height,
    bool fullScreen,
    const NameValuePairList* miscParams = 0);
The last parameter allows you to set window attributes.
Code for embedding a rendering window into an external window:
NameValuePairList params; // just a typedef for std::map<std::string, std::string>
// Set the external window handle -- this assumes you have already
// created a window to embed the Ogre render window in, and that its
// handle is stored in an integer variable called "mParent"
params["externalWindowHandle"] = StringConverter::toString(mParent);
// The window can be resized later to fit the parent client area if needed
RenderWindow* window = createRenderWindow("MyWindow", 800, 600, false, &params);
Rendering-to-texture demo
The demo creates a tilted plane at the origin, sets up the camera, and builds the scene (an ogre head and several rings). The reflection of the scene in the plane is rendered to a texture, which is then blended with the static texture already applied to the plane to achieve the reflection effect.
Create the render texture:
TexturePtr texture = TextureManager::getSingleton().createManual("RttTex",
    ResourceGroupManager::DEFAULT_RESOURCE_GROUP_NAME, TEX_TYPE_2D,
    512, 512, 0, PF_R8G8B8, TU_RENDERTARGET);
Create a camera and a viewport for rendering to a texture:
// Retrieve the render target behind the texture
RenderTexture* rttTex = texture->getBuffer()->getRenderTarget();
mReflectCam = mSceneMgr->createCamera("ReflectCam");
mReflectCam->setNearClipDistance(mCamera->getNearClipDistance());
mReflectCam->setFarClipDistance(mCamera->getFarClipDistance());
mReflectCam->setAspectRatio(
    (Real)mWindow->getViewport(0)->getActualWidth() /
    (Real)mWindow->getViewport(0)->getActualHeight());
Viewport* v = rttTex->addViewport(mReflectCam);
v->setClearEveryFrame(true);
v->setBackgroundColour(ColourValue::Black);
Create a material that uses the rendered texture:
MaterialPtr mat = MaterialManager::getSingleton().create("RttMat",
    ResourceGroupManager::DEFAULT_RESOURCE_GROUP_NAME);
TextureUnitState* t = mat->getTechnique(0)->getPass(0)->createTextureUnitState("RustedMetal.jpg");
t = mat->getTechnique(0)->getPass(0)->createTextureUnitState("RttTex");
// Blend with the base texture
t->setColourOperationEx(LBX_BLEND_MANUAL, LBS_TEXTURE, LBS_CURRENT, ColourValue::White,
    ColourValue::White, 0.25);
t->setTextureAddressingMode(TextureUnitState::TAM_CLAMP);
t->setProjectiveTexturing(true, mReflectCam);
rttTex->addListener(this);
The material is named RttMat and contains one technique with one pass. The pass has two texture units: one holds the static texture (RustedMetal.jpg) and the other holds the rendered texture. The two textures are blended proportionally, a suitable texture addressing mode is set, projective texturing is enabled, and a listener is registered on the rendered texture (which is a render target).
Configure the camera for reflection:
// Set up linked reflection
mReflectCam->enableReflection(mPlane);
// Also clip below the plane
mReflectCam->enableCustomNearClipPlane(mPlane);
// Give the plane a texture
mPlaneEnt->setMaterialName("RttMat");
With the camera configured this way, it renders the scene's reflection in the specified plane. The custom near clip plane ensures that objects below the reflecting surface are not rendered. When rendering to the texture, you also do not want to render the plane itself; the listener registered above takes care of this with a small trick:
void preRenderTargetUpdate(const RenderTargetEvent& evt)
{
    // Hide the plane before rendering to the texture
    mPlaneEnt->setVisible(false);
}
void postRenderTargetUpdate(const RenderTargetEvent& evt)
{
    // Show the plane again afterwards
    mPlaneEnt->setVisible(true);
}
One last note: the position and orientation of the two cameras must be kept in sync every frame, for example in frameStarted():
mReflectCam->setOrientation(mCamera->getOrientation());
mReflectCam->setPosition(mCamera->getPosition());
