When reprinting, please credit the KlayGE game engine. The permanent link of this article is http://www.klayge.org/?p=2137
KlayGE 4.1 supports screen space reflection (SSR), contributed by Wang Qingyuan. However, because it works purely in screen space, reflections are limited to the view direction. In the upcoming KlayGE 4.2, this feature has been extended to multiple directions, becoming a real-time, all-direction, non-planar reflection. (Although this was implemented by the end of July, I only now found time to put the demo together...)
Multi-direction reflection requires rendering the scene several times in different directions, each time with full lighting and shading. The deferred rendering layer therefore introduces a multi-viewport feature: simply define several viewports, and you get rendering results for the different viewpoints at the same time, including the current frame's RGB/depth and the previous frame's RGB/depth. Besides reflection, this can also be used for split-screen display, multi-view display, thumbnails, and other common game scenarios. It is equivalent to running the deferred rendering pipeline several more times; fortunately, the resolution can usually be reduced, so the final overhead is not too large.
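As a rough illustration of the bookkeeping involved (this is an assumed sketch, not the actual KlayGE API — the names `Viewport`, `FrameData`, and `commit_frame` are hypothetical), each viewport can own its own targets, with the pipeline rotating them so both the current and the previous frame stay available:

```cpp
#include <cstdint>
#include <utility>
#include <vector>

// Hypothetical sketch, not KlayGE source: per-viewport outputs kept by a
// multi-viewport deferred pipeline.
struct FrameData
{
    std::vector<uint32_t> color; // packed RGB
    std::vector<float> depth;
};

struct Viewport
{
    uint32_t width;
    uint32_t height;
    FrameData curr; // this frame's RGB/depth
    FrameData prev; // last frame's RGB/depth
};

// After the pipeline finishes rendering a viewport, rotate the targets so the
// just-finished frame becomes "current" and the old current becomes "previous".
inline void commit_frame(Viewport& vp, FrameData rendered)
{
    vp.prev = std::move(vp.curr);
    vp.curr = std::move(rendered);
}
```

Each extra viewport simply repeats this per-frame rotation at its own (often reduced) resolution, which is why the added overhead stays manageable.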
In theory, just like the traditional method for real-time multi-direction reflection, we could render a cubemap from the center of the object (six renders) and use the reflection vector to index the reflected color. But with SSR, no extra rendering is needed for the view direction itself. At the same time, since the opposite of the view direction contributes the most to the final effect, we only need one extra render from the center of the object, looking back along the view direction (called the back side camera, as opposed to the front side camera). Combined with the SSR tracing method, this achieves quite good results. With multi-viewport support, the whole process becomes simple and flexible. For pixels that the trace fails to resolve, the color of the environment cubemap is taken directly. The result looks like this:
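The trace-then-fallback logic can be sketched in a toy 1-D form (an assumed illustration of the idea, not engine code — `trace_ssr_1d` and `reflect_color_1d` are hypothetical names): march the reflected ray across a depth buffer; when the ray's depth passes behind the stored depth we have a hit and reuse that pixel's color, otherwise we sample the environment cubemap.

```cpp
#include <cstddef>
#include <optional>
#include <vector>

// Toy 1-D screen-space ray march: depth_buf/color_buf stand in for the
// rendered viewport's depth and color targets.
std::optional<int> trace_ssr_1d(const std::vector<float>& depth_buf,
                                const std::vector<int>& color_buf,
                                std::size_t start, float ray_depth, float step)
{
    float d = ray_depth;
    for (std::size_t x = start; x < depth_buf.size(); ++x)
    {
        if (d >= depth_buf[x])   // ray has passed behind the geometry: hit
            return color_buf[x];
        d += step;               // advance the ray's depth per pixel
    }
    return std::nullopt;         // left the buffer without a hit
}

int reflect_color_1d(const std::vector<float>& depth_buf,
                     const std::vector<int>& color_buf,
                     std::size_t start, float ray_depth, float step,
                     int env_cubemap_color)
{
    auto hit = trace_ssr_1d(depth_buf, color_buf, start, ray_depth, step);
    return hit ? *hit : env_cubemap_color; // miss: fall back to the cubemap
}
```

The real tracer of course works in 2-D screen space with perspective-correct depth, but the hit/miss split with a cubemap fallback is the same.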
As for choosing the back side camera: for roughly spherical objects, place it at the center of the object, looking along the opposite of the view direction. For objects with a dominant plane, mirror the viewpoint and view direction across that plane to generate a new camera. In practice, this method is enough to handle a large number of outdoor scenes.
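For the planar case, the mirrored camera follows from standard reflection math (a sketch of the idea, not KlayGE code): with a plane dot(n, p) + d = 0 and unit normal n, reflect both the eye position and the view direction across it.

```cpp
#include <array>

using float3 = std::array<float, 3>;

inline float dot3(const float3& a, const float3& b)
{
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

// Reflect a point across the plane dot(n, p) + d = 0 (n must be unit length):
// p' = p - 2 * (dot(n, p) + d) * n
inline float3 reflect_point(const float3& p, const float3& n, float d)
{
    float const k = 2 * (dot3(n, p) + d);
    return {p[0] - k * n[0], p[1] - k * n[1], p[2] - k * n[2]};
}

// Reflect a direction across the plane's normal: v' = v - 2 * dot(n, v) * n
inline float3 reflect_dir(const float3& v, const float3& n)
{
    float const k = 2 * dot3(n, v);
    return {v[0] - k * n[0], v[1] - k * n[1], v[2] - k * n[2]};
}
```

Applying `reflect_point` to the eye and `reflect_dir` to the look direction yields the mirror camera from which the reflected scene is rendered.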
Because the pixels come from SSR tracing rather than a simple cubemap lookup, the reflection of the dinosaur on the teapot is close to the real thing. Also note that the spout and lid produce self-reflections on the teapot body. With such a framework, reflections on most shapes can be handled well.