In OSG, write the following simple piece of code:

#include <osgDB/ReadFile>
#include <osgViewer/Viewer>

osg::ref_ptr<osgViewer::Viewer> viewer = new osgViewer::Viewer();
viewer->setSceneData(osgDB::readNodeFile("glider.osg"));
viewer->run();
Run it, and you can see the glider in the scene and manipulate it with mouse gestures. Doesn't that seem strange? When you draw a scene with raw OpenGL code in a render function, the result is static, and event-handling code is required before the scene can change at all.
In fact, behind this simple code OSG has done a great deal of work for us, including adding a manipulator with which to operate the scene. Let's take a look at what these manipulators are about:
Basic theory
First of all, recall the model-view matrix in OpenGL: it transforms vertex data from the local (model) coordinate system into the camera coordinate system. In OpenGL the model-view matrix is a single entity, or can be understood as one whole. OSG, however, separates the two: the model matrix is expressed through the tree structure of the scene (via the MatrixTransform nodes in the scene graph), the view matrix is kept in the manipulator, and the model-view matrix is built from both. Regarding the view matrix, there is an important conclusion: the position-and-attitude matrix of the camera in the world coordinate system equals the inverse of the camera's observation matrix (view matrix). The position-and-attitude matrix of the camera can be understood as the transformation that takes vertices from the camera coordinate system to the world coordinate system, while the observation matrix (view matrix) takes vertices from world coordinates to camera coordinates. For the derivation of this result, you can refer to: Deduce the camera transformation matrix
A few other questions to understand are: 1) What exactly is a manipulator in OSG? 2) When is the manipulator added? 3) How does the manipulator work? 4) How should we write our own manipulators? Let's analyze each of these in turn.
What OSG manipulators are. Manipulators are implemented in the osgGA library, which is designed primarily to handle user interaction with three-dimensional scenes (mouse, keyboard, gestures, joystick, and so on). It provides a large number of ready-made manipulators; the trackball manipulator (osgGA::TrackballManipulator), the one added by default in the code at the beginning of this article, inherits from osgGA::CameraManipulator.
osgGA::CameraManipulator in turn inherits from osgGA::GUIEventHandler, and we know that GUIEventHandler is the class used to handle events. From this we can see that a manipulator is really a class that interactively modifies the position and attitude of a node in the scene, except that the node it modifies is the topmost camera, which affects the entire scene. The OSG viewer (view) manages a list of GUIEventHandlers, to which handlers are normally added with addEventHandler(). It makes no sense, however, to add a manipulator this way, because the getMatrix() and getInverseMatrix() functions that express the camera's viewing orientation would then never be called by the viewer. The correct way to add a manipulator is:
viewer.setCameraManipulator(new osgGA::TrackballManipulator);
When is the manipulator added? In the sample code at the beginning of the article we never called setCameraManipulator(), so how was the manipulator added? The answer lies in the viewer->run() line of code, as a look at its implementation shows:
int Viewer::run()
{
    if (!getCameraManipulator() && getCamera()->getAllowEventFocus())
    {
        setCameraManipulator(new osgGA::TrackballManipulator());
    }
    setReleaseContextAtEndOfFrameHint(false);
    return ViewerBase::run();
}
If we have not set a manipulator before calling run(), osgViewer::Viewer sets up a trackball manipulator for us by default.
In addition, look at the implementation of the setCameraManipulator() function:
void View::setCameraManipulator(osgGA::CameraManipulator* manipulator, bool resetPosition)
{
    _cameraManipulator = manipulator;
    if (_cameraManipulator.valid())
    {
        _cameraManipulator->setCoordinateFrameCallback(new ViewerCoordinateFrameCallback(this));
        if (getSceneData()) _cameraManipulator->setNode(getSceneData());
        if (resetPosition)
        {
            osg::ref_ptr<osgGA::GUIEventAdapter> dummyEvent = _eventQueue->createEvent();
            _cameraManipulator->home(*dummyEvent, *this);
        }
    }
}
You can see that this function also sets the home position. In other words, if our own manipulator needs an initial position, we can override the virtual function home() to achieve that.
How the manipulator works. Once the manipulator is added, it must handle input events and update the scene accordingly. To understand this we need to step into the code OSG runs for each frame: every frame goes through three stages, namely the event traversal, the update traversal, and the rendering traversals. The details can be seen in the frame() function that is executed once per frame:
void ViewerBase::frame(double simulationTime)
{
    if (_done) return;

    // OSG_NOTICE<<std::endl<<"CompositeViewer::frame()"<<std::endl<<std::endl;

    if (_firstFrame)
    {
        viewerInit();
        if (!isRealized())
        {
            realize();
        }
        _firstFrame = false;
    }
    advance(simulationTime);

    eventTraversal();
    updateTraversal();
    renderingTraversals();
}
Clearly, the manipulator's event handling should live in eventTraversal() and its update code in updateTraversal(), and this is indeed the case. After eventTraversal() has dispatched the events to the registered event handlers and to the event callbacks of all nodes in the scene, it calls the manipulator's event handler at the end of the function:
for (osgGA::EventQueue::Events::iterator itr = events.begin();
     itr != events.end();
     ++itr)
{
    osgGA::Event* event = itr->get();
    if (event && _cameraManipulator.valid())
    {
        _cameraManipulator->handle(event, 0, _eventVisitor.get());
    }
}
Similarly, after the update callbacks of the nodes in the scene have been traversed, the manipulator's update is processed at the end of updateTraversal():
if (_cameraManipulator.valid())
{
    setFusionDistance(getCameraManipulator()->getFusionDistanceMode(),
                      getCameraManipulator()->getFusionDistanceValue());
    _cameraManipulator->updateCamera(*_camera);
}
The updateCamera() function called here has the following default implementation:
virtual void updateCamera(osg::Camera& camera) { camera.setViewMatrix(getInverseMatrix()); }
It directly sets the camera's observation (view) matrix, obtained from the manipulator's getInverseMatrix(). That is the key function when we write our own manipulator: it is a pure virtual function in CameraManipulator, the common base class of all manipulators, so we must implement it, and this is the place where it is called. There are several ways to implement your own manipulator: you can start directly from the base class osgGA::CameraManipulator, or inherit from one of its subclasses such as StandardManipulator or DriveManipulator. A simple example follows; it zooms in and out along the mouse position when the user turns the mouse wheel. The code is as follows:
#include <osgGA/CameraManipulator>

// A custom manipulator
class ZoomManipulator : public osgGA::CameraManipulator
{
public:
    // The constructor takes the scene node so the bounding sphere can be computed
    ZoomManipulator(osg::Node* node);
    ~ZoomManipulator();

    // The four pure virtual functions every manipulator must implement
    virtual void setByMatrix(const osg::Matrixd& matrix) {}
    virtual void setByInverseMatrix(const osg::Matrixd& matrix) {}
    virtual osg::Matrixd getMatrix() const { return osg::Matrixd(); }
    virtual osg::Matrixd getInverseMatrix() const;

    // Return the node so CameraManipulator::computeHomePosition() can use it
    virtual const osg::Node* getNode() const { return _root; }
    virtual osg::Node* getNode() { return _root; }

    virtual bool handle(const osgGA::GUIEventAdapter& ea, osgGA::GUIActionAdapter& us);

    osg::Vec3 _eye;       // eye position
    osg::Vec3 _direction; // viewing direction
    osg::Vec3 _up;        // up vector
    osg::Node* _root;
};
In the implementation, the world coordinates of the point under the mouse are computed, and the eye is moved along the line from the eye to that world-space point:
#include <osgViewer/Viewer>

ZoomManipulator::ZoomManipulator(osg::Node* node)
{
    _root = node;
    computeHomePosition();
    _eye = _homeEye;
    _direction = _homeCenter - _homeEye;
    _up = _homeUp;
}

ZoomManipulator::~ZoomManipulator() {}

osg::Matrixd ZoomManipulator::getInverseMatrix() const
{
    osg::Matrix mat;
    mat.makeLookAt(_eye, _eye + _direction, _up);
    return mat;
}

bool ZoomManipulator::handle(const osgGA::GUIEventAdapter& ea, osgGA::GUIActionAdapter& us)
{
    if (ea.getEventType() != osgGA::GUIEventAdapter::SCROLL) return false;

    osgViewer::Viewer* viewer = dynamic_cast<osgViewer::Viewer*>(&us);
    osg::Camera* camera = viewer->getCamera();
    // world -> window matrix; invert it to map the mouse position back to world space
    osg::Matrix MVPW = camera->getViewMatrix() * camera->getProjectionMatrix() *
                       camera->getViewport()->computeWindowMatrix();
    osg::Matrix inverseMVPW = osg::Matrix::inverse(MVPW);
    osg::Vec3 mouseWorld = osg::Vec3(ea.getX(), ea.getY(), 0.0f) * inverseMVPW;
    osg::Vec3 direction = mouseWorld - _eye;
    direction.normalize();
    if (ea.getScrollingMotion() == osgGA::GUIEventAdapter::SCROLL_UP)
        _eye += direction * 20.0;
    else if (ea.getScrollingMotion() == osgGA::GUIEventAdapter::SCROLL_DOWN)
        _eye -= direction * 20.0;
    return true;
}