We know that in Visual C++, interaction is implemented through a message map. X3D implements interaction through a similar mapping mechanism: user events are generated by the corresponding sensor node, and a ROUTE statement acts as the event map, forwarding the sensor's output event to another node's input event.
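As a minimal sketch of this mechanism, assuming the XML encoding of X3D (the DEF names TOUCH and CLOCK are hypothetical), a sensor generates the event and a ROUTE statement maps it to another node's input:

<Transform>
  <Shape>
    <Appearance>
      <Material diffuseColor='1 0 0'/>
    </Appearance>
    <Box size='2 2 2'/>
  </Shape>
  <!-- the sensor monitors its sibling geometry (the box above) -->
  <TouchSensor DEF='TOUCH'/>
</Transform>
<TimeSensor DEF='CLOCK' cycleInterval='2'/>
<!-- the ROUTE maps the sensor's output event to another node's input event -->
<ROUTE fromNode='TOUCH' fromField='touchTime' toNode='CLOCK' toField='startTime'/>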
In X3D, the user-event sensors include KeySensor (keyboard sensor), StringSensor (string sensor), TouchSensor (touch sensor), PlaneSensor (translation sensor), SphereSensor (free-rotation sensor), CylinderSensor (Y-axis rotation sensor), and so on. Because the BS browser plug-in's support for the newer KeySensor and StringSensor nodes is not good enough, they are not discussed here.
1. TouchSensor (Touch Sensor) Node
The TouchSensor (Touch Sensor) node generates events based on a pointing device (usually a mouse). These events indicate whether the user is pointing at a piece of geometry, where on the geometry the pointer is, and when a button on the pointing device is pressed. Its main fields and events are:
description -- a text string describing the node's function.
enabled -- sets whether the sensor node is active; the default value is true.
isActive -- sent when the button of the pointing device is pressed or released: isActive=true when the primary button is pressed, isActive=false when it is released.
isOver -- sent when the pointing device moves onto or off of the sensor's sibling geometry.
hitPoint_changed -- outputs the location of the pointed-at point in the local coordinate system of the sibling geometry.
hitNormal_changed -- outputs the surface normal vector at the pointed-at point.
hitTexCoord_changed -- outputs the texture coordinates of the surface at the pointed-at point.
touchTime -- generates a time event when the sensor is clicked with the pointing device.
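For reference, the node interface looks roughly as follows (a sketch in the XML encoding; the field values shown are the defaults, and the output events cannot be set from the file, so they appear only as a comment):

<TouchSensor DEF='TS'
  description=''
  enabled='true'/>
<!-- outputOnly events: isActive, isOver, touchTime,
     hitPoint_changed, hitNormal_changed, hitTexCoord_changed -->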
Several points need to be noted:
(1) If the pointing device is not pointing at the sensor's sibling geometry and the user moves the pointing device onto that geometry, the sensor generates an isOver event with value true. Conversely, if the pointing device is already pointing at the sibling geometry and the user moves it off the geometry, the sensor generates an isOver event with value false (see the sketch after this list).
(2) When the user moves the pointing device from one point on the geometry to another, the sensor sends a series of events: hitPoint_changed, hitNormal_changed, and hitTexCoord_changed, indicating the location the user is pointing at and the normal vector and texture coordinates at that point.
(3) When the user presses the pointing device's button while pointing at geometry monitored by the TouchSensor, the sensor generates an isActive event with value true; when the user releases the button, the sensor generates an isActive event with value false.
(4) If the user presses the button while pointing at the geometry and then releases it while still pointing at the geometry (or after returning to it), the sensor sends a touchTime event indicating when the button was released. This event can be used to simulate many common user-interface behaviors, such as an action that occurs only when the user clicks and releases a button over an object.
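As a sketch of the hover behavior in note (1), isOver is an SFBool output, so it can be routed directly to any SFBool input, for example a light's on field (the DEF names HOVER and LAMP are hypothetical):

<Group>
  <Shape>
    <Appearance>
      <Material diffuseColor='0 0 1'/>
    </Appearance>
    <Sphere radius='1'/>
  </Shape>
  <TouchSensor DEF='HOVER'/>
</Group>
<PointLight DEF='LAMP' on='false' location='0 2 2'/>
<!-- isOver becomes true when the pointer moves over the sphere
     and false when it leaves, switching the light on and off -->
<ROUTE fromNode='HOVER' fromField='isOver' toNode='LAMP' toField='on'/>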
Two examples follow to illustrate these points: