Background
NGUI handles UI and input well, but now the company has introduced somatosensory (body-sense) control: button clicks are driven by the position of the user's hand. If I don't want to change the original NGUI interface design or its event mechanism, how do I pull this off?
NGUI's underlying input mechanism
NGUI takes the mouse or touch position and performs a raycast from the camera against the NGUI layer, then uses the hit result to dispatch button and touch-press events that drive the UI. As for the event mechanism: although NGUI offers several event styles, at the bottom it still notifies the detected control via the camera's SendMessage. OK, let's look at the code.
A fragment of the raycast detection in NGUI's UICamera, which takes the current screen position and casts it against each UI camera:
static public bool Raycast (Vector3 inPos)
{
    for (int i = 0; i < list.size; ++i)
    {
        UICamera cam = list.buffer[i];

        // Skip inactive scripts
        if (!cam.enabled || !NGUITools.GetActive(cam.gameObject)) continue;

        // Convert to view space
        currentCamera = cam.cachedCamera;
        Vector3 pos = currentCamera.ScreenToViewportPoint(inPos);
        ...
        ...
    }
}
The ProcessMouse() method in NGUI's UICamera uses this raycast to find the control currently under the cursor and notifies it through Notify():
// No need to perform raycasts every frame
if (isPressed || posChanged || mNextRaycast < RealTime.time)
{
    mNextRaycast = RealTime.time + 0.02f;
    if (!Raycast(Input.mousePosition)) hoveredObject = fallThrough;
    if (hoveredObject == null) hoveredObject = genericEventHandler;
    for (int i = 0; i < 3; ++i) mMouse[i].current = hoveredObject;
}
...
...
// The button was released over a different object -- remove the highlight from the previous one
if ((justPressed || !isPressed) && mHover != null && highlightChanged)
{
    currentScheme = ControlScheme.Mouse;
    if (mTooltip != null) ShowTooltip(false);
    Notify(mHover, "OnHover", false);
    mHover = null;
}
Notify() is the notification method in NGUI's UICamera. The camera decides by its own logic which message to send: when a control is clicked it sends Notify(currentTouch.pressed, "OnClick", null); when a control is hovered it sends Notify(currentTouch.current, "OnHover", true):
static public void Notify (GameObject go, string funcName, object obj)
{
    if (mNotifying) return;
    mNotifying = true;

    if (NGUITools.GetActive(go))
    {
        go.SendMessage(funcName, obj, SendMessageOptions.DontRequireReceiver);

        if (genericEventHandler != null && genericEventHandler != go)
        {
            genericEventHandler.SendMessage(funcName, obj, SendMessageOptions.DontRequireReceiver);
        }
    }
    mNotifying = false;
}
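Note that Notify() is declared static public, so code outside NGUI can in principle route through exactly this dispatch path instead of hand-rolling SendMessage calls. A minimal sketch (SimulateClick is a hypothetical helper of mine, not part of NGUI):

// Hypothetical helper: reuse NGUI's own dispatch path from external input code.
static void SimulateClick (GameObject go)
{
    UICamera.Notify(go, "OnHover", true);   // highlight, as the mouse path does
    UICamera.Notify(go, "OnClick", null);   // the click itself
    UICamera.Notify(go, "OnHover", false);  // leave the hover state again
}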
The OnClick() (and similarly OnDragOver()) handlers in NGUI's UIButton are what respond to Notify(currentTouch.pressed, "OnClick", null); OnClick() then fires the registered listeners via EventDelegate.Execute(onClick):
/// <summary>
/// Call the listener function.
/// </summary>
protected virtual void OnClick ()
{
    if (current == null && isEnabled)
    {
        current = this;
        EventDelegate.Execute(onClick);
        current = null;
    }
}
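For context, this is the kind of game-side listener that EventDelegate.Execute(onClick) ends up invoking. The class and method names below (StartMenu, startButton, OnStartClicked) are only an illustration, not part of NGUI; the point is that whatever triggers OnClick(), mouse, touch, or a somatosensory raycast, the same listeners run without the UI script changing:

using UnityEngine;

// Illustrative subscriber to an NGUI button click.
public class StartMenu : MonoBehaviour
{
    public UIButton startButton;

    void Start ()
    {
        // Standard NGUI way to register a click listener.
        EventDelegate.Add(startButton.onClick, OnStartClicked);
    }

    void OnStartClicked ()
    {
        Debug.Log("Start button clicked");
    }
}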
OK, so in the final analysis NGUI's bottom layer still does everything through SendMessage. How, then, do we bolt on another input method?
Solution: introduce a separate raycast check
One option is to modify the UICamera logic at the bottom of NGUI and add somatosensory input there, but that touches too much code. A better approach is to attach a dedicated somatosensory-input script to the UIRoot or UICamera: treat the position of the hand as the mouse, run our own raycast against the NGUI layer, and SendMessage to whichever button the ray hits, sending exactly the same messages NGUI itself would. That way none of the NGUI UI scripts need to change, and somatosensory input just works. Code example:
/// <summary>
/// drizzle, Address: http://www.cnblogs.com/zsb517/
/// Camera ray detection that re-sets the state of the detected control
/// without affecting the mouse state.
/// </summary>
private void OnRayCollision ()
{
    if (uiCamera == null || cursorGrp == null)
    {
        return;
    }

    Vector3 realPos = ScreenToWorldPoint();
    Ray rayCamera = uiCamera.ScreenPointToRay(realPos);
    RaycastHit hit;

    if (Physics.Raycast(rayCamera, out hit, 1000f, layMaskCollis.value))
    {
        if (hit.collider == null || hit.collider.gameObject == null)
        {
            return;
        }

        if (curButton == null)
        {
            // First button hit: remember it and switch it to the hover state.
            //Debug.Log(hit.collider.name);
            curButton = hit.collider.gameObject.GetComponent<UIButton>();
            if (curButton != null)
            {
                curButton.SetState(UIButtonColor.State.Hover, false);
                //curButton.SendMessage("OnHover", true, SendMessageOptions.DontRequireReceiver);
            }
            return;
        }
        else if (curButton != null)
        {
            if (curButton != hit.transform.gameObject.GetComponent<UIButton>())
            {
                // Restore the previous button and put the new button into the hover state.
                //curButton.SendMessage("OnPress", false, SendMessageOptions.DontRequireReceiver);
                curButton.SetState(UIButtonColor.State.Normal, false);
                curButton = hit.transform.gameObject.GetComponent<UIButton>();
                if (curButton)
                {
                    curButton.SetState(UIButtonColor.State.Hover, false);
                    //curButton.SendMessage("OnHover", true, SendMessageOptions.DontRequireReceiver);
                }
                return;
            }
            // Same button is still under the hand: nothing to do.
        }
    }
    else
    {
        if (curButton != null)
        {
            // The ray no longer hits anything: restore the last button to normal.
            curButton.SetState(UIButtonColor.State.Normal, false);
            //curButton.SendMessage("OnPress", false, SendMessageOptions.DontRequireReceiver);
            //curButton.SendMessage("OnHover", false, SendMessageOptions.DontRequireReceiver);
            curButton = null;
            return;
        }
    }
    OnReset();
}

private Vector3 ScreenToWorldPoint ()
{
    Vector3 wPos = Vector3.zero;
    if (isMouseOrKinect)
    {
        wPos = new Vector3(Input.mousePosition.x, Input.mousePosition.y, 0);
    }
    else
    {
        if (NIHandInput.GetInstance() != null)
        {
            // The position of the hand reported by the somatosensory input.
            Vector2 pos = NIHandInput.GetInstance().ScreenPos;
            wPos = new Vector3(pos.x, pos.y, 0);
        }
    }
    Debuger.Log(wPos);
    return wPos;
}
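The code above only manages the hover state through SetState(); the click itself still has to be delivered whenever the hand-tracking layer decides the hand has "pressed". A rough sketch of that missing step, assuming your gesture-detection code calls it at the right moment; the method name is mine, and the message sequence is intended to mirror what UICamera sends for a mouse click:

// Hypothetical click step: send the same messages NGUI itself would send,
// so existing UIButton / EventDelegate listeners fire unchanged.
private void OnHandClick ()
{
    if (curButton == null) return;

    GameObject go = curButton.gameObject;
    go.SendMessage("OnPress", true, SendMessageOptions.DontRequireReceiver);
    go.SendMessage("OnPress", false, SendMessageOptions.DontRequireReceiver);
    go.SendMessage("OnClick", null, SendMessageOptions.DontRequireReceiver);
}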
Conclusion
Unity's own Input system also provides mechanisms for this kind of problem; I haven't researched them deeply here. Blending several input methods together smoothly really is a tangled business, and it deserves more thought than just the code: it is part of the whole experience of the game.