Various events in Android are handled by different listeners. For example, a click event is handled by OnClickListener, and a touch event is handled by OnTouchListener.
First set the listener on the view; the events delivered to that view are then passed to the listener's callback.
public class Touch913MainActivity extends Activity {
    RelativeLayout rl;
    TextView tv, tv2;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_touch913_main);
        rl = (RelativeLayout) findViewById(R.id.relativeLayout1);
        tv2 = (TextView) findViewById(R.id.textView2);
        tv = (TextView) findViewById(R.id.textView1);

        rl.setOnTouchListener(new OnTouchListener() {
            @Override
            public boolean onTouch(View v, MotionEvent event) {
                System.out.println("Touch RelativeLayout");
                return false;
            }
        });

        tv.setOnTouchListener(new OnTouchListener() {
            @Override
            public boolean onTouch(View v, MotionEvent event) {
                System.out.println("Touch TextView");
                int num = event.getAction();
                int pointer = event.getPointerCount();
                StringBuffer str = new StringBuffer();
                for (int i = 0; i < pointer; i++) {
                    str.append("X:" + event.getX(i));
                    str.append(" Y:" + event.getY(i));
                    str.append(" Duration:" + event.getEventTime());
                    str.append(" " + num);
                }
                tv2.setText(str);
                return false;
            }
        });
    }
}
Each time you touch the screen, the corresponding touch information is printed to LogCat and shown in the second TextView.
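The onTouch() callbacks above only print the raw value returned by event.getAction(). If you want to tell the phases of a gesture apart, you can switch on the MotionEvent action constants. A minimal sketch (the constants are from the standard MotionEvent API; the println messages are just illustrative):

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        switch (event.getAction() & MotionEvent.ACTION_MASK) {
            case MotionEvent.ACTION_DOWN:   // the first finger touches the screen
                System.out.println("down");
                break;
            case MotionEvent.ACTION_MOVE:   // a finger moves across the screen
                System.out.println("move");
                break;
            case MotionEvent.ACTION_UP:     // the last finger leaves the screen
                System.out.println("up");
                break;
        }
        return true; // consume ACTION_DOWN so the following MOVE/UP events are also delivered here
    }

Note that returning false from ACTION_DOWN, as in the example above, means this view will not receive the rest of the gesture.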
Text-to-speech (TTS):
First, implement TextToSpeech.OnInitListener, whose onInit() callback is invoked when the speech engine has been initialized, and TextToSpeech.OnUtteranceCompletedListener, whose callback fires when an utterance has finished being read. Then create a TextToSpeech instance and call its speak() method to read the text in the TextView aloud.
/**
 * TextToSpeech lets the device read text aloud.
 * It is supported since Android 1.6; the built-in engine covers English and
 * other European languages, but Asian languages are not supported.
 */
public class MyTTSActivity extends Activity implements
        TextToSpeech.OnInitListener,                // called when the speech engine is initialized
        TextToSpeech.OnUtteranceCompletedListener { // called when an utterance has been read

    TextView tv;
    Button btn;
    TextToSpeech tts;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);
        tv = (TextView) findViewById(R.id.textView1);
        btn = (Button) findViewById(R.id.button1);
        tts = new TextToSpeech(this, this);

        btn.setOnClickListener(new OnClickListener() {
            @Override
            public void onClick(View v) {
                tts.speak(tv.getText().toString(), TextToSpeech.QUEUE_ADD, null);
            }
        });
    }

    // Called when the engine has been initialized
    @Override
    public void onInit(int status) {
        int result = tts.setLanguage(Locale.US); // set the language used for reading
        tts.setSpeechRate(10);                   // set the reading speed
        if (result == TextToSpeech.LANG_MISSING_DATA
                || result == TextToSpeech.LANG_NOT_SUPPORTED) {
            Toast.makeText(this, "Unsupported language!", Toast.LENGTH_SHORT).show();
        }
    }

    // Called when an utterance has finished being read
    @Override
    public void onUtteranceCompleted(String utteranceId) {
    }
}
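As written, onUtteranceCompleted() will never fire, because the listener is not registered and speak() is called without an utterance ID. A sketch of the members that could be added inside MyTTSActivity to wire it up (this uses the old setOnUtteranceCompletedListener() API, deprecated since API 15 in favour of UtteranceProgressListener; "tts1" and speakText() are just illustrative names, and java.util.HashMap must be imported):

    // Register the completion callback once the engine is ready, e.g. at the end of onInit():
    //     tts.setOnUtteranceCompletedListener(this);

    // Pass an utterance id so onUtteranceCompleted() can identify the request
    private void speakText(String text) {
        HashMap<String, String> params = new HashMap<String, String>();
        params.put(TextToSpeech.Engine.KEY_PARAM_UTTERANCE_ID, "tts1"); // arbitrary example id
        tts.speak(text, TextToSpeech.QUEUE_ADD, params);
    }

    @Override
    protected void onDestroy() {
        tts.shutdown(); // release the TTS engine when the activity goes away
        super.onDestroy();
    }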
The above is just an introduction to some of the Android events I have learned about, written as a summary and record of the learning process; to really master them you still need to dig deeper.