Apple's iPhone uses Google's technology for voice recognition, and on Android, Google's own platform, that core technology is naturally built in and backed by Google's cloud services. As a result, implementing Google voice recognition on Android is remarkably easy.

Speech recognition relies on the cloud to identify the user's spoken input and underpins features such as voice control. Below we use the API provided by Google to implement this functionality.

The goal: recognize what the user says and print the recognized text in a list.
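Before the complete sample, it helps to isolate the core of the exchange: the activity fires a RecognizerIntent.ACTION_RECOGNIZE_SPEECH intent, the recognizer activity records and uploads the audio, and the candidate transcriptions come back in onActivityResult() as an ArrayList of strings under RecognizerIntent.EXTRA_RESULTS, typically ordered best match first. The snippet below is only a minimal sketch of reading that result; RecognitionResults and topMatch are illustrative names and are not part of the sample that follows.

import android.content.Intent;
import android.speech.RecognizerIntent;

import java.util.ArrayList;

/** Hypothetical helper: extracts the recognizer's best guess from a result Intent. */
public final class RecognitionResults {

    /** Returns the top candidate transcription, or null if the recognizer returned nothing. */
    public static String topMatch(Intent data) {
        // EXTRA_RESULTS holds the candidate strings, typically best match first.
        ArrayList<String> matches =
                data.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS);
        return (matches == null || matches.isEmpty()) ? null : matches.get(0);
    }
}

The complete sample below instead shows every candidate in a ListView. It assumes a layout resource named voice_recognition containing a Button with id btn_speak and a ListView with id list.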
/*
 * Copyright (C) 2008 The Android Open Source Project
 * Licensed under the Apache License, Version 2.0 (the "License"); you may not
 * use this file except in compliance with the License. You may obtain a copy
 * of the License at http://www.apache.org/licenses/LICENSE-2.0
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
 * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
 * License for the specific language governing permissions and limitations
 * under the License.
 */
package com.example.android.apis.app;

import com.example.android.apis.R;

import android.app.Activity;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.content.pm.ResolveInfo;
import android.os.Bundle;
import android.speech.RecognizerIntent;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.ArrayAdapter;
import android.widget.Button;
import android.widget.ListView;

import java.util.ArrayList;
import java.util.List;

/** Sample code that invokes the speech recognition intent API. */
public class VoiceRecognition extends Activity implements OnClickListener {

    private static final int VOICE_RECOGNITION_REQUEST_CODE = 1234;

    private ListView mList;

    /** Called when the activity is first created. */
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // Inflate our UI from its XML layout description.
        setContentView(R.layout.voice_recognition);

        // Get display items for later interaction.
        Button speakButton = (Button) findViewById(R.id.btn_speak);
        mList = (ListView) findViewById(R.id.list);

        // Check to see if a recognition activity is present.
        PackageManager pm = getPackageManager();
        List<ResolveInfo> activities = pm.queryIntentActivities(
                new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH), 0);
        if (activities.size() != 0) {
            speakButton.setOnClickListener(this);
        } else {
            speakButton.setEnabled(false);
            speakButton.setText("Recognizer not present");
        }
    }

    /** Handle the click on the start recognition button. */
    public void onClick(View v) {
        if (v.getId() == R.id.btn_speak) {
            startVoiceRecognitionActivity();
        }
    }

    /** Fire an intent to start the speech recognition activity. */
    private void startVoiceRecognitionActivity() {
        Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
        intent.putExtra(RecognizerIntent.EXTRA_PROMPT, "Speech recognition demo");
        startActivityForResult(intent, VOICE_RECOGNITION_REQUEST_CODE);
    }

    /** Handle the results from the recognition activity. */
    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        if (requestCode == VOICE_RECOGNITION_REQUEST_CODE && resultCode == RESULT_OK) {
            // Fill the list view with the strings the recognizer thought it could have heard.
            ArrayList<String> matches = data.getStringArrayListExtra(
                    RecognizerIntent.EXTRA_RESULTS);
            mList.setAdapter(new ArrayAdapter<String>(this,
                    android.R.layout.simple_list_item_1, matches));
        }
        super.onActivityResult(requestCode, resultCode, data);
    }
}
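The intent built in startVoiceRecognitionActivity() only sets the language model and a prompt, but RecognizerIntent accepts further optional extras. The sketch below is not part of the sample above: RecognizerIntentHelper and buildRecognizerIntent are illustrative names, while EXTRA_MAX_RESULTS and EXTRA_LANGUAGE are standard RecognizerIntent extras that cap the number of candidates and suggest a recognition language; the "en-US" tag is just an example value.

import android.content.Intent;
import android.speech.RecognizerIntent;

/** Hypothetical helper for building a recognition intent with optional extras. */
public final class RecognizerIntentHelper {

    /** Builds a free-form recognition intent that asks for at most maxResults candidates. */
    public static Intent buildRecognizerIntent(String prompt, int maxResults) {
        Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
        intent.putExtra(RecognizerIntent.EXTRA_PROMPT, prompt);
        // Limit how many candidate transcriptions come back in EXTRA_RESULTS.
        intent.putExtra(RecognizerIntent.EXTRA_MAX_RESULTS, maxResults);
        // Suggest a recognition language as an IETF language tag (example value).
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE, "en-US");
        return intent;
    }
}

With such an intent, the onActivityResult() code above is unchanged; the matches list simply contains at most maxResults strings.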
That is the entire content of this article. I hope it helps with your learning, and I also hope you will continue to support the Cloud Habitat Community.