Complete Android static image face recognition demo (with complete source code)


Demo function: uses the face detection API that Android provides (android.media.FaceDetector) to locate the eyes and the face in a static image. Clicking the button runs the detection, and the annotated result is then shown in the ImageView.

Part 1: Layout file activity_main.xml

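The layout is a RelativeLayout containing a Button and an ImageView. A minimal sketch that matches the IDs referenced in MainActivity (layout_main, btn_detect_face, imgview) looks roughly like this; the exact attribute values and the button label are illustrative:

<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:id="@+id/layout_main"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:paddingBottom="@dimen/activity_vertical_margin"
    android:paddingLeft="@dimen/activity_horizontal_margin"
    android:paddingRight="@dimen/activity_horizontal_margin"
    android:paddingTop="@dimen/activity_vertical_margin" >

    <!-- Button label is illustrative -->
    <Button
        android:id="@+id/btn_detect_face"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="Detect Face" />

    <ImageView
        android:id="@+id/imgview"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:layout_below="@id/btn_detect_face" />

</RelativeLayout>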

Note: the padding around the ImageView is determined by these four attributes in the layout file:

 

 

android:paddingBottom="@dimen/activity_vertical_margin"
android:paddingLeft="@dimen/activity_horizontal_margin"
android:paddingRight="@dimen/activity_horizontal_margin"
android:paddingTop="@dimen/activity_vertical_margin"

The two margin values are defined in the dimens.xml file:

<resources>
    <dimen name="activity_horizontal_margin">16dp</dimen>
    <dimen name="activity_vertical_margin">16dp</dimen>
</resources>

The default values are used here. You can ignore them!

 

Part 2: MainActivity.java

 

package org.yanzi.testfacedetect;

import org.yanzi.util.ImageUtil;
import org.yanzi.util.MyToast;

import android.app.Activity;
import android.graphics.Bitmap;
import android.graphics.Bitmap.Config;
import android.graphics.BitmapFactory;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.graphics.Point;
import android.graphics.PointF;
import android.graphics.Rect;
import android.media.FaceDetector;
import android.media.FaceDetector.Face;
import android.os.Bundle;
import android.os.Handler;
import android.os.Message;
import android.util.DisplayMetrics;
import android.util.Log;
import android.view.Menu;
import android.view.View;
import android.view.View.OnClickListener;
import android.view.ViewGroup;
import android.view.ViewGroup.LayoutParams;
import android.widget.Button;
import android.widget.ImageView;
import android.widget.ProgressBar;
import android.widget.RelativeLayout;

public class MainActivity extends Activity {

    static final String tag = "yan";
    ImageView imgView = null;
    FaceDetector faceDetector = null;
    FaceDetector.Face[] face;
    Button detectFaceBtn = null;
    final int N_MAX = 2;
    ProgressBar progressBar = null;
    Bitmap srcImg = null;
    Bitmap srcFace = null;

    // One-shot worker thread: runs the detection off the UI thread, then posts
    // the result bitmap back to mainHandler.
    Thread checkFaceThread = new Thread() {
        @Override
        public void run() {
            Bitmap faceBitmap = detectFace();
            mainHandler.sendEmptyMessage(2);
            Message m = new Message();
            m.what = 0;
            m.obj = faceBitmap;
            mainHandler.sendMessage(m);
        }
    };

    Handler mainHandler = new Handler() {
        @Override
        public void handleMessage(Message msg) {
            switch (msg.what) {
            case 0: // detection finished: show the annotated bitmap
                Bitmap b = (Bitmap) msg.obj;
                imgView.setImageBitmap(b);
                MyToast.showToast(getApplicationContext(), "detected");
                break;
            case 1: // detection started: show the progress indicator
                showProcessBar();
                break;
            case 2: // detection done: hide progress, prevent a second start()
                progressBar.setVisibility(View.GONE);
                detectFaceBtn.setClickable(false);
                break;
            default:
                break;
            }
        }
    };

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        initUI();
        initFaceDetect();
        detectFaceBtn.setOnClickListener(new OnClickListener() {
            @Override
            public void onClick(View v) {
                mainHandler.sendEmptyMessage(1);
                checkFaceThread.start();
            }
        });
    }

    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        // Inflate the menu; this adds items to the action bar if it is present.
        getMenuInflater().inflate(R.menu.main, menu);
        return true;
    }

    public void initUI() {
        detectFaceBtn = (Button) findViewById(R.id.btn_detect_face);
        imgView = (ImageView) findViewById(R.id.imgview);
        LayoutParams params = imgView.getLayoutParams();
        DisplayMetrics dm = getResources().getDisplayMetrics();
        int w_screen = dm.widthPixels;
        // int h = dm.heightPixels;
        srcImg = BitmapFactory.decodeResource(getResources(), R.drawable.kunlong);
        int h = srcImg.getHeight();
        int w = srcImg.getWidth();
        float r = (float) h / (float) w;
        // ImageView width = screen width, height derived from the bitmap's aspect ratio
        params.width = w_screen;
        params.height = (int) (params.width * r);
        imgView.setLayoutParams(params);
        imgView.setImageBitmap(srcImg);
    }

    public void initFaceDetect() {
        // FaceDetector requires an RGB_565 bitmap
        this.srcFace = srcImg.copy(Config.RGB_565, true);
        int w = srcFace.getWidth();
        int h = srcFace.getHeight();
        Log.i(tag, "image to be tested: w = " + w + " h = " + h);
        faceDetector = new FaceDetector(w, h, N_MAX);
        face = new FaceDetector.Face[N_MAX];
    }

    public boolean checkFace(Rect rect) {
        int w = rect.width();
        int h = rect.height();
        int s = w * h;
        Log.i(tag, "face width w = " + w + " height h = " + h + " face area s = " + s);
        if (s < 10000) {
            Log.i(tag, "invalid face, discarded.");
            return false;
        } else {
            Log.i(tag, "valid face, kept.");
            return true;
        }
    }

    public Bitmap detectFace() {
        // Drawable d = getResources().getDrawable(R.drawable.face_2);
        // Log.i(tag, "Drawable size w = " + d.getIntrinsicWidth() + " h = " + d.getIntrinsicHeight());
        // BitmapDrawable bd = (BitmapDrawable) d;
        // Bitmap srcFace = bd.getBitmap();
        int nFace = faceDetector.findFaces(srcFace, face);
        Log.i(tag, "number of faces detected: n = " + nFace);
        // NOTE: the original listing was cut off at this loop; the body below is
        // reconstructed from the steps described in notes 3 and 4 further down
        // (eyesDistance, getMidPoint, eye points, face rectangle, checkFace).
        Bitmap resultBitmap = srcImg.copy(Config.ARGB_8888, true);
        Canvas canvas = new Canvas(resultBitmap);
        Paint paint = new Paint();
        paint.setColor(Color.RED);
        paint.setStyle(Paint.Style.STROKE);
        paint.setStrokeWidth(3);
        for (int i = 0; i < nFace; i++) {
            FaceDetector.Face f = face[i];
            if (f == null) {
                continue;
            }
            float dis = f.eyesDistance();   // distance between the two eyes
            PointF midPoint = new PointF();
            f.getMidPoint(midPoint);        // mid-point between the eyes
            Point eyeLeft = new Point((int) (midPoint.x - dis / 2), (int) midPoint.y);
            Point eyeRight = new Point((int) (midPoint.x + dis / 2), (int) midPoint.y);
            float dd = dis;                 // half side of the face rectangle, taken from the eye distance
            Rect faceRect = new Rect((int) (midPoint.x - dd), (int) (midPoint.y - dd),
                    (int) (midPoint.x + dd), (int) (midPoint.y + dd));
            if (!checkFace(faceRect)) {
                continue;                   // drop faces whose rectangle is too small
            }
            canvas.drawCircle(eyeLeft.x, eyeLeft.y, 5, paint);
            canvas.drawCircle(eyeRight.x, eyeRight.y, 5, paint);
            canvas.drawRect(faceRect, paint);
        }
        return resultBitmap;
    }

    public void showProcessBar() {
        RelativeLayout mainLayout = (RelativeLayout) findViewById(R.id.layout_main);
        progressBar = new ProgressBar(MainActivity.this, null,
                android.R.attr.progressBarStyleLargeInverse);
        RelativeLayout.LayoutParams params = new RelativeLayout.LayoutParams(
                ViewGroup.LayoutParams.WRAP_CONTENT, ViewGroup.LayoutParams.WRAP_CONTENT);
        params.addRule(RelativeLayout.ALIGN_PARENT_TOP, RelativeLayout.TRUE);
        params.addRule(RelativeLayout.CENTER_HORIZONTAL, RelativeLayout.TRUE);
        progressBar.setVisibility(View.VISIBLE);
        // progressBar.setLayoutParams(params);
        mainLayout.addView(progressBar, params);
    }
}
 
  
Note the following points about the code above:
  

 

1. The UI layout is initialized in initUI(), which mainly sets the size of the ImageView. Using the aspect ratio of srcImg and the screen width, the ImageView width is set to the screen width and its height is computed from that ratio, and the Bitmap is then set on the ImageView. Once the ImageView's width and height are fixed, the Bitmap is automatically scaled to fill it, so the Bitmap itself does not need to be scaled.

2. The variables needed for face detection are initialized in initFaceDetect(). First the Bitmap is converted from ARGB to RGB_565, the image format required by Android's face detection; this conversion is mandatory: this.srcFace = srcImg.copy(Config.RGB_565, true);

Then instantiate these two variables:

FaceDetector faceDetector = null;
FaceDetector.Face[] face;

faceDetector = new FaceDetector(w, h, N_MAX);
face = new FaceDetector.Face[N_MAX];

FaceDetector is the class that performs the detection, face holds the information about the detected faces, and N_MAX is the maximum number of faces to detect.

3. The actual face detection happens in the custom method detectFace(); the core call is faceDetector.findFaces(srcFace, face). After detection, Face f = face[i]; retrieves each detected face, float dis = f.eyesDistance(); returns the distance between the two eyes, and f.getMidPoint(midPoint); fills in the mid-point between the eyes (roughly the center of the face). The following two lines then give the coordinates of the left and right eyes:

 

Point eyeLeft = new Point((int) (midPoint.x - dis / 2), (int) midPoint.y);
Point eyeRight = new Point((int) (midPoint.x + dis / 2), (int) midPoint.y);

The following is the rectangle of the face:

 

 

Rect faceRect = new Rect((int)(midPoint.x - dd), (int)(midPoint.y - dd), (int)(midPoint.x + dd), (int)(midPoint.y + dd));

Note that the four parameters of Rect are left, top, right and bottom: the x and y coordinates of the rectangle's top-left corner followed by the x and y coordinates of its bottom-right corner.

 

4. In actual use, face detection turns out to produce false positives, so the checkFace(Rect rect) function is added: a face is treated as invalid when the pixel area of its Rect is too small. The threshold here is 10000; in practice this value can be estimated roughly from the size of the whole image, as sketched below.
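For example, instead of the fixed 10000, the minimum area could be tied to the image size. A possible variant of checkFace (using the srcFace and tag fields from the listing above; the 1% ratio is an illustrative choice, not from the original code):

// Accept a face only if its rectangle covers at least ~1% of the test image.
public boolean checkFace(Rect rect) {
    int s = rect.width() * rect.height();
    int minArea = (int) (srcFace.getWidth() * srcFace.getHeight() * 0.01f);
    Log.i(tag, "face area s = " + s + ", minimum accepted area = " + minArea);
    return s >= minArea;
}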

5. To let the user see that detection is running, a ProgressBar is added dynamically. The code is as follows:

 

public void showProcessBar() {
    RelativeLayout mainLayout = (RelativeLayout) findViewById(R.id.layout_main);
    progressBar = new ProgressBar(MainActivity.this, null,
            android.R.attr.progressBarStyleLargeInverse);
    // ViewGroup.LayoutParams.WRAP_CONTENT
    RelativeLayout.LayoutParams params = new RelativeLayout.LayoutParams(
            ViewGroup.LayoutParams.WRAP_CONTENT, ViewGroup.LayoutParams.WRAP_CONTENT);
    params.addRule(RelativeLayout.ALIGN_PARENT_TOP, RelativeLayout.TRUE);
    params.addRule(RelativeLayout.CENTER_HORIZONTAL, RelativeLayout.TRUE);
    progressBar.setVisibility(View.VISIBLE);
    // progressBar.setLayoutParams(params);
    mainLayout.addView(progressBar, params);
}

In fact, this ProgressBar does not look particularly good; a ProgressDialog would work better (a sketch follows). The point here is only to show how to add a ProgressBar dynamically.
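A minimal ProgressDialog variant could look like this (the field name, method name and message text are illustrative; it needs import android.app.ProgressDialog, and the handler's case 2 would call progressDialog.dismiss() instead of hiding the ProgressBar):

ProgressDialog progressDialog; // replaces the ProgressBar field

public void showProcessDialog() {
    // Indeterminate, non-cancelable spinner with a short message.
    progressDialog = ProgressDialog.show(MainActivity.this, null,
            "Detecting faces...", true, false);
}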

 

6. The checkFaceThread thread performs the face detection, and mainHandler drives the UI updates. Note the way the Thread is constructed here: it imitates how the camera is opened in the Android source code. If a thread only needs to run once, this style is the most concise. If the thread has to be executed again or rebuilt after it finishes, this approach is not recommended; a custom Thread (or a new Thread per run, as sketched after the code below) keeps the program logic easier to control, because a Thread instance cannot be started twice. That is also why the button is made non-clickable once the thread has finished; otherwise clicking it again would call start() on an already-started thread and fail.

 

Thread checkFaceThread = new Thread() {
    @Override
    public void run() {
        Bitmap faceBitmap = detectFace();
        mainHandler.sendEmptyMessage(2);
        Message m = new Message();
        m.what = 0;
        m.obj = faceBitmap;
        mainHandler.sendMessage(m);
    }
};
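If the detection has to be re-run, one simple alternative is to create a fresh Thread for every click instead of keeping a single Thread field, for example (a sketch built on the same detectFace() and mainHandler):

detectFaceBtn.setOnClickListener(new OnClickListener() {
    @Override
    public void onClick(View v) {
        mainHandler.sendEmptyMessage(1);
        // A new Thread per click, so start() is never called twice on the same instance.
        new Thread(new Runnable() {
            @Override
            public void run() {
                Bitmap faceBitmap = detectFace();
                mainHandler.sendEmptyMessage(2);
                mainHandler.sendMessage(mainHandler.obtainMessage(0, faceBitmap));
            }
        }).start();
    }
});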
7. The recognition results:

 

Source image:

After recognition:

In testing, a face could no longer be detected once the distance between the eyes fell below roughly 100 pixels. Also, if the static image is small and the phone's densityDpi is high, the face is not detected when the image is placed in the drawable-hdpi folder, while the same test image placed in drawable-mdpi is detected normally. The reason is that the Bitmap ends up with a different size depending on which density folder it is loaded from.
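If you want the bitmap at its original pixel size no matter which drawable-* folder it comes from, the density scaling can be turned off when decoding; a minimal sketch (not part of the original demo):

// Decode the resource without density scaling so the bitmap keeps its
// original pixel dimensions regardless of the drawable folder / device densityDpi.
BitmapFactory.Options opts = new BitmapFactory.Options();
opts.inScaled = false;
srcImg = BitmapFactory.decodeResource(getResources(), R.drawable.kunlong, opts);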

A follow-up post will cover real-time face detection with the Camera, drawing face frames on the preview, and then go further into blink detection and a demo of controlling a picture with a blink. If you think the author writes this blog seriously, please vote for me.

 

 
