Deep learning: MATLAB to C++ on iOS, a test of CNN hand-shape recognition


1 Preface

In a previous blog post, I introduced some ways to run a CNN on iOS. In general, though, we need a powerful machine to train the CNN; on the mobile side we only need to use the trained result. I had previously modified UFLDL code in MATLAB to run a 3-layer CNN for hand-shape recognition; here we consider porting that MATLAB code to Xcode.

Step 1: Convert MATLAB to C

The first thing to do is make sure the code runs correctly. For example, on my side, the CNN identifies the hand shape as follows:

load('./opt_parameters/opttheta_8epoches_cnn.mat');
cnnPredict(imread('./data/test_five1 (1).bmp'), parameters.opttheta)

ans =

     5

As you can see, it identified 5 fingers. OK, the CNN works. Now it's time to convert the cnnPredict function to C. You can see that the function takes the input image and the trained parameters:

function labels = cnnPredict(images,opttheta)

The basic approach is to use the tool that comes with MATLAB: Coder.
In the Command Window, enter coder:

Create a new project:

Here I imported the file I want to convert, cnnPredict.m. It has two input variables whose types must be defined; I used autodefined types, which just means writing a script that runs the function. That is, the code shown at the beginning; after identification it looks like this:

Here you can see that my CNN does not have many parameters: about 195,000 (195,245, as the generated header below shows).
The next step is to build. Select C/C++ Static Library, and choose to generate code only:

The build results are as follows:

It is possible that the build will fail; this is usually a data-type issue that can be fixed until the build succeeds, depending on the situation.
The generated code is in the project's codegen folder:

Step 2: Export the .mat parameters to .txt format

During training, our CNN parameters are stored in a .mat file, so in order to use them in Xcode we need to export the parameters; here I chose to export to .txt format.
The export method is very simple, one line of code:

save('opttheta.txt','opttheta','-ASCII') % save the opttheta parameters as opttheta.txt

Step 3: Create a new iOS project and import the cnnPredict code

This step is simple: drag the whole folder in and it's done.
Note the cnnPredict.h code; this declares the function we're going to use:

/*
 * File: cnnPredict.h
 *
 * MATLAB Coder version: 2.7
 * C/C++ source code generated on: 16-Jul-2015 16:22:01
 */

#ifndef __CNNPREDICT_H__
#define __CNNPREDICT_H__

/* Include Files */
#include <math.h>
#include <stddef.h>
#include <stdlib.h>
#include <string.h>
#include "rt_nonfinite.h"
#include "rtwtypes.h"
#include "cnnpredict_types.h"

/* Function Declarations */
extern double cnnPredict(const double images[9216], const double opttheta[195245]);

#endif

/*
 * File trailer for cnnPredict.h
 *
 * [EOF]
 */

Note that there is an interface folder inside the imported code that will cause the build to fail; it should be removed, which does not affect anything else.

Step 4: Import the parameters in Xcode

This step reads the data from the txt file and dumps it into a double array. The code, pasted directly:

NSString *filePath = [[NSBundle mainBundle] pathForResource:@"opttheta" ofType:@"txt"];
NSString *testString = [NSString stringWithContentsOfFile:filePath encoding:NSUTF8StringEncoding error:nil];
NSMutableArray *thetaString = [[testString componentsSeparatedByString:@"\n"] mutableCopy];
[thetaString removeLastObject];
NSLog(@"theta count: %lu", (unsigned long)thetaString.count);
for (int i = 0; i < thetaString.count; i++) {
    NSString *data = [thetaString objectAtIndex:i];
    theta[i] = [data doubleValue];
}

As you can see from the code, it's very simple: split the data on '\n'.

Step 5: Convert a picture to a double array

In order to use the function, we must convert the picture to an array. We are using grayscale images here; the conversion code is as follows:

UIImage *image = [UIImage imageNamed:@"One.bmp"];
CGImageRef imageRef = [image CGImage];
CGDataProviderRef provider = CGImageGetDataProvider(imageRef);
NSData *data = (id)CFBridgingRelease(CGDataProviderCopyData(provider));
NSLog(@"image: %lu", (unsigned long)data.length);
const uint8_t *bytes = [data bytes];

This gives an array of uint8 values. Next I need to transpose the image's grayscale matrix, since MATLAB stores matrices column-major while the image bytes are row-major:

double newBytes[9216];
for (int y = 0; y < 96; y++) {
    for (int x = 0; x < 96; x++) {
        newBytes[x * 96 + y] = bytes[y * 96 + x];
    }
}

Step 6: Run the CNN

With the above processing done, this step simply runs cnnPredict directly:

double result = cnnPredict(newBytes, theta);
NSLog(@"result: %f", result);

The output is immediate:

Did you see that? The recognized result is 1, which means the thumb.
Honestly, seeing this, I was a little excited. Isn't it cool: a CNN running on iOS, directly recognizing gestures, even though the pictures here are simple black-and-white images.

Summary

This article summarized how to convert a CNN's MATLAB code to C++ code and then run it directly on iOS. I hope fellow readers find it inspiring!

Copyright notice: this is the blogger's original article; do not reproduce without the blogger's permission.
