This is the TensorFlow image classifier sample (sample-tensorflow-imageclassifier), which I got running today.
What this sample does: when the LED is on, you press the button, the camera takes a photo, and the system analyzes the image to identify what is in the picture.
To see the photo and the results you need to connect a display; the spoken results and prompts can be heard through a connected speaker or headphones.
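To make the flow concrete, here is a minimal sketch of the press-button, take-photo, classify, speak-result loop. It assumes the button contrib driver (com.google.android.things.contrib.driver.button); the pin name BTN_GPIO and the capturePhoto()/classifyImage() helpers are placeholders of my own, not the actual classes from the official sample.

```java
import android.app.Activity;
import android.graphics.Bitmap;
import android.os.Bundle;
import android.speech.tts.TextToSpeech;
import android.util.Log;

import com.google.android.things.contrib.driver.button.Button;

import java.io.IOException;

public class ImageClassifierSketchActivity extends Activity {

    private static final String TAG = "ImageClassifierSketch";
    private static final String BTN_GPIO = "GPIO6_IO14"; // assumption: board-specific button pin

    private Button mButton;
    private TextToSpeech mTts;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // Spoken feedback goes out over the speaker or headphone jack.
        mTts = new TextToSpeech(this, status -> Log.d(TAG, "TTS init status: " + status));

        try {
            // The LED being on signals "ready"; each button press triggers one capture.
            mButton = new Button(BTN_GPIO, Button.LogicState.PRESSED_WHEN_LOW);
            mButton.setOnButtonEventListener((button, pressed) -> {
                if (pressed) {
                    Bitmap photo = capturePhoto();        // placeholder camera helper
                    String label = classifyImage(photo);  // placeholder TensorFlow helper
                    mTts.speak("I see a " + label, TextToSpeech.QUEUE_FLUSH, null, "result");
                }
            });
        } catch (IOException e) {
            Log.e(TAG, "Unable to open button GPIO", e);
        }
    }

    private Bitmap capturePhoto() {
        // The real sample uses CameraDevice/ImageReader; stubbed here.
        return Bitmap.createBitmap(224, 224, Bitmap.Config.ARGB_8888);
    }

    private String classifyImage(Bitmap bitmap) {
        // The real sample runs a TensorFlow model; stubbed here.
        return "water bottle";
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
        try {
            if (mButton != null) mButton.close();
        } catch (IOException e) {
            Log.e(TAG, "Error closing button", e);
        }
        if (mTts != null) mTts.shutdown();
    }
}
```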
Source address: https://github.com/androidthings/sample-tensorflow-imageclassifier
For assembly details, you can refer to the official introduction at the source address above.
In addition, I think this sample's assembly is an upgrade of another sample, so beginners can run the following sample first:
http://www.cnblogs.com/minminjy123/p/8120165.html
If you are not familiar with assembling the display and camera, take a look at the following link, which has photos:
https://developer.android.google.cn/things/hardware/imx7d-kit.html
During testing the results did not seem very good; many objects could not be recognized, although a water bottle was recognized.
Also, if you do not press the button as the instructions describe, or press it multiple times, the camera can end up in a state where it no longer works.
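One likely cause of the camera getting stuck is extra presses arriving while a capture is still in flight. Below is a standalone sketch of a guard that simply drops presses until the previous image finishes processing. This is my own assumed workaround, not the sample's actual fix, and the capture/inference step is simulated with a sleep.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicBoolean;

/** Sketch of the guard pattern: presses that arrive while a capture is
 *  still running are dropped instead of piling up against the camera. */
public class ButtonGuardSketch {

    private static final AtomicBoolean BUSY = new AtomicBoolean(false);

    public static void main(String[] args) throws InterruptedException {
        ExecutorService camera = Executors.newSingleThreadExecutor();

        // Simulate three rapid button presses; only the first is handled.
        for (int i = 0; i < 3; i++) {
            onButtonPressed(camera);
        }

        camera.shutdown();
        camera.awaitTermination(5, TimeUnit.SECONDS);
    }

    private static void onButtonPressed(ExecutorService camera) {
        // compareAndSet fails if a previous press is still being processed.
        if (!BUSY.compareAndSet(false, true)) {
            System.out.println("Ignoring press: previous image still processing");
            return;
        }
        camera.submit(() -> {
            try {
                System.out.println("Capturing and classifying...");
                Thread.sleep(1000); // stand-in for camera capture + TensorFlow inference
                System.out.println("Done");
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            } finally {
                BUSY.set(false);
            }
        });
    }
}
```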
Finally, here is a picture of my ugly assembly.