Kinect Development Learning Notes (1): Kinect Introduction and Applications
1. A Brief Introduction to Kinect
Kinect for Xbox 360, commonly called Kinect, is a peripheral developed by Microsoft for the Xbox 360 console. It lets players operate the Xbox 360 system interface using voice commands or gestures, without holding or stepping on a controller. It also captures the player's full-body motion so that games are played with the body, giving players a "controller-free gaming and entertainment experience".
It went on sale in the United States on November 4, 2010, priced at $149. Kinect sold 8 million units in its first 60 days on sale and has applied for the world record as the fastest-selling consumer electronics device.
On February 1, 2012, Microsoft officially released the Windows version, named "Kinect for Windows", at a suggested price of $249. Later in 2012, Microsoft also announced a special edition of Kinect for educational users.
(The above is from Wikipedia.)
The Kinect has three lenses. The middle one is an RGB color camera used to capture color images. The lenses on either side form a 3D structured-light depth sensor, consisting of an infrared emitter and an infrared CMOS camera, used to capture depth data (the distance from objects in the scene to the camera).
The color camera supports imaging at up to 1280*960, and the infrared camera at up to 640*480. The Kinect also features focus tracking: the motor in the base rotates the sensor as the tracked object moves. In addition, the Kinect has a built-in microphone array: four microphones record sound simultaneously, noise is eliminated by comparing their signals, and the collected audio is used for speech recognition and sound-source localization.
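To make the depth data described above concrete: each depth pixel can be back-projected into real-world coordinates with a simple pinhole-camera model. The sketch below is illustrative only; the raw-to-meters formula is a commonly cited approximation from the OpenKinect community, and the intrinsic parameters (FX, FY, CX, CY) are typical calibration values for the 640*480 depth camera, not official constants from any SDK.

```python
# Commonly cited OpenKinect community approximation for converting an
# 11-bit raw depth value into a distance in meters. Values that make the
# denominator non-positive correspond to no reading / out of range.
def raw_depth_to_meters(raw_depth):
    denom = raw_depth * -0.0030711016 + 3.3309495161
    if denom <= 0:
        return None  # invalid / no depth reading
    return 1.0 / denom

# Assumed (typical) intrinsics for the 640x480 depth camera.
FX, FY = 594.21, 591.04   # focal lengths, in pixels
CX, CY = 339.5, 242.7     # principal point, in pixels

def depth_pixel_to_point(u, v, depth_m):
    """Back-project pixel (u, v) with depth depth_m into 3D camera space."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return (x, y, depth_m)

# Example: a raw reading of 600 is roughly 0.67 m from the camera.
d = raw_depth_to_meters(600)
point = depth_pixel_to_point(320, 240, d)
```

Larger raw values map to larger distances, and a pixel at the principal point back-projects straight down the camera's optical axis.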
1.2 Software Development Environment
1.2.1 Unofficial Combinations
When Microsoft launched Kinect, it was for the Xbox 360 only, with no development kit for Windows. Because of Kinect's powerful capabilities and relatively low price, geeks everywhere hoped to be able to use it on their computers.
So several experts developed drivers for it. As far as I know, there are currently three:
1) CL NUI Platform
Developed by the NUI expert AlexP (whose other well-known work includes Windows drivers for the PS3 camera). You can download it here. The target platform is Windows 7; it can obtain data from the color camera, the depth sensor, and the accelerometer, and it is simple and convenient to use.
2) OpenKinect (libfreenect)
Launched by Hector Martin, the first person to hack the Kinect; download it here. The target platforms are Linux and Mac, and it is said to have been successfully ported to Windows. Because so many geeks are Mac fans, the project has numerous contributors, and it is more than just a driver for getting data: they have also written higher-level features, such as the skeleton tracking mentioned earlier and mapping the color image as a texture onto the depth data, which is very attractive.
3) OpenNI
OpenNI (Open Natural Interaction) is a multi-language, cross-platform framework that defines APIs for writing applications that use natural interaction. It can be downloaded here.
Judging from the name, its ultimate goal is presumably something like the effect in Minority Report; some of today's demos, I'm afraid, have already surpassed that goal.
It was not developed specifically for Kinect, but it has the support of PrimeSense, the company behind Kinect's depth-sensing technology. The more common combination is SensorKinect + NITE + OpenNI: SensorKinect is the Kinect driver, and NITE is middleware provided by PrimeSense that can analyze the data Kinect reads and output human motion, among other things.
1.2.2 The Official Microsoft SDK
Kinect somatosensory games were very well received on the Xbox 360, but development on the Windows platform could long rely only on unofficial solutions, such as NKinect with the CL NUI SDK. Microsoft finally launched the Kinect for Windows SDK Beta in June 2011; notably, it supports development in C# with the .NET Framework 4.0. The Kinect for Windows SDK is designed primarily for Windows 7 and contains drivers, a rich raw-data-stream development interface, natural user interface support, installation files, and reference samples. The SDK makes development easy for programmers using the C++, C#, or Visual Basic languages with Microsoft Visual Studio 2010.
The newest version is currently v1.6.
Kinect for Windows SDK:
The Kinect SDK currently supports only Windows 7, in x86 and x64 versions. Development also requires the .NET Framework 4.0 and Visual Studio 2010 (at minimum the Express edition). This is explained later, when configuring the development environment.
1.2.3 Pros and Cons of the Unofficial and Official Development Kits
1) Official SDK:
Advantages: it provides audio support, an adjustable tilt motor, and full-body skeleton tracking that requires no calibration pose (in contrast to OpenNI's "surrender" pose ...). Details such as detection of the head, hands, feet, and clavicle, and handling of joint occlusion, are treated more carefully (though the accuracy has not been verified).
It also supports multiple sensors (multiple Kinects).
Disadvantages: Microsoft restricts it to non-commercial use.
In addition, gesture recognition and tracking are not provided. Registration of the RGB image and the depth image is not implemented; only alignment within each individual coordinate system is provided. In full-body skeleton tracking, the SDK computes only the positions of the joints, not their rotation angles. From a portability standpoint, the SDK Beta can only be used on the Kinect/Win7 platform, whereas OpenNI also supports at least the ASUS Wavi Xtion somatosensory device and may support many other hardware platforms in the future. By contrast, the SDK Beta does not support the Unity3D game engine, recording/playing back data to disk, the raw infrared video stream, or event-response mechanisms like OpenNI's user entry and exit events.
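The missing joint rotation angles mentioned above are easy to approximate for simple cases: given three tracked joint positions, the bend angle at the middle joint follows from the dot product of the two limb vectors. The sketch below is SDK-independent, and the joint coordinates in the example are made-up sample values, not real Kinect output.

```python
import math

def joint_angle_deg(a, b, c):
    """Angle at joint b (in degrees) formed by segments b->a and b->c.

    a, b, c are (x, y, z) joint positions, e.g. shoulder, elbow, wrist.
    """
    v1 = tuple(ai - bi for ai, bi in zip(a, b))
    v2 = tuple(ci - bi for ci, bi in zip(c, b))
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    # Clamp to avoid math domain errors from floating-point noise.
    cos_t = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_t))

# Made-up sample positions: a fully extended arm, then a right-angle bend.
straight = joint_angle_deg((0, 0, 0), (0.3, 0, 0), (0.6, 0, 0))  # 180.0
bent = joint_angle_deg((0, 0, 0), (0.3, 0, 0), (0.3, 0.3, 0))    # 90.0
```

This gives only the bend angle between segments, not a full 3D rotation, but it is enough for gestures like "arm raised" or "elbow bent".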
2) Unofficial OpenNI/NITE:
Advantages: it can be used for commercial development; it includes gesture recognition and tracking, automatic registration of the depth image with the RGB image, full-body tracking, and calculation of joint rotation angles; performance is good and it is used in many game products; it supports recording/playing back data to disk, the raw infrared video stream, and event-response mechanisms for user entry and exit; it supports the PrimeSense and ASUS Wavi Xtion hardware platforms, and Windows, Linux, and Mac as software platforms; and the code comes with full support for the Unity3D game engine.
Disadvantages: there is no audio support; the tilt motor cannot be adjusted; in full-body tracking, the rotations of the head, hands, feet, and clavicle cannot be traced; a calibration pose is required (the famous "surrender" pose ...); there appear to be algorithmic bugs in handling details such as joint occlusion; it cannot automatically install and recognize multiple Kinects; and the installation process is cumbersome, in particular because NITE requires applying for a development license code. OpenNI also does not provide an event-trigger mechanism for newly available video and depth frames (though it does provide similarly useful functions; they are just not callbacks).
OpenNI's biggest advantages are that it allows cross-platform, multi-device, and commercial applications.
In terms of raw data collection and preprocessing, however, Microsoft's SDK seems more stable, and it also provides good skeleton and voice support. For recognition of individual body parts, the SDK Beta does not provide built-in recognition and tracking; you have to develop that yourself (and for some time Microsoft may not provide such functionality). OpenNI/NITE, meanwhile, provides gesture recognition and tracking, along with many other features for recognizing and tracking skeletal posture.
So, judging from current experience in the community, neither the SDK Beta nor OpenNI/NITE can be declared simply better or worse. As more and more developers join Microsoft's side, the SDK Beta may grow in popularity faster, but for higher-level applications, choosing between the two often requires some wisdom.
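On the raw-data-preprocessing point: skeleton streams from either stack are jittery frame to frame, and smoothing is a standard first step (the official SDK, for instance, exposes smoothing parameters for its skeleton stream). As a library-neutral illustration of the idea, here is a simple exponential moving-average filter over 3D joint positions; the alpha value is an arbitrary choice for this sketch, not a value taken from either SDK.

```python
class JointSmoother:
    """Exponential moving average over a stream of (x, y, z) joint positions.

    Smaller alpha = heavier smoothing (more lag); alpha = 1 passes input through.
    """
    def __init__(self, alpha=0.5):
        self.alpha = alpha
        self._state = None  # last smoothed position, or None before first frame

    def update(self, pos):
        if self._state is None:
            self._state = tuple(pos)
        else:
            a = self.alpha
            self._state = tuple(a * p + (1 - a) * s
                                for p, s in zip(pos, self._state))
        return self._state

# Feeding a noisy stream that hovers around x = 1.0: the smoothed output
# stays close to 1.0 instead of jumping with each frame.
sm = JointSmoother(alpha=0.5)
for raw in [(1.0, 0.0, 0.0), (1.2, 0.0, 0.0), (0.8, 0.0, 0.0), (1.0, 0.0, 0.0)]:
    smoothed = sm.update(raw)
```

One such filter per tracked joint is usually enough to make cursor-style hand control feel stable.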
(This part is from: http://www.hanyi.name/blog/?p=330)
2. A Summary of Kinect Application Development
This section (content from Baidu Library; original source unknown) summarizes various applications currently developed with the Kinect SDK for Windows. Click the corresponding links to see demo videos of the Kinect apps.
Kinect fitting mirror: a fantastic fitting mirror based on Kinect somatosensory technology that lets customers try on clothes quickly, improving sales efficiency and the company's image.
http://v.youku.com/v_show/id_XMjU4MjExNjgw.html (homemade Kinect application: 3D fitting room)
3D camera: uses two Kinects to achieve the basic effect of a 3D camera.
Sculpture tool: turns Kinect into a while-you-wait street portrait sculpture tool. Kinect builds a 3D model of the human body, and based on that 3D information a connected molding device casts a statue of the body.
Using Kinect to control an RC helicopter.
Kinect Robo: uses Kinect as a robot's head; Kinect senses the surrounding environment and builds a 3D model of it to guide the robot's actions.
Controlling robots with Kinect.
Air guitar: play music by controlling a virtual guitar with Kinect gestures.
Kinect plays ancient Chinese instruments: changing gestures produces the sounds of different ancient Chinese instruments.
Kinect hack "Hatsune Miku": applies somatosensory control to the anime character Hatsune Miku.
Becoming Ultraman: captures the player's skeleton data, maps it onto a virtual Ultraman that follows the player's movements, and adds some special effects.
Kinect hack lightsaber: Kinect detects the player's movements and renders the image of a virtual lightsaber that moves with them.
Kinect gesture-controlled browser: browse web pages through Kinect gestures such as scrolling, zooming, and so on.
Air Presenter: make your presentations unique; software for giving talks with Kinect.
Kinect multi-touch: use Kinect to implement multi-touch for browsing images, maps, and more.
Kinect somatosensory image control for surgery: orthopedic surgeons at Xijing Hospital of the Fourth Military Medical University brought a hacked Kinect into the operating room; during surgery, doctors can browse the patient's images through somatosensory control, which is very convenient for the operating doctor and reduces the movement of people around the operating room.
Kinect crayon physics: paint with Kinect gestures; the drawn pictures are controlled by body movement and take on physical properties such as gravity and attraction.
Controlling lightning (Tesla coils) with Kinect.
Kinect hack: playing Max Payne.
Kinect hack: playing Left 4 Dead 2, fighting zombies with body motion.
Kinect hack: playing World of Warcraft.
Kinect hack: playing Street Fighter.
Kinect hack: playing Super Mario Bros.
Kinect hack: playing Ranger Paradox.
Kinect hack: playing Modern Warfare.
A group of Carnegie Mellon students, ranging from sophomores to graduate students, used Kinect's interactive features to realize 18 interesting ideas of various kinds, taking only two weeks. Let's take a look at the fantastic ideas they brought to life!
Source and video links: http://golancourses.net/2011spring/projects/project-3-interaction/
1. Comic Kinect
This demo mainly uses Kinect's skeleton tracking and player segmentation data to render punching and kicking interactions as visual comic effects, with synchronized onomatopoeic sound effects.
2. Mario Boo
When the Kinect sensor detects a person in view, a ghost lingers behind them, moving as they move and changing its size according to the depth information.
3. Magrathea
Magrathea uses Kinect to dynamically generate a terrain map from whatever objects are on the table. The camera reads the changing depths of the objects on the table, showing a process resembling the gradual evolution of planetary terrain.
4. We Are Monsters
Inspired by the traditional lion dance and using Kinect's skeleton tracking, two people use their own limbs to control the limbs and tail of a virtual monster.
5. Mix&Match Interaction Toy
Using Kinect/OpenNI skeleton tracking, a three-section card figure follows the player's body, and the player changes the pictures by sliding a hand.
6. Kinect Flock
The author created a particle system: when the user moves, cotton-wool-like particles surge around, and when the user stands still, they gather into the participant's depth silhouette.
7. Roboscan
Roboscan is a 3D modeler plus scanner built by fixing a Kinect to an ABB 4400 robot arm. Preset motions and operator control together determine the 3D position of the robot and the camera.
The Kinect's depth data is then used to produce an accurate model of the external environment.
8. Neurospasta
Neurospasta is a free-form game platform based on full-body input. Participants can control their own Kinect-based puppet, or, through the settings, control other people's avatars.
This design is full of magic: the player controls a glowing sphere that moves according to the player's hand movements and becomes larger or smaller according to the depth information.
10. Balloon Grab
By detecting whether the palm is open or clenched, the author developed a simple mini-game based on simulated balloon flight.
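An open-versus-clenched palm check like the one Balloon Grab relies on can be approximated, for illustration, by measuring how far the hand's segmented points spread around their centroid: fingers on an open hand push points farther out than a fist does. This is a hypothetical heuristic with a made-up threshold, not the project's actual method.

```python
def classify_hand(points, spread_threshold=0.04):
    """Classify a hand as 'open' or 'closed' from segmented hand points.

    points: list of (x, y) positions in meters for pixels labeled as hand.
    Heuristic: mean distance of the points from their centroid; an open
    hand's fingers push this spread above the (made-up) threshold.
    """
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    spread = sum(((p[0] - cx) ** 2 + (p[1] - cy) ** 2) ** 0.5
                 for p in points) / n
    return "open" if spread > spread_threshold else "closed"

# Synthetic data: a tight cluster (fist) vs. the same cluster plus
# far-flung "fingertip" points (open hand).
fist = [(0.00, 0.00), (0.01, 0.00), (0.00, 0.01), (0.01, 0.01)]
open_hand = fist + [(0.09, 0.00), (0.00, 0.09), (-0.09, 0.00), (0.00, -0.09)]
```

In a real app the threshold would need tuning per user and per distance from the sensor, since hand size in the image shrinks with depth.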
This software uses gestures to control audio visualization, combining depth data detected in the scene with the hand's distance from the Kinect. Parameters such as the position and speed of the participant's hands are used to create an interactive audiovisual effect.
Having seen all these apps, one closing remark: Kinect applications depend on your imagination!