iOS: converting the video data captured by the camera into JPEG format


You can use UIImagePickerController to take a photo or record video with the camera. However, UIImagePickerController displays its own interface, and sometimes we do not want to show that interface. In that case, you can use another approach to obtain the data captured by the camera.

First, you need to import a header: #import <AVFoundation/AVFoundation.h>. Next, your class needs to adopt the AVCaptureVideoDataOutputSampleBufferDelegate protocol. You only need to implement one method of the protocol to get the data captured by the camera:
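As a sketch, the conformance might be declared like this (the class name CameraViewController and the session ivar are illustrative, not from the original article; the two method names are the ones used below):

```objectivec
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

// Hypothetical view controller that adopts the sample buffer delegate protocol.
@interface CameraViewController : UIViewController <AVCaptureVideoDataOutputSampleBufferDelegate>
{
    AVCaptureSession *session;   // keep a reference if you want to stop capturing later
}
- (void)setupCaptureSession;
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer;
@end
```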

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Create a UIImage from the sample buffer data
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    // mData is an NSData object; 0.5 is the quality of the generated JPEG
    NSData *mData = UIImageJPEGRepresentation(image, 0.5);
}

The following is the imageFromSampleBuffer: method. Through a series of conversions, it turns the CMSampleBufferRef into a UIImage object and returns it:

// Create a UIImage from sample buffer data
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
        bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // Free up the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    //UIImage *image = [UIImage imageWithCGImage:quartzImage];
    UIImage *image = [UIImage imageWithCGImage:quartzImage scale:1.0f orientation:UIImageOrientationRight];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return image;
}

But to make the camera actually start working, some setup is required:

// Create and configure a capture session and start it running
- (void)setupCaptureSession
{
    NSError *error = nil;

    // Create the session
    AVCaptureSession *session = [[[AVCaptureSession alloc] init] autorelease];

    // Configure the session to produce lower resolution video frames, if your
    // processing algorithm can handle it. We'll specify medium quality for the
    // chosen device.
    session.sessionPreset = AVCaptureSessionPresetMedium;

    // Find a suitable AVCaptureDevice. The rear camera is used by default;
    // you can change it to the front camera.
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    // Create a device input with the device and add it to the session
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        // Handle the error appropriately.
    }
    [session addInput:input];

    // Create a VideoDataOutput and add it to the session
    AVCaptureVideoDataOutput *output = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
    [session addOutput:output];

    // Configure your output: deliver sample buffers to the delegate on a serial queue
    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    // Specify the pixel format and frame size
    output.videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], (id)kCVPixelBufferPixelFormatTypeKey,
        [NSNumber numberWithInt:320], (id)kCVPixelBufferWidthKey,
        [NSNumber numberWithInt:240], (id)kCVPixelBufferHeightKey,
        nil];

    // Add a preview layer so you can see what the camera captures
    AVCaptureVideoPreviewLayer *preLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
    preLayer.frame = CGRectMake(0, 0, 320, 240);
    preLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer:preLayer];

    // If you wish to cap the frame rate to a known value, such as 15 fps, set
    // minFrameDuration.
    output.minFrameDuration = CMTimeMake(1, 15);

    // Start the session running to start the flow of data
    [session startRunning];

    // Assign session to an ivar.
    // [self setSession:session];
}

preLayer is the layer that previews the video. If you want a preview, set its position and size through preLayer.frame. Pay particular attention to output.videoSettings: it configures the output data, such as the width, height, and pixel format of the frames. You can call the setupCaptureSession method from your controller's initialization so that the camera starts working. Stopping the session is not covered here; check the documentation.
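As a minimal sketch of stopping the session, assuming you kept the session in an ivar named session (an assumption; the code above creates it as a local variable), you could do something like:

```objectivec
// Sketch: stop and release the capture session (manual reference counting,
// matching the style of the code above).
- (void)teardownCaptureSession
{
    if ([session isRunning]) {
        [session stopRunning];   // stops the flow of data from inputs to outputs
    }
    [session release];
    session = nil;
}
```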

Note that this can only be debugged on a real device; the camera is not available in the simulator.
