iOS Video Development Experience


Beyond portability, I think the biggest advantage a phone has over a PC is how quickly and easily it lets you create multimedia: photo sharing, voice input, video recording, geolocation. A successful mobile app builds its product around one or more of these, Instagram being the obvious example; simply copying the Web 2.0 interactive experience onto the phone is a dead end. When the smartphone meets video, it is like Pan Jinlian meeting Ximen Qing: they hit it off at once, and it would be hard for nothing to come of it. Their offspring is the micro-video, which pushes two things to the extreme: the phone's video recording capability and the user's fragmented time. If video apps today are neither hot nor cold, I blame the carriers; mobile data is still slow, but that should not stop us from building up the technology first.

This article mainly introduces my experience in iOS video development.

Video essence:

Pure video (without audio) is essentially a sequence of frame pictures that get video-encoded into a video file; that file, packaged together with audio files and subtitle files, becomes the video (movie) files we actually watch. The number of pictures shown in one second is the frame rate; the smaller the interval between pictures, the smoother the motion, so a higher frame rate looks better but requires more storage space.

Video encoding:

The amount of unencoded video data is huge and makes storage and transmission difficult, so video must be encoded after recording finishes. As a rough illustration, one second of raw 1080p video at 30 fps is about 1920 × 1080 pixels × 3 bytes × 30 frames ≈ 187 MB. Video encoding mainly compresses the data along two dimensions.

    • 1. Within a single image, adjacent pixels in a region are similar. For a patch of solid red, it is enough to record the red color value and the extent of the region instead of every pixel inside it.
    • 2. Adjacent images are similar in content. Because two neighboring frames have to create the illusion of continuous motion, their content is usually very close. Mainstream video coding therefore encodes the first frame with an image-coding method and then describes, in a compact way, how each following frame differs from its neighbor.
Video format:

MP4, MOV, AVI and RMVB are really container (packaging) formats. Apart from RMVB, which is a special case, the video inside these containers is usually encoded with H.264. H.264 achieves a high compression ratio, and its compression efficiency is higher than MPEG-2's, but there is no free lunch: H.264 is about three times harder to decode.

Video bitrate:

The bitrate is defined as the size of the video file divided by the length of the video. For example, a 60-second clip stored in a 30 MB file has a bitrate of about 30 MB × 8 / 60 s = 4 Mbps.

The relationship between bitrate, resolution and video quality:

    • The bitrate can be thought of as a sampling rate: the more data sampled per unit of time, the higher the precision and the larger the file.
    • When the video is not encoded, the higher the resolution, the clearer the image detail.
    • But once the video is encoded and limited to a certain bitrate, the encoder has to discard some of the detail.
    • So resolution and bitrate are both related to sharpness.
Soft decoding and hard decoding:

Decoding H.264 video puts a heavy load on the CPU, so phone engineers handed this work to the GPU, which is better suited to simple but highly data-parallel processing.

    • Decoding on the GPU is called hard decoding.
    • Decoding on the CPU is called soft decoding.
    • The player classes provided by iOS use hard decoding, so video playback puts little pressure on the CPU, but the supported playback formats are relatively limited, generally just MP4, MOV and M4V.
HTTP Live Streaming (HLS):

HTTP Live Streaming (abbreviated HLS) is an HTTP-based streaming protocol proposed by Apple. It works by splitting the whole stream into a series of small HTTP-downloadable files, which are fetched one at a time.
While the media stream plays, the client can choose to download the same content at different rates from a list of alternative sources, so the streaming session can adapt to the available bandwidth. The supported video encoding is H.264. The m3u8 files we see on video sites are playlists for video delivered over HLS.
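For illustration, here is a hypothetical master playlist (the file names, bandwidths and resolutions are made up) showing how one video is offered at several rates, each of which is itself a playlist of short segments:

    #EXTM3U
    #EXT-X-STREAM-INF:BANDWIDTH=500000,RESOLUTION=480x270
    low/index.m3u8
    #EXT-X-STREAM-INF:BANDWIDTH=1500000,RESOLUTION=1280x720
    high/index.m3u8

Each variant (low/index.m3u8, high/index.m3u8) is in turn a plain list of short .ts segments tagged with #EXTINF, which is what the player actually downloads piece by piece.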

HLS benefits:
    • 1. The client only caches a short segment ahead, so a user who watches only part of a video does not end up downloading the whole file, which reduces server pressure and saves traffic.
    • 2. The stream can switch between different rates according to the user's network speed, balancing smoothness and clarity.
HLS support:
    • iOS 3.0 and later
    • Android 3.0 and later
    • HTML5 browsers
Selection of the playback format on each terminal:
    • Android only supports HLS from 3.0 on, so Android 2.3 can only use MP4.
    • Android 3.0 and later support HLS, so m3u8 or MP4 can be used.
    • iOS supports HLS but not Flash, so m3u8 or MP4 can be used.
    • Browsers that support HTML5 can use m3u8.
    • Browsers that do not support HTML5 can only play SWF through Flash.

For these reasons, there is currently no single playback address that works across all platforms.

iOS video playback:

iOS provides the MPMoviePlayerController class for playback; it supports both streaming and local file playback. The video is rendered into its view, which you can place wherever you want, so it is quite easy to use. One questionable design decision in this class is that playback state and load state changes are reported through notifications rather than through a block or a delegate.
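A minimal sketch of playing a remote video and observing its state; the URL is a made-up placeholder, and self is assumed to be a view controller that keeps a strong reference to the player and implements the two selectors:

    #import <MediaPlayer/MediaPlayer.h>

    // Create the player with a streaming (m3u8) or file (MP4) URL and show its view.
    NSURL *url = [NSURL URLWithString:@"http://example.com/demo.m3u8"]; // placeholder URL
    MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:url];
    self.player = player;                      // keep a strong reference so it is not deallocated
    player.view.frame = self.view.bounds;
    [self.view addSubview:player.view];

    // State changes arrive as notifications, not as block or delegate callbacks.
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(playbackStateChanged:)
                                                 name:MPMoviePlayerPlaybackStateDidChangeNotification
                                               object:player];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(loadStateChanged:)
                                                 name:MPMoviePlayerLoadStateDidChangeNotification
                                               object:player];

    [player prepareToPlay];
    [player play];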

iOS video recording:

As with taking photos, there are two ways to record video:

    • 1. UIImagePickerController
    • 2. AVFoundation

This article only covers the AVFoundation route. AVFoundation is the low-level multimedia framework provided by Apple; audio and video capture, audio and video decoding, video editing, and so on all basically rely on the AVFoundation framework.
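For completeness, a minimal sketch of route 1, UIImagePickerController, which simply wraps the system camera UI; it assumes self adopts UIImagePickerControllerDelegate and UINavigationControllerDelegate and reads the recorded file URL in the delegate callback:

    #import <MobileCoreServices/MobileCoreServices.h>

    if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
        UIImagePickerController *picker = [[UIImagePickerController alloc] init];
        picker.sourceType = UIImagePickerControllerSourceTypeCamera;
        picker.mediaTypes = @[(NSString *)kUTTypeMovie];               // record video instead of photos
        picker.videoQuality = UIImagePickerControllerQualityTypeMedium;
        picker.delegate = self;  // delegate receives the movie URL in -imagePickerController:didFinishPickingMediaWithInfo:
        [self presentViewController:picker animated:YES completion:nil];
    }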

Video recording requires much the same setup as taking photos, mainly the following five steps (a minimal sketch follows the list):

    • 1. Create the session, AVCaptureSession, which controls the flow of data from the inputs to the outputs.
    • 2. Get the capture devices, AVCaptureDevice: the camera for video capture and the microphone for audio capture.
    • 3. Create the inputs, AVCaptureDeviceInput, bind each device to an input port and add it to the session.
    • 4. Create the outputs, AVCaptureOutput, which can write to a file or to the screen: AVCaptureMovieFileOutput writes a movie file, AVCaptureVideoDataOutput delivers video frames so the video can be displayed or processed, and AVCaptureAudioDataOutput delivers the audio data being recorded.
    • 5. Combine the audio and the video into a single file.
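A minimal sketch of these steps using AVCaptureMovieFileOutput, which handles step 5 (muxing audio and video into one file) by itself; error handling is omitted, and the preset and output path are assumptions:

    #import <AVFoundation/AVFoundation.h>

    // 1. Session: controls the flow from inputs to outputs.
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetHigh;                // assumed preset

    // 2. Devices: camera for video, microphone for audio.
    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDevice *mic    = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];

    // 3. Inputs: wrap each device and add it to the session.
    AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
    AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:mic error:nil];
    if ([session canAddInput:videoInput]) [session addInput:videoInput];
    if ([session canAddInput:audioInput]) [session addInput:audioInput];

    // 4. Output: a movie-file output muxes audio and video into one movie file (step 5).
    AVCaptureMovieFileOutput *movieOutput = [[AVCaptureMovieFileOutput alloc] init];
    if ([session canAddOutput:movieOutput]) [session addOutput:movieOutput];

    // A preview layer shows the camera picture on screen while recording.
    AVCaptureVideoPreviewLayer *preview = [AVCaptureVideoPreviewLayer layerWithSession:session];
    preview.frame = self.view.bounds;
    [self.view.layer addSublayer:preview];

    [session startRunning];
    NSURL *outputURL = [NSURL fileURLWithPath:
        [NSTemporaryDirectory() stringByAppendingPathComponent:@"movie.mov"]];  // assumed path
    [movieOutput startRecordingToOutputFileURL:outputURL
                             recordingDelegate:self];  // self implements AVCaptureFileOutputRecordingDelegate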
iOS real-time video processing:

If you need to process the video in real time (otherwise you would not see the processed result while you are recording), you have to work directly with the video stream in the camera buffer.

      • 1. Define a video data output (AVCaptureVideoDataOutput) and add it to the session.
      • 2. Set the receiving controller as the delegate for the video data output's sample buffers.
      • 3. Implement the delegate method
        - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
        AVFoundation calls this method whenever the data buffer has new data. In this delegate method we can get the video frame, process it, and display it; real-time filters are applied here. In this method the video data in the buffer (that is, the frame picture) is output to a layer for display.
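A minimal sketch of that delegate method, assuming a Core Image filter stands in for the 'processing' step and that self has a reusable CIContext property (ciContext) and a CALayer used for display (previewLayer); both properties and the filter choice are assumptions, not part of AVFoundation:

    // Method of the AVCaptureVideoDataOutputSampleBufferDelegate protocol.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        // The sample buffer wraps one video frame as a pixel buffer.
        CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

        // Wrap the frame in a CIImage; the real-time filter is applied here.
        CIImage *frame = [CIImage imageWithCVPixelBuffer:pixelBuffer];
        CIImage *filtered = [frame imageByApplyingFilter:@"CISepiaTone"
                                     withInputParameters:@{kCIInputIntensityKey : @0.8}]; // example filter

        // Render the processed frame and hand it to the main thread for display in a layer
        // (this delegate method is called on the queue given to setSampleBufferDelegate:queue:).
        CGImageRef cgImage = [self.ciContext createCGImage:filtered fromRect:filtered.extent];
        dispatch_async(dispatch_get_main_queue(), ^{
            self.previewLayer.contents = (__bridge id)cgImage;   // the layer retains the image
            CGImageRelease(cgImage);
        });
    }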
