On iOS video development


I have spent some time recently getting to grips with video development. Here I share my own learning steps and materials, in the hope that they help friends who are interested in video.

I. The players that ship with iOS

To understand iOS video development, we start with the players that ship with the system. First, we can play a video directly and see the result; if we spent ages without getting any video to play, everyone would lose interest. Second, for many requirements the system players are perfectly adequate. A brief introduction follows.

1.MPMoviePlayerController

Playing video in iOS can be done with the MPMoviePlayerController class, which provides the usual player controls such as play, pause and stop. But MPMoviePlayerController is not itself a view controller: if you want to show the video in your UI, you have to add its view property to your interface yourself.
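
As a rough sketch of the above (assuming self is the hosting view controller and url is an NSURL for a playable movie; note that MPMoviePlayerController has been deprecated since iOS 9):

#import <MediaPlayer/MediaPlayer.h>

// Minimal sketch: keep a strong reference (e.g. a property) to the player,
// otherwise playback stops when it is deallocated.
MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:url];
player.view.frame = self.view.bounds;
[self.view addSubview:player.view];   // not a view controller, so its view must be added manually
[player prepareToPlay];
[player play];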

2.MPMoviePlayerViewController

MPMoviePlayerViewController inherits from UIViewController. By default it displays full screen, starts playing automatically when presented, and when shown as a modal window it dismisses itself automatically when the "Done" button is tapped, and so on.
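
For example (a sketch, again assuming url is an NSURL; presentMoviePlayerViewControllerAnimated: is the MediaPlayer category on UIViewController):

MPMoviePlayerViewController *playerVC = [[MPMoviePlayerViewController alloc] initWithContentURL:url];
[self presentMoviePlayerViewControllerAnimated:playerVC];   // full-screen modal; "Done" dismisses it automatically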

3.AVPlayer

MPMoviePlayerController is powerful, but also rigid. If you need to customize the look of the player, MPMoviePlayerController is not a good fit; you have to use AVPlayer.

AVPlayer itself does not display video, and unlike MPMoviePlayerController it has no view property. To show video with AVPlayer, you must create a player layer, AVPlayerLayer, for presentation; this player layer inherits from CALayer, and you add the AVPlayerLayer as a sublayer of the controller view's layer.
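
A minimal sketch of this arrangement (assuming url is an NSURL and self.view belongs to the hosting controller):

#import <AVFoundation/AVFoundation.h>

AVPlayer *player = [AVPlayer playerWithURL:url];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
playerLayer.frame = self.view.bounds;        // AVPlayerLayer is a CALayer, so it is sized like any other layer
[self.view.layer addSublayer:playerLayer];   // AVPlayer has no view of its own
[player play];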

4.AVFoundation

To go deeper into audio and video playback, you need to study the AVFoundation framework in depth.

However, whether you use MPMoviePlayerController or AVPlayer, the supported video formats are limited: H.264 and MPEG-4 video, with extensions (container formats) .mp4, .mov, .m4v, .m2v, .3gp, .3g2, etc.

II. Using the third-party kxmovie

1. Configuring kxmovie

git clone https://github.com/kolyvan/kxmovie.git

cd kxmovie

git submodule update --init

sudo rake    (an error may occur here; see problem a below)

2. Problems encountered and solutions:

a. sudo rake aborts with an error

In the kxmovie directory, run vim Rakefile.

Find the two lines that set SDK_VERSION and XCODE_PATH and change them to the following:

SDK_VERSION = '9.2'

XCODE_PATH = '/Applications/Xcode.app/Contents/Developer/Platforms'

Explanation: the 9.2 in SDK_VERSION = '9.2' should be your currently installed SDK version. You can run

cd /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/

to check which SDK versions are installed, and change the value to the matching version.

b. Undefined symbols for architecture x86_64

kxmovie does not appear to support the 64-bit simulators, so it cannot run on the iPhone 5s simulator or later; it does run on the iPhone 5 simulator.

III. Basic video knowledge

1. Video player principles

    • Data downloaded over a streaming protocol such as RTSP+RTP, HTTP or MMS is first de-protocoled to obtain data in an encapsulation format. What is encapsulation-format data? Formats such as AVI, MP4, FLV and so on.
    • The encapsulated data is demultiplexed to separate the video stream, audio stream and subtitle stream, ready for the next step of processing.
    • After separation we have encoded audio and video streams (raw audio and video are too large to transmit, so they are compressed, i.e. encoded); common examples are an H.264-encoded video stream and an AAC-encoded audio stream. Decoding turns the compressed video data back into uncompressed color data such as YUV420P or RGB, and the compressed audio data back into uncompressed audio samples such as PCM data.
    • Audio-video synchronization: the video stream, audio stream, subtitle stream and so on are played back in sync.

2. Streaming Media Transfer Protocol

Video on demand generally uses HTTP, while live streaming mostly still uses RTMP or a private protocol, because the latency is lower; RTMP itself was designed for live streaming.

    • RSVP: Resource Reservation Protocol
    • RTP: Real-time Transport Protocol
    • RTCP: Real-time Transport Control Protocol
    • MMS: Microsoft Media Server (streaming) protocol
    • RTSP: Real-Time Streaming Protocol
    • MIME: Multipurpose Internet Mail Extensions
    • RTMP (RTMPE/RTMPS/RTMPT): Adobe's Real-Time Messaging Protocol family
    • RTMFP: Adobe's Real-Time Media Flow Protocol (a peer-to-peer protocol)
    • HLS (HTTP Live Streaming)

Introduction to streaming media protocols (RTP/RTCP/RTSP/RTMP/MMS/HLS): http://blog.csdn.net/tttyd/article/details/12032357

Differences between the video transport protocols RTP/RTCP/RTSP/HTTP: http://blog.csdn.net/yangxt/article/details/7467457

3. Encapsulation (container) formats

The main function of an encapsulation format (also known as a container) is to store the video stream and audio stream together in a single file according to a certain layout.

Common formats

AVI: a container standard created by Microsoft in the early 1990s to compete with the QuickTime format (MOV); it only supports audio with fixed CBR (constant bit rate) encoding.

FLV: a container designed around the H.263 codec family.

MKV: a universal container with good cross-platform compatibility, error correction, and support for external subtitles.

MOV: the QuickTime container format.

MP4: mainly used to encapsulate MPEG-4 video.

RM/RMVB: RealVideo, developed by RealNetworks, packaged in RM and RMVB files.

TS/PS: the PS container is used only for HD DVD originals.

WMV: launched by Microsoft to compete in this market.

4. Coding standards

The main function of video encoding is to compress raw pixel data (RGB, YUV, etc.) into a video bitstream, thereby reducing the amount of video data. Uncompressed video is usually enormous; a single movie could take hundreds of GB of space.
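
For example, a 90-minute movie at 1920 × 1080, 25 frames per second and 3 bytes per pixel (RGB24) works out to roughly 1920 × 1080 × 3 B × 25 fps × 5400 s ≈ 840 GB before compression, which is why encoding is indispensable.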

Summary and comparison of video coding standards: http://blog.csdn.net/leixiaohua1020/article/details/12031631

How to learn audio/video codec technology from scratch: http://blog.csdn.net/leixiaohua1020/article/details/18893769

5. Playback mode

Live video: the video source is watched in real time, and operations like fast-forwarding are not possible. The emphasis is on real-time delivery, so the requirements on network latency are relatively high; it is the video equivalent of broadcast.

Video on demand: a previously recorded video source is replayed, and fast-forward and rewind operations are possible.

6.FFmpeg

Official examples: http://ffmpeg.org/doxygen/2.8/examples.html

Blog: http://blog.csdn.net/leixiaohua1020/article/details/44084321

Basic concepts of FFmpeg: http://blog.csdn.net/beitiandijun/article/details/8280448

The multimedia processing tool FFmpeg is very powerful; its functions include video capture, video format conversion, grabbing still images from video, watermarking video, and so on.

Basic concepts of FFmpeg:

Container: the file format. In FFmpeg, the abstraction for a container/file format is AVFormatContext.

Stream: the multimedia data stream that we actually see. It contains several elementary streams: the video stream, the audio stream and the subtitle stream. As I understand it, FFmpeg's abstraction for a data stream is AVStream.

Demultiplexer (demuxer): FFmpeg treats the multimedia file being processed as a multimedia data stream. The stream is first put into a container (AVFormatContext), and the data is then passed through the demuxer, whose abstraction in FFmpeg is AVInputFormat. I like to call the demuxer a splitter, because it recognizes the interleaved elementary streams and separates them; the separated video, audio and subtitle streams are each sent on to their own decoder.

Packet: before the separated streams are sent to the decoders, the data is first placed in a buffer, along with some auxiliary information such as timestamps to help later processing; that buffer is the packet. Because the data streams are interleaved along the timeline, the video, audio and subtitles are all split into chunks of data, and after parsing, these chunks are stored in their respective packets. To put it simply: for video, one packet holds exactly one video frame; for audio, if the sample rate is fixed, one packet can hold several audio frames, and if the sample rate is variable, one packet holds only one audio frame.
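
A small sketch of how these abstractions fit together (FFmpeg 2.x C API; the function and file name are only illustrative):

#include <stdio.h>
#include <libavformat/avformat.h>

static void inspect_file(const char *path)
{
    av_register_all();                                     // register all demuxers and decoders
    AVFormatContext *fmtCtx = NULL;                        // the container abstraction
    if (avformat_open_input(&fmtCtx, path, NULL, NULL) < 0)   // the matching demuxer (AVInputFormat) is chosen here
        return;
    avformat_find_stream_info(fmtCtx, NULL);

    for (unsigned i = 0; i < fmtCtx->nb_streams; i++) {
        // each AVStream is one elementary stream; codec_type is
        // AVMEDIA_TYPE_VIDEO, AVMEDIA_TYPE_AUDIO or AVMEDIA_TYPE_SUBTITLE
        printf("stream %u: type %d\n", i, fmtCtx->streams[i]->codec->codec_type);
    }

    AVPacket pkt;
    while (av_read_frame(fmtCtx, &pkt) >= 0) {             // each AVPacket is one chunk of one stream
        // pkt.stream_index says which AVStream it belongs to; pkt.pts is its timestamp
        av_free_packet(&pkt);                              // FFmpeg 2.x API (later versions: av_packet_unref)
    }
    avformat_close_input(&fmtCtx);
}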

IV. A brief analysis of the kxmovie source code

The overall idea: KxMovieDecoder uses FFmpeg to decode a video file or network address into YUV or RGB data (picture data). KxMovieGLView then renders the YUV or RGB data. KxAudioManager handles playback management such as play and pause, and KxMovieViewController uses the APIs above to build the player interface.

1. The KxMovieDecoder file

KxMovieDecoder provides the decoding API: it decodes the video into YUV or RGB data.

Let's start from the public API and analyze it. The following analysis covers only the video-related operations.

a. Open the file and perform the following steps (a combined sketch follows the list below)

+ (id) movieDecoderWithContentPath:(NSString *)path error:(NSError **)perror

  1. To open a network stream, call avformat_network_init() first.
  2. AVFormatContext: the top-level structure that ties everything together, mainly used to handle the encapsulation format (FLV/MKV/RMVB, etc.). It is created with avformat_alloc_context().
  3. Open the input stream. The four parameters are: ps, the address of the AVFormatContext pointer; filename, the name of the input file; fmt, which forces a specific input format if it is not NULL, otherwise the format is detected automatically; and options. int avformat_open_input(AVFormatContext **ps, const char *filename, AVInputFormat *fmt, AVDictionary **options);
  4. Read packets to obtain information about the streams in the media file. Each AVStream stores the data of one video/audio stream; each AVStream corresponds to an AVCodecContext, which stores information about how that stream is decoded. int avformat_find_stream_info(AVFormatContext *ic, AVDictionary **options);
  5. Find the right decoder:
  6. AVCodecContext *codecCtx = _formatCtx->streams[videoStream]->codec; AVCodec *codec = avcodec_find_decoder(codecCtx->codec_id);
  7. Initialize the AVCodecContext to use the given AVCodec. Returns zero on success, a negative value on error: avcodec_open2(codecCtx, codec, NULL);
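
Put together, the opening sequence might look roughly like this (a sketch only, FFmpeg 2.x API, error handling omitted; variable names such as _formatCtx and videoStream follow the article):

avformat_network_init();                                   // step 1: required for network streams
AVFormatContext *_formatCtx = avformat_alloc_context();    // step 2: the container abstraction

avformat_open_input(&_formatCtx, [path UTF8String], NULL, NULL);   // step 3: format auto-detected (fmt is NULL)
avformat_find_stream_info(_formatCtx, NULL);                        // step 4: discover the streams and their codecs

// steps 5-6: locate the video stream and find its decoder
NSInteger videoStream = -1;
for (NSInteger i = 0; i < _formatCtx->nb_streams; i++)
    if (_formatCtx->streams[i]->codec->codec_type == AVMEDIA_TYPE_VIDEO)
        videoStream = i;
AVCodecContext *codecCtx = _formatCtx->streams[videoStream]->codec;
AVCodec *codec = avcodec_find_decoder(codecCtx->codec_id);

avcodec_open2(codecCtx, codec, NULL);                      // step 7: attach the decoder to the codec context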

b. - (BOOL) openFile:(NSString *)path error:(NSError **)perror;

Compared with this method, method a additionally performs the initialization KxMovieDecoder *mp = [[KxMovieDecoder alloc] init];
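
In other words, a caller can use either form; a hedged usage sketch (selector casing as restored above):

NSError *error = nil;
KxMovieDecoder *decoder = [KxMovieDecoder movieDecoderWithContentPath:path error:&error];   // allocates and opens in one step

// or, equivalently:
KxMovieDecoder *mp = [[KxMovieDecoder alloc] init];
BOOL ok = [mp openFile:path error:&error];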

c. - (void) closeFile;

Ends decoding and releases the FFmpeg objects:

av_frame_free(&pFrame);

avcodec_close(pCodecCtx);

avformat_close_input(&pFormatCtx);

d. - (BOOL) setupVideoFrameFormat:(KxVideoFrameFormat)format;

Sets the frame format enumeration to KxVideoFrameFormatRGB or KxVideoFrameFormatYUV.

e. - (NSArray *) decodeFrames:(CGFloat)minDuration;

Reads frames through the AVFormatContext object. The steps of method a must have been performed first to pave the way. (A sketch of the loop follows the list below.)

  1. Read the next AVPacket from the AVFormatContext: int av_read_frame(AVFormatContext *s, AVPacket *pkt)
  2. Decode, converting the AVPacket *avpkt into an AVFrame *picture: int avcodec_decode_video2(AVCodecContext *avctx, AVFrame *picture, int *got_picture_ptr, const AVPacket *avpkt);
  3. Deinterlace the decoded picture if needed (a deprecated API): attribute_deprecated int avpicture_deinterlace(AVPicture *dst, const AVPicture *src, enum AVPixelFormat pix_fmt, int width, int height)
  4. Return an array of frames.
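
A rough sketch of that read/decode loop (FFmpeg 2.x API; only video packets are handled, and error handling, timing and the minDuration check are omitted):

AVPacket packet;
AVFrame *frame = av_frame_alloc();
int gotPicture = 0;

while (av_read_frame(_formatCtx, &packet) >= 0) {          // step 1: next packet from the container
    if (packet.stream_index == videoStream) {
        // step 2: decode the compressed packet into an uncompressed AVFrame
        avcodec_decode_video2(codecCtx, frame, &gotPicture, &packet);
        if (gotPicture) {
            // frame->data / frame->linesize now hold the YUV planes;
            // kxmovie wraps them in its own frame objects and returns them as the array in step 4
        }
    }
    av_free_packet(&packet);                               // FFmpeg 2.x (later versions: av_packet_unref)
}
av_frame_free(&frame);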

2.KxAudioManager

Playback management, such as play and pause.

3.KxMovieGLView

KxMovieDecoder provides the decoding API that turns the video into YUV or RGB data; KxMovieGLView uses OpenGL ES (a drawing technology) to present the YUV data.

4.KxMovieViewController

Builds the player interface using the APIs above.

V. Summary

My learning steps:

1. First, learn to play video with the system players.

2. Then learn to use the third-party kxmovie.

Once you have learned these two, you can cope with basic video development.

3. Study the AVFoundation framework in depth. I bought the book "AV Foundation Development Secrets: Practicing and Mastering Audio-Visual Processing for iOS & OS X Applications", but I have not read it yet.

4. Going further requires in-depth study of the FFmpeg framework. Of course, you also need to learn the fundamentals of audio and video development, such as RGB and YUV pixel data processing, PCM audio sample processing, video bitstream analysis, and so on. There is a lot.

VI. Resource summary: my own collection of materials for studying the subject in depth

HTTP Live Streaming (iOS live streaming) technical analysis and implementation: http://www.cnblogs.com/haibindev/archive/2013/01/30/2880764.html

HTTP Live Streaming official documentation: https://developer.apple.com/streaming/

FFmpeg in-depth analysis from zero: http://blog.chinaunix.net/uid-26611383-id-3976154.html

A university thesis, long but helpful for beginners to understand what iOS streaming involves: http://www.doc88.com/p-7098896030363.html

Introduction to streaming media protocols (RTP/RTCP/RTSP/RTMP/MMS/HLS): http://blog.csdn.net/tttyd/article/details/12032357

Differences between the video transport protocols RTP/RTCP/RTSP/HTTP: http://blog.csdn.net/yangxt/article/details/7467457

FFmpeg framework explained: http://blog.csdn.net/allen_young_yang/article/details/6576303

Streaming media blog: http://blog.csdn.net/leixiaohua1020/article/details/15811977

Basic concepts of FFmpeg: http://blog.csdn.net/beitiandijun/article/details/8280448

Summary and comparison of video coding standards: http://blog.csdn.net/leixiaohua1020/article/details/12031631

How to learn audio/video codec technology from scratch: http://blog.csdn.net/leixiaohua1020/article/details/18893769

Book: "AV Foundation Development Secrets: Practicing and Mastering Audio-Visual Processing for iOS & OS X Applications"

VII. My level is limited and I am only just starting to learn, so there are bound to be many mistakes; I hope you will point them out. Thank you.
