Recently, a project required compiling and using FFmpeg. The FFmpeg libraries I had compiled earlier were old and did not support newer devices (the iPhone 5s and later) well, so I recompiled the FFmpeg static libraries.
1. Download the build script and compile FFmpeg in Terminal
The script is on GitHub: https://github.com/kewlbear/FFmpeg-iOS-build-script
In Terminal, cd into the folder you just downloaded and run sh build-ffmpeg.sh to compile automatically. If yasm is missing, install it following the prompt.
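The terminal steps above can be sketched as the following commands (the git clone and the Homebrew install of yasm are assumptions — obtain the script and yasm however you prefer):

```shell
# Get the build script (assumes git is installed)
git clone https://github.com/kewlbear/FFmpeg-iOS-build-script.git
cd FFmpeg-iOS-build-script

# Install yasm if the script reports it is missing (assumes Homebrew)
brew install yasm

# Run the build: it downloads the FFmpeg source, compiles static
# libraries for each iOS architecture, and combines them with lipo
sh build-ffmpeg.sh
```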
This compiles the FFmpeg 2.5.3 version, under Xcode 6 and iOS 8.1.
After the script finishes, the static-library directory looks as follows:
The .a files are the static libraries, and the header files are in the include folder.
2. Drag the compiled FFmpeg files into the project and set the appropriate paths
Create a new project and drag the compiled include and lib folders into it.
I first copy the FFmpeg-iOS folder into the project directory and rename it FFmpegnew; the path is as follows:
You must also modify the project's Header Search Paths, otherwise the build fails with:
#include "libavformat/avformat.h" file not found
Take the lib path already set in Library Search Paths, copy it into Header Search Paths, and change lib to include. The result looks like this:
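For example, if the renamed folder sits at the project root, the two build settings might read as below (the FFmpegnew name follows the rename above; adjust to your own path):

```
Library Search Paths: $(SRCROOT)/FFmpegnew/lib
Header Search Paths:  $(SRCROOT)/FFmpegnew/include
```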
3. Import other library files
libz.dylib, libbz2.dylib, and libiconv.dylib appear to be required; add others as your configuration needs.
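Instead of adding the .dylibs in the "Link Binary With Libraries" build phase, an equivalent sketch is to list them in Other Linker Flags:

```
Other Linker Flags: -lz -lbz2 -liconv
```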
My personal configuration, for reference:
4. Import third-party code into the project
Depending on your project's requirements, choose a third-party wrapper. Here the options are iFrameExtractor (https://github.com/lajos/iFrameExtractor) or RTSPPlayer (https://github.com/sutankasturi/rtspplayer).
I use the latter in the demo code: simply drag the six files (AudioStreamer, RTSPPlayer, Utilities) into the project and use them.
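The six files referred to are presumably the .h/.m pairs of the three classes (an assumption based on the class names mentioned above):

```
AudioStreamer.h / AudioStreamer.m
RTSPPlayer.h    / RTSPPlayer.m
Utilities.h     / Utilities.m
```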
5. Implement playback; refer to the code in the demo
Here self.playUrl is the address of the video stream. This project uses an RTSP stream, for example:
self.playUrl = @"rtsp://xxx.xxx.xxx.xxx/xxx.sdp";
Code to start playback:
self.videoView = [[RTSPPlayer alloc] initWithVideo:self.playUrl usesTcp:YES];
self.videoView.outputHeight = self.playImage.frame.size.height;
self.videoView.outputWidth = self.playImage.frame.size.width;
__weak TestViewController *weakSelf = self;
dispatch_async(dispatch_get_main_queue(), ^{
    weakSelf.playTimer = [NSTimer scheduledTimerWithTimeInterval:1/30.0
                                                          target:weakSelf
                                                        selector:@selector(displayNextFrame:)
                                                        userInfo:nil
                                                         repeats:YES];
});
- (void)displayNextFrame:(NSTimer *)timer {
    if (![self.videoView stepFrame]) {
        [timer invalidate];
        return;
    }
    if (startFrameCount < /* threshold value missing in the original */) {
        startFrameCount++;
    } else {
        startFrameCount++;
        [self playVideo];
    }
}
- (void)playVideo
{
    NSLog(@"%s, %d", __FUNCTION__, __LINE__);
    // Update the view on the main thread.
    // The video source size is 352*288.
    __weak TestViewController *weakSelf = self;
    dispatch_async(dispatch_get_main_queue(), ^{
        weakSelf.playImage.image = weakSelf.videoView.currentImage;
        NSLog(@"%d, %d", weakSelf.videoView.sourceWidth, weakSelf.videoView.sourceHeight);
    });
}
Excerpted from: iOS 8.1 — Compiling FFmpeg and Integrating Third-Party Code for Live Streaming (Surveillance), Part 3