iOS Study Notes 26 - Video Playback


I. Video

You can use three frameworks to play videos on iOS:
1. The MediaPlayer framework: MPMoviePlayerController and MPMoviePlayerViewController
2. The AVFoundation framework: AVPlayer
3. The AVKit framework: AVPlayerViewController (available only on iOS 8 and later)

However, in the past two years the MediaPlayer framework has been marked deprecated, which means Apple no longer maintains it. It is also highly integrated and nowhere near as flexible as AVFoundation. AVFoundation's AVPlayer is what actually plays the video; AVPlayerViewController is essentially a wrapper around AVPlayer.

The application layers of these frameworks are as follows:

II. AVPlayer

AVPlayer lives in the AVFoundation framework. It is closer to the bottom layer, so it is extremely flexible.
AVPlayer cannot display video by itself. To show the picture you must create a player layer, AVPlayerLayer, for display; AVPlayerLayer inherits from CALayer.

Steps to play a video with AVPlayer:
1. Create the video resource URL (it can be a network URL).
2. Create the playback content object, AVPlayerItem, from that URL; one video corresponds to one AVPlayerItem.
3. Create the AVPlayer video player object, initializing it with the AVPlayerItem.
4. Create the AVPlayerLayer playback layer object and add it to the view that should display the video.
5. Start playback with the player's play method and pause it with pause.
6. Add a notification-center observer to be told when playback finishes, use KVO to observe changes to the item's properties, and drive the progress bar with the following AVPlayer method:
- (id)addPeriodicTimeObserverForInterval:(CMTime)interval        /* observation interval */
                                   queue:(dispatch_queue_t)queue /* GCD queue the callback runs on */
                              usingBlock:(void (^)(CMTime time))block; /* observation callback */
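This registration returns an opaque observer token. The project below never removes it, but in a real app you would keep the token and pass it to the player's removeTimeObserver: method when you are done. A minimal sketch, assuming a hypothetical timeObserver property of type id on the controller:

- (void)startObservingProgress {
    // Keep the token returned by the registration so it can be removed later
    self.timeObserver = [self.player addPeriodicTimeObserverForInterval:CMTimeMake(1, 1)
                                                                   queue:dispatch_get_main_queue()
                                                              usingBlock:^(CMTime time) {
        // The real project updates the progress bar here
        NSLog(@"current position: %.2f s", CMTimeGetSeconds(time));
    }];
}

- (void)stopObservingProgress {
    // Remove the periodic observer, for example before the player is released
    if (self.timeObserver) {
        [self.player removeTimeObserver:self.timeObserver];
        self.timeObserver = nil;
    }
}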
Test environment setup:
Use the terminal to start the Apache service that ships with macOS (for example, sudo apachectl start) so that the phone can reach local resources over the network.
Copy the MP4 videos into the Apache web resource directory; the default directory is /Library/WebServer/Documents.
Look up the IP address of the local server (for example, with ifconfig).
Do not forget to allow plain HTTP in Info.plist by adding an App Transport Security exception (NSAppTransportSecurity with NSAllowsArbitraryLoads set to YES).
The following is a concrete project. First, the ViewController properties:
# Import "ViewController. h "# import @ interface ViewController () @ property (strong, nonatomic) AVPlayer * player; // Video player @ property (strong, nonatomic) AVPlayerLayer * playerLayer; // video playback layer @ property (strong, nonatomic) IBOutlet UIView * movieView; // playback container view @ property (strong, nonatomic) IBOutlet UIProgressView * progressView; // progress bar @ property (strong, nonatomic) IBOutlet UISegmentedControl * segmentView; // selection bar @ property (strong, nonatomic) NSArray * playerItemArray; // video playback URL list @ end
1. Initialize the AVPlayerItem video content object
/* Obtain the playback content object. One AVPlayerItem corresponds to one video file. */
- (AVPlayerItem *)getPlayItemByNum:(NSInteger)num {
    if (num >= self.playerItemArray.count) {
        return nil;
    }
    // Create the URL (percent-encode the string so URLWithString: does not return nil)
    NSString *urlStr = self.playerItemArray[num];
    urlStr = [urlStr stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
    NSURL *url = [NSURL URLWithString:urlStr];
    // Create the playback content object
    AVPlayerItem *item = [AVPlayerItem playerItemWithURL:url];
    return item;
}
2. Initialize the AVPlayer video player object
/* Initialize the video player */
- (void)initAVPlayer {
    // Obtain the playback content
    AVPlayerItem *item = [self getPlayItemByNum:0];
    // Create the video player
    AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
    self.player = player;
    // Add the playback progress observer
    [self addProgressObserver];
    // Add the KVO observers to the item
    [self addObserverToPlayerItem:item];
    // Add the notification-center observer for playback completion
    [self addNotificationToPlayerItem];
}
3. Initialize the AVPlayerLayer playback layer object
#pragma mark - Initialization
/* Initialize the player layer object */
- (void)initAVPlayerLayer {
    // Create the video player layer object
    AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    layer.frame = self.movieView.bounds;                   // size
    layer.videoGravity = AVLayerVideoGravityResizeAspect;  // video fill mode
    // Add the layer to the container view's layer
    [self.movieView.layer addSublayer:layer];
    self.playerLayer = layer;
    self.movieView.layer.masksToBounds = YES;
}
4. Listen for playback completion with the notification center
#pragma mark - Notification center
- (void)addNotificationToPlayerItem {
    // Listen for the "played to end" notification of the current item
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(playerDidFinished:)
                                                 name:AVPlayerItemDidPlayToEndTimeNotification
                                               object:self.player.currentItem];
}

- (void)removeNotificationFromPlayerItem {
    // Remove the observer from the notification center
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}

/* Called when playback finishes */
- (void)playerDidFinished:(NSNotification *)notification {
    // Automatically play the next video
    NSInteger currentIndex = self.segmentView.selectedSegmentIndex;
    self.segmentView.selectedSegmentIndex = (currentIndex + 1) % self.playerItemArray.count;
    [self segmentValueChange:self.segmentView];
}
5. KVO property observation
#pragma mark - KVO observers
/* Add KVO observers for the playback status and the buffered ranges */
- (void)addObserverToPlayerItem:(AVPlayerItem *)item {
    // Observe the status property
    [item addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nil];
    // Observe the buffered time ranges
    [item addObserver:self forKeyPath:@"loadedTimeRanges" options:NSKeyValueObservingOptionNew context:nil];
}

/* Remove the KVO observers */
- (void)removeObserverFromPlayerItem:(AVPlayerItem *)item {
    [item removeObserver:self forKeyPath:@"status"];
    [item removeObserver:self forKeyPath:@"loadedTimeRanges"];
}

/* KVO callback, invoked when an observed property changes */
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    AVPlayerItem *playerItem = (AVPlayerItem *)object;
    if ([keyPath isEqualToString:@"status"]) {
        // Playback status changed
        AVPlayerStatus status = [[change objectForKey:@"new"] integerValue];
        if (status == AVPlayerStatusReadyToPlay) {
            NSLog(@"Ready to play, total video length %.2f", CMTimeGetSeconds(playerItem.duration));
        }
    } else if ([keyPath isEqualToString:@"loadedTimeRanges"]) {
        // Buffered ranges changed
        NSArray *array = playerItem.loadedTimeRanges;
        CMTimeRange timeRange = [array.firstObject CMTimeRangeValue]; // buffered range
        float startSeconds = CMTimeGetSeconds(timeRange.start);
        float durationSeconds = CMTimeGetSeconds(timeRange.duration);
        NSTimeInterval totalBuffer = startSeconds + durationSeconds;  // total buffered length
        NSLog(@"Total buffered: %.2f", totalBuffer);
    }
}
6. Progress bar monitoring
#pragma mark - Progress observer
- (void)addProgressObserver {
    AVPlayerItem *item = self.player.currentItem;
    UIProgressView *progress = self.progressView;
    // Periodic time observer, fired once per second on the main queue
    [self.player addPeriodicTimeObserverForInterval:CMTimeMake(1, 1) queue:dispatch_get_main_queue() usingBlock:^(CMTime time) {
        // CMTime is a struct describing a media time (value, timescale, etc.)
        // Current playback position in seconds
        float current = CMTimeGetSeconds(time);
        // Total length of the video in seconds
        float total = CMTimeGetSeconds(item.duration);
        if (current) {
            [progress setProgress:(current / total) animated:YES];
        }
    }];
}
7. UI click events and view controller loading
- (void)viewDidLoad {
    [super viewDidLoad];
    // Property initialization
    self.segmentView.selectedSegmentIndex = 0;
    self.progressView.progress = 0;
    self.playerItemArray = @[@"http://192.168.6.147/1.mp4",
                             @"http://192.168.6.147/2.mp4",
                             @"http://192.168.6.147/3.mp4"];
    // Initialize the video player
    [self initAVPlayer];
    // Initialize the player display layer
    [self initAVPlayerLayer];
    // Start playback
    [self.player play];
}

- (void)dealloc {
    // Remove the KVO observers and notifications
    [self removeObserverFromPlayerItem:self.player.currentItem];
    [self removeNotificationFromPlayerItem];
}

#pragma mark - UI click events
/* Play/pause button tapped */
- (IBAction)playMovie:(UIButton *)sender {
    sender.enabled = NO;
    if (self.player.rate == 0) {
        // A rate of 0 means the video is paused
        sender.titleLabel.text = @"pause";
        [self.player play];   // start playing
    } else if (self.player.rate == 1.0) {
        // A rate of 1.0 means the video is playing
        sender.titleLabel.text = @"play";
        [self.player pause];  // pause playback
    }
    sender.enabled = YES;
}

/* A different video was chosen from the segmented control */
- (IBAction)segmentValueChange:(UISegmentedControl *)sender {
    // First remove all observers from the current AVPlayerItem
    [self removeNotificationFromPlayerItem];
    [self removeObserverFromPlayerItem:self.player.currentItem];
    // Obtain the new playback content
    AVPlayerItem *playerItem = [self getPlayItemByNum:sender.selectedSegmentIndex];
    // Add the KVO observers
    [self addObserverToPlayerItem:playerItem];
    // Replace the current item with the new one
    [self.player replaceCurrentItemWithPlayerItem:playerItem];
    // Add the playback-completion observer
    [self addNotificationToPlayerItem];
}

III. AVPlayerViewController

That is what even a simple video player takes, and it is still quite a lot of trouble, with many features left unimplemented.
In fact, since iOS 8.0 Apple has wrapped AVPlayer and the other video playback classes into a player controller class that can be used directly and simply: AVPlayerViewController. In the following section you will see how convenient it is; everything above can be implemented with just the small piece of code below.

Procedure: link the AVKit and AVFoundation frameworks and add the header files:

#import <AVFoundation/AVFoundation.h>
#import <AVKit/AVKit.h>

Then create the URL, create the AVPlayer, and create the AVPlayerViewController.

And that is it: a player with complete functionality.

The following is all the code:
# Import "ViewController. h "# import @ interface ViewController () @ property (strong, nonatomic) AVPlayerViewController * playerVC; @ end @ implementation ViewController-(void) viewDidLoad {[super viewDidLoad]; // create a url nsurl * url = [NSURL URLWithString: @ "http: // 192.168.6.147/1.mp4"]; // directly create AVPlayer, which also creates AVPlayerItem first, this is just a shortcut: AVPlayer * player = [AVPlayer playerWithURL: url]; // create AVPlayerViewController * playerVC = [[AVPlayerViewController alloc] init]; playerVC. player = player; playerVC. view. frame = self. view. frame; [self. view addSubview: playerVC. view]; self. playerVC = playerVC; // call the player attribute of the Controller to start the playback method [self. playerVC. player play];} @ end

This is almost too convenient to believe. It is available from iOS 8 onwards, was introduced precisely to replace MPMoviePlayerViewController in the MediaPlayer framework, and is very easy to customize.

IV. Generating Video Thumbnails

The AVFoundation framework also provides a class, AVAssetImageGenerator, for extracting still images from a video.

Application scenarios: while a video is playing, dragging the progress bar can show a thumbnail so you can see which frame you would seek to; when choosing a video to play, the list can show a thumbnail of each video; and on a playback page you can capture a screenshot of an interesting scene.

To create a thumbnail, first create an AVURLAsset object, which is used to obtain media information (both video and audio). From the AVURLAsset, create an AVAssetImageGenerator object and obtain the image with its copyCGImageAtTime:actualTime:error: method:
- (CGImageRef)copyCGImageAtTime:(CMTime)requestedTime /* the time point in the video for which to generate the image */
                     actualTime:(CMTime *)actualTime  /* the media time at which the image was actually generated */
                          error:(NSError **)outError; /* error information */
The following is the actual code:
/* Obtain a video thumbnail for the given second */
- (UIImage *)getThumbailImageRequestAtTimeSecond:(CGFloat)timeBySecond {
    // URL of the video file
    NSURL *url = [NSURL URLWithString:@"http://192.168.6.147/2.mp4"];
    // Create the media information object
    AVURLAsset *urlAsset = [AVURLAsset assetWithURL:url];
    // Create the video thumbnail generator object
    AVAssetImageGenerator *imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:urlAsset];
    // The time at which to create the thumbnail; the second parameter is the preferred timescale (time units per second)
    CMTime time = CMTimeMakeWithSeconds(timeBySecond, 10);
    CMTime actualTime;
    NSError *error = nil; // error information
    // Generate the thumbnail. Note that a CGImageRef is returned; convert it to UIImage to show it in a UIImageView
    CGImageRef cgImage = [imageGenerator copyCGImageAtTime:time actualTime:&actualTime error:&error];
    if (error) {
        NSLog(@"An error occurred while generating the video thumbnail: %@", error.localizedDescription);
        return nil;
    }
    // Convert the CGImageRef to a UIImage object
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    // Remember to release the CGImageRef
    CGImageRelease(cgImage);
    return image;
}
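A hypothetical call site for the method above, displaying the captured frame in a UIImageView (the thumbnailView outlet is an assumption, not part of the original project):

// Hypothetical usage: grab the frame at the 5-second mark and display it
UIImage *thumbnail = [self getThumbailImageRequestAtTimeSecond:5.0];
if (thumbnail) {
    self.thumbnailView.image = thumbnail; // thumbnailView is an assumed UIImageView outlet
}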
 
