How to organize iOS video playback

Source: Internet
Author: User


Original intention

I have been meaning to put together a whole series of articles on multimedia. The plan is to start from the simplest and most commonly used topic, audio and video playback, and work gradually down to encoding and decoding. This article starts with the simplest case: playing a video.

How many ways are there to play a video on iOS? If all you need is to play a video with no custom UI, the job is trivial. But outside of a demo that situation rarely occurs: real projects need custom playback UIs and have all kinds of differing requirements, so sooner or later you cannot avoid writing a player of your own.

The most original playback

Anyone who has been doing iOS development for a while should know the MediaPlayer framework. If you want to play a video with it, a few lines of code will do: it provides MPMoviePlayerViewController, which could hardly be simpler to use.
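As a quick illustration of just how few lines it takes (the URL here is hypothetical; the API itself is deprecated, as discussed next):

```objc
#import <MediaPlayer/MediaPlayer.h>

// Hypothetical URL for illustration only
NSURL *url = [NSURL URLWithString:@"https://example.com/movie.mp4"];

// Create the player view controller and present it modally;
// presentMoviePlayerViewControllerAnimated: is a UIViewController
// category method provided by the MediaPlayer framework.
MPMoviePlayerViewController *playerVC =
    [[MPMoviePlayerViewController alloc] initWithContentURL:url];
[self presentMoviePlayerViewControllerAnimated:playerVC];
```

Playback starts automatically and the controller dismisses itself when the movie finishes.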

Unfortunately, Apple deprecated it in iOS 9.0 and recommends a different approach for projects targeting 9.0 and later (more on that below). If you still maintain a project that supports pre-9.0 systems, though, it is worth understanding, so let's walk through its basic use and the playback logic built on top of it.

Engineers at 36Kr once wrote a third-party player based on it: KRVideoPlayer.

This player is built on the MediaPlayer framework and consists of just two files; the code is quite simple. Download the source and treat it as a demo for understanding MediaPlayer. (The project's repository page shows an animated GIF demo.)

Looking at the files in the source, there are only two, KRVideoPlayerControlView and KRVideoPlayerController. A quick analysis of each:

1. KRVideoPlayerControlView, which inherits from UIView

Frankly, this file is the player's UI: the play button, the progress bar, the full-screen toggle, and so on.

2. KRVideoPlayerController, which inherits from MPMoviePlayerController

By inheriting, it uses MPMoviePlayerController directly to play the video; at initialization it adds the custom KRVideoPlayerControlView UI onto self.view, as you can see in the following code:

// Initialize KRVideoPlayerController
- (instancetype)initWithFrame:(CGRect)frame {
    self = [super init];
    if (self) {
        self.view.frame = frame;
        self.view.backgroundColor = [UIColor blackColor];
        self.controlStyle = MPMovieControlStyleNone;
        [self.view addSubview:self.videoControl];
        self.videoControl.frame = self.view.bounds;
        [self configObserver];
        [self configControlAction];
    }
    return self;
}

// Lazily load KRVideoPlayerControlView
- (KRVideoPlayerControlView *)videoControl {
    if (!_videoControl) {
        _videoControl = [[KRVideoPlayerControlView alloc] init];
    }
    return _videoControl;
}

A few things about MediaPlayer worth noting:

1. The play and pause methods are declared in the MPMediaPlayback protocol.

2. MPMoviePlayerController conforms to the MPMediaPlayback protocol mentioned above; see the MPMoviePlayerController header.

3. The categories declared on MPMoviePlayerController (MPMovieProperties, MPMoviePlayerThumbnailGeneration, and MPMoviePlayerTimedMetadataAdditions) contain almost all of the player's functionality. These methods live in MPMoviePlayerController.h; if you are interested or need them, command-click in and take a look.

4. The third-party project introduced above offers more than just code to copy. I hope it conveys a way of thinking about how a custom player should be structured; I will come back to this later.

That is all on MediaPlayer for now; questions and discussion are welcome.

It's time to upgrade.

Well, it's time to upgrade. I mentioned earlier the player recommended for iOS 9.0 and later; before getting to it, a quick digression on deployment targets. I remember when the minimum version for the apps we developed was 7.0, and a couple of years ago it moved to 8.0. My own view is that once iOS 11 ships, new apps and maintained projects should gradually drop 7.0 and 8.0 and set the minimum to 9.0. The share of users still on 7.0 or 8.0 is genuinely tiny, some newer features are simply not supported on those versions, and the maintenance cost keeps growing. This is not unfounded: you can look up the usage share of pre-8.0 systems and the overall cost of maintaining support for 7.0 and 8.0, and peers on the forums I have visited recently raise the same point. Okay, back on topic!

So, to the point: for iOS 9.0 and later, Apple recommends the AVKit framework. AVKit has actually been available since iOS 8.0, and it is built on top of the familiar AVFoundation framework.

When we use AVKit for video playback, the pieces we need fall roughly into these classes and protocols:

1. AVPlayerItem (the item, i.e. the video, to play)

2. AVPlayerLayer (the layer that displays the video)

3. AVPlayer (plays the audio and video)

4. AVPlayerViewController (the controller)

5. AVPlayerViewControllerDelegate (the delegate protocol)

Thoroughly understanding AVFoundation is not easy; the framework really is large. There is a book, Learning AV Foundation, that interested readers can pick up. I am still studying it myself, and the follow-up articles will all be organized in this series.

This article opens the series; the road to learning this framework will be long, but I hope to stick with it to the end. Now let's go through each of the classes listed above:

1. AVPlayerItem

When we play a video with AVPlayer, the video's information lives in an AVPlayerItem; one AVPlayerItem corresponds to one resource at the URL you provide, so you can think of it as a model object. After initializing an AVPlayerItem you cannot use it immediately, because loading from a network URL takes time; only once the AVPlayerItem has finished loading can you use it. So how do we handle that step?

1>: The answer is to use KVO to observe the status property and wait for it to become AVPlayerStatusReadyToPlay. Look at the property's definition:

@property (nonatomic, readonly) AVPlayerStatus status; It is a read-only property, which is worth noting; it also explains why KVO is used here.

2>: Incidentally, if you want to display the current buffering progress, you need to observe its loadedTimeRanges property.

2. AVPlayerLayer

It is mainly responsible for displaying the video and inherits from CALayer; in effect, you can think of it as our view. Our custom controls, the visible play button, stop button, progress bar, and so on, are added on top of it.

3. AVPlayer

It is mainly responsible for managing playback: playing, pausing, and so on, like a video manager; by analogy it plays the role of a view controller (though of course it is not a real UIViewController). These three classes are enough for basic video playback. Based on them, the basic flow of playing a video is:

    • First, get the URL of the video.
    • Create an AVPlayerItem from the URL.
    • Hand the AVPlayerItem to an AVPlayer.
    • Use an AVPlayerLayer to display the video.
    • Use the AVPlayer to control the video: play, pause, seek, and so on.
    • During playback, get the buffering progress and the playback progress.
    • Decide what happens when the video finishes: pause, loop, or grab the last frame.

4. AVPlayerViewController

It is Apple's packaged video playback controller. It has a @property (nonatomic, strong, nullable) AVPlayer *player; property, and the AVPlayer described above is what you assign to it. It also has a few other properties we need to understand, listed here:

    • player: sets the player
    • showsPlaybackControls: whether the built-in playback controls are shown; default YES
    • videoGravity: sets the video scaling mode
    • allowsPictureInPicturePlayback: whether Picture in Picture playback is allowed; default YES
    • delegate: sets the delegate
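Putting those properties together, a minimal sketch of presenting an AVPlayerViewController might look like this (the URL is hypothetical, and the presenting view controller is assumed to conform to AVPlayerViewControllerDelegate):

```objc
#import <AVKit/AVKit.h>
#import <AVFoundation/AVFoundation.h>

// Hypothetical URL for illustration only
NSURL *url = [NSURL URLWithString:@"https://example.com/movie.mp4"];

AVPlayerViewController *playerVC = [[AVPlayerViewController alloc] init];
playerVC.player = [AVPlayer playerWithURL:url];          // the AVPlayer to drive playback
playerVC.showsPlaybackControls = YES;                    // built-in controls (default YES)
playerVC.videoGravity = AVLayerVideoGravityResizeAspect; // scaling mode
playerVC.delegate = self;                                // receives the PiP callbacks

[self presentViewController:playerVC animated:YES completion:^{
    [playerVC.player play];
}];
```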

5. AVPlayerViewControllerDelegate

This is the delegate protocol of the AVPlayerViewController just discussed, and it mainly concerns Picture in Picture. In the AVPlayerViewController section above you already saw the property that controls whether Picture in Picture is allowed. I trust everyone knows what Picture in Picture is; anyone who has watched live streams has probably seen this technique in action. Let's see what the protocol's main methods are:

// 1. Picture in Picture will start
- (void)playerViewControllerWillStartPictureInPicture:(AVPlayerViewController *)playerViewController;
// 2. Picture in Picture did start
- (void)playerViewControllerDidStartPictureInPicture:(AVPlayerViewController *)playerViewController;
// 3. Picture in Picture failed to start
- (void)playerViewController:(AVPlayerViewController *)playerViewController failedToStartPictureInPictureWithError:(NSError *)error;
// 4. Picture in Picture will stop
- (void)playerViewControllerWillStopPictureInPicture:(AVPlayerViewController *)playerViewController;
// 5. Picture in Picture did stop
- (void)playerViewControllerDidStopPictureInPicture:(AVPlayerViewController *)playerViewController;

Let's look at a simple demo

We won't touch the complex parts of AVFoundation, since I am still learning the framework myself. Let's first look at a very simple demo that uses AVFoundation to play a video:

Here is the code, which simply applies the points of knowledge covered above:

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view.
    self.view.backgroundColor = [UIColor whiteColor];
    self.avPlayerItem = [AVPlayerItem playerItemWithURL:[NSURL URLWithString:movieUrl]];
    self.avPlayer = [[AVPlayer alloc] initWithPlayerItem:self.avPlayerItem];
    self.avPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:self.avPlayer];
    self.avPlayerLayer.frame = CGRectMake(10, 100, 355, 200);
    [self.view.layer addSublayer:self.avPlayerLayer];
    // Add observers
    [self addObserverWithAVPlayerItem];
}

#pragma mark - KVO

- (void)addObserverWithAVPlayerItem {
    // Observe the status property
    [self.avPlayerItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nil];
    // Observe the buffering progress
    [self.avPlayerItem addObserver:self forKeyPath:@"loadedTimeRanges" options:NSKeyValueObservingOptionNew context:nil];
}

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary<NSKeyValueChangeKey,id> *)change context:(void *)context {
    AVPlayerItem *avPlayerItem = (AVPlayerItem *)object;
    if ([keyPath isEqualToString:@"status"]) {
        AVPlayerStatus status = [[change objectForKey:@"new"] intValue];
        if (status == AVPlayerStatusReadyToPlay) {
            NSLog(@"Ready to play");
            CMTime duration = avPlayerItem.duration;
            NSLog(@"Total video duration: %.2f", CMTimeGetSeconds(duration));
            // Start playback
            [self.avPlayer play];
        } else if (status == AVPlayerStatusFailed) {
            NSLog(@"Failed to prepare the video");
        } else {
            NSLog(@"Unknown status error");
        }
    } else if ([keyPath isEqualToString:@"loadedTimeRanges"]) {
        // Drive a custom buffering progress bar from here
        NSTimeInterval timeInterval = [self alreadyCacheVideoProgress];
        NSLog(@"Buffered duration: %.2f", timeInterval);
    }
}

#pragma mark - alreadyCacheVideoProgress

- (NSTimeInterval)alreadyCacheVideoProgress {
    // First get the buffered time ranges
    NSArray *cacheVideoTime = [self.avPlayerItem loadedTimeRanges];
    // CMTimeRange's start and duration fields give the starting position and length of a buffered range
    CMTimeRange timeRange = [cacheVideoTime.firstObject CMTimeRangeValue];
    float startSeconds = CMTimeGetSeconds(timeRange.start);
    float durationSeconds = CMTimeGetSeconds(timeRange.duration);
    // Total buffered time = start + duration
    NSTimeInterval result = startSeconds + durationSeconds;
    return result;
}

Here are the points we need to pay attention to:

1. CMTime, a struct designed to represent media time

/*!
 @typedef CMTime
 @abstract Rational time value represented as int64/int32.
 */
typedef struct {
    CMTimeValue value;     // the value of the CMTime; value/timescale = seconds (number of frames)
    CMTimeScale timescale; // the timescale of the CMTime; value/timescale = seconds
                           // (the frame rate: how many frames per second)
    CMTimeFlags flags;
    CMTimeEpoch epoch;     // differentiates between equal timestamps that are actually different
                           // because of looping, multi-item sequencing, etc.; used during
                           // comparison (greater epochs happen after lesser ones); addition and
                           // subtraction are only possible within a single epoch, since epoch
                           // length may be unknown or variable
} CMTime;

In the preceding code we saw how to get the total length of the video:

CMTime duration = avPlayerItem.duration;
NSLog(@"Total video duration: %.2f", CMTimeGetSeconds(duration));

You can see that the CMTimeGetSeconds function converts a CMTime value into seconds (a float). Conversely, if a movie runs at 60 frames per second and you want to jump to frame 120, that is, the two-second mark, you can create a CMTime value for that position. It is usually created with one of the following two functions.

1>: CMTimeMake(int64_t value, int32_t timescale), e.g. CMTime time1 = CMTimeMake(120, 60);

2>: CMTimeMakeWithSeconds(Float64 seconds, int32_t preferredTimescale), e.g. CMTime time2 = CMTimeMakeWithSeconds(120, 60);

The difference between CMTimeMakeWithSeconds and CMTimeMake is the meaning of the first parameter: for CMTimeMakeWithSeconds it is a number of seconds (a Float64), while for CMTimeMake it is a value counted in timescale units. So CMTimeMakeWithSeconds(120, 60) means 120 seconds, while CMTimeMake(120, 60) means 120/60 = 2 seconds.
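As a quick sketch of putting this to use (assuming the self.avPlayer property from the demo above), jumping to the two-second mark could look like:

```objc
// Both of these values represent the 2-second mark
CMTime byFrames  = CMTimeMake(120, 60);            // 120 units at a timescale of 60 -> 2 s
CMTime bySeconds = CMTimeMakeWithSeconds(2.0, 60); // 2 seconds at a preferred timescale of 60

// Seek the player there
[self.avPlayer seekToTime:byFrames];
```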

- (id)addPeriodicTimeObserverForInterval:(CMTime)interval queue:(nullable dispatch_queue_t)queue usingBlock:(void (^)(CMTime time))block;

For example, if we set the interval to 1/10 of a second and update the UI inside the block, the UI is updated ten times per second. Let's verify:

[self.avPlayer addPeriodicTimeObserverForInterval:CMTimeMake(1, 10) queue:dispatch_get_main_queue() usingBlock:^(CMTime time) {
    // Recall the CMTime definition: value/timescale = seconds
    float currentPlayTime = (double)self.avPlayerItem.currentTime.value / self.avPlayerItem.currentTime.timescale;
    NSLog(@"Current playback progress: %f", currentPlayTime);
}];

A randomly captured slice of the print log (shown in the original post) verifies this.

2. The AVPlayerItem playback notifications

/* Note that NSNotifications posted by AVPlayerItem are posted on a different thread
   from the one on which the observer was registered. */

// Notifications                                                                                Description
AVF_EXPORT NSString *const AVPlayerItemTimeJumpedNotification            NS_AVAILABLE(10_7, 5_0); // the item's current time has changed discontinuously
AVF_EXPORT NSString *const AVPlayerItemDidPlayToEndTimeNotification      NS_AVAILABLE(10_7, 4_0); // item has played to its end time
AVF_EXPORT NSString *const AVPlayerItemFailedToPlayToEndTimeNotification NS_AVAILABLE(10_7, 4_3); // item has failed to play to its end time
AVF_EXPORT NSString *const AVPlayerItemPlaybackStalledNotification       NS_AVAILABLE(10_9, 6_0); // media did not arrive in time to continue playback
AVF_EXPORT NSString *const AVPlayerItemNewAccessLogEntryNotification     NS_AVAILABLE(10_9, 6_0); // a new access log entry has been added
AVF_EXPORT NSString *const AVPlayerItemNewErrorLogEntryNotification      NS_AVAILABLE(10_9, 6_0); // a new error log entry has been added

// notification userInfo key                                                                    type
AVF_EXPORT NSString *const AVPlayerItemFailedToPlayToEndTimeErrorKey     NS_AVAILABLE(10_7, 4_3); // NSError
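For instance, to implement the loop-or-pause decision from the playback flow above, you could register for AVPlayerItemDidPlayToEndTimeNotification. A sketch assuming the self.avPlayer and self.avPlayerItem properties from the demo:

```objc
// Register once, e.g. in viewDidLoad
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(playerItemDidReachEnd:)
                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                           object:self.avPlayerItem];

// Loop playback: seek back to the start and play again.
// Note the header comment above: this may arrive on a background thread,
// so dispatch to the main queue before touching the UI.
- (void)playerItemDidReachEnd:(NSNotification *)notification {
    [self.avPlayer seekToTime:kCMTimeZero];
    [self.avPlayer play];
}
```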

3. Some third-party frameworks

(1): VKVideoPlayer

(2): ALMoviePlayerController

(3): PBJVideoPlayer

(4): And a more powerful one, MobileVLCKit.

Links to each of these third-party projects are given above (the last link is an article that helps with integration). All of them will be covered bit by bit in later articles in this series; for now I only mention that these frameworks exist, so interested readers can look into them before my summaries.
