iOS Video Rewind

Source: Internet
Author: User
Tags: rewind


Video rewind plays a video from its end back to its beginning. This only makes sense for the picture: audio played in reverse is just meaningless noise, so the audio track is dropped when the video is reversed.

Implementing reverse playback

H.264-encoded video is normally decoded from beginning to end, because the stream contains I-frames, P-frames, and B-frames. Decoding a P-frame depends on the most recent I-frame or the preceding P-frame, and decoding a B-frame depends on data both before and after it, so the decoder cannot simply run backwards. Instead, the video has to be split into fragments small enough to handle individually. The idea is: seek to the first frame (an I-frame) of the n-th GOP from the end, decode from that point up to the last not-yet-processed image, and store the decoded images in an array. The fragment size is chosen according to the amount of decoded data it produces: if a fragment decodes into too much data, memory pressure will get the app killed. So the video is cut into small fragments, each fragment is decoded forward, and its images are then written out in reverse order. This is straightforward to implement with AVFoundation; the full source is available on GitHub:

//
//  SJReverseUtility.h
//  playBack
//
//  Created by Lightning on 2018/7/12.
//

#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>

typedef void (^ReverseCallBack)(AVAssetWriterStatus status, float progress, NSError *error);

@interface SJReverseUtility : NSObject

- (instancetype)initWithAsset:(AVAsset *)asset outputPath:(NSString *)path;

- (void)startProcessing;
- (void)cancelProcessing;

@property (nonatomic, copy) ReverseCallBack callBack;
@property (nonatomic, assign) CMTimeRange timeRange;

@end
//
//  SJReverseUtility.m
//  playBack
//
//  Created by Lightning on 2018/7/12.
//

#import "SJReverseUtility.h"

@interface SJReverseUtility ()

@property (nonatomic, strong) NSMutableArray *samples;
@property (nonatomic, strong) AVAsset *asset;
@property (nonatomic, strong) NSMutableArray *tracks;
@property (nonatomic, strong) AVMutableComposition *composition;
@property (nonatomic, strong) AVAssetWriter *writer;
@property (nonatomic, strong) AVAssetWriterInput *writerInput;
@property (nonatomic, strong) AVAssetWriterInputPixelBufferAdaptor *writerAdaptor;
@property (nonatomic, assign) NSUInteger frame_count;
@property (nonatomic, strong) AVMutableCompositionTrack *compositionTrack;
@property (nonatomic, assign) CMTime offsetTime;
@property (nonatomic, assign) CMTime intervalTime;
@property (nonatomic, assign) CMTime segDuration;
@property (nonatomic, assign) BOOL shouldStop;
@property (nonatomic, copy) NSString *path;

@end

@implementation SJReverseUtility

- (instancetype)initWithAsset:(AVAsset *)asset outputPath:(NSString *)path
{
    self = [super init];
    if (self) {
        _asset = asset;
        _path = [path copy]; // keep the output path so cancelProcessing can clean up
        _composition = [AVMutableComposition composition];
        AVMutableCompositionTrack *ctrack = [_composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                      preferredTrackID:kCMPersistentTrackID_Invalid];
        _compositionTrack = ctrack;
        _timeRange = kCMTimeRangeInvalid;
        _frame_count = 0;
        _offsetTime = kCMTimeZero;
        _intervalTime = kCMTimeZero;
        [self setupWriterWithPath:path];
    }
    return self;
}

- (void)cancelProcessing
{
    self.shouldStop = YES;
}

- (void)startProcessing
{
    if (CMTIMERANGE_IS_INVALID(_timeRange)) {
        _timeRange = CMTimeRangeMake(kCMTimeZero, _asset.duration);
    }
    CMTime duration = _asset.duration;
    CMTime segDuration = CMTimeMake(1, 1);
    self.segDuration = segDuration;
    NSArray *videoTracks = [_asset tracksWithMediaType:AVMediaTypeVideo];
    AVAssetTrack *track = videoTracks[0];

    // Must be set before starting the writer
    self.writerInput.transform = track.preferredTransform; // fix video orientation
    [self.writer startWriting];
    [self.writer startSessionAtSourceTime:kCMTimeZero]; // start processing

    // Divide the video into n segments
    int n = (int)(CMTimeGetSeconds(duration) / CMTimeGetSeconds(segDuration)) + 1;
    if (CMTIMERANGE_IS_VALID(_timeRange)) {
        n = (int)(CMTimeGetSeconds(_timeRange.duration) / CMTimeGetSeconds(segDuration)) + 1;
        duration = CMTimeAdd(_timeRange.start, _timeRange.duration);
    }
    __weak typeof(self) weakSelf = self;
    for (int i = 1; i < n; i++) {
        CMTime offset = CMTimeMultiply(segDuration, i);
        if (CMTimeCompare(offset, duration) > 0) {
            break;
        }
        CMTime start = CMTimeSubtract(duration, offset);
        if (CMTimeCompare(start, _timeRange.start) < 0) {
            start = kCMTimeZero;
            segDuration = CMTimeSubtract(duration, CMTimeMultiply(segDuration, i - 1));
        }
        self.compositionTrack = [_composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                          preferredTrackID:kCMPersistentTrackID_Invalid];
        [self.compositionTrack insertTimeRange:CMTimeRangeMake(start, segDuration)
                                       ofTrack:track
                                        atTime:kCMTimeZero
                                         error:nil];

        [self generateSamplesWithTrack:_composition];

        [self encodeSampleBuffer];

        if (self.shouldStop) {
            [self.writer cancelWriting];
            if ([[NSFileManager defaultManager] fileExistsAtPath:_path]) {
                [[NSFileManager defaultManager] removeItemAtPath:_path error:nil];
            }
            !weakSelf.callBack ? : weakSelf.callBack(weakSelf.writer.status, -1, weakSelf.writer.error);
            return;
        }

        [self.compositionTrack removeTimeRange:CMTimeRangeMake(start, segDuration)];

        !weakSelf.callBack ? : weakSelf.callBack(weakSelf.writer.status, (float)i / n, weakSelf.writer.error);
    }
    [self.writer finishWritingWithCompletionHandler:^{
        !weakSelf.callBack ? : weakSelf.callBack(weakSelf.writer.status, 1.0f, weakSelf.writer.error);
    }];
}

- (void)setupWriterWithPath:(NSString *)path
{
    NSURL *outputURL = [NSURL fileURLWithPath:path];
    AVAssetTrack *videoTrack = [[_asset tracksWithMediaType:AVMediaTypeVideo] lastObject];

    // Initialize the writer
    self.writer = [[AVAssetWriter alloc] initWithURL:outputURL
                                            fileType:AVFileTypeMPEG4
                                               error:nil];
    NSDictionary *videoCompressionProps = [NSDictionary dictionaryWithObjectsAndKeys:
                                           @(videoTrack.estimatedDataRate), AVVideoAverageBitRateKey,
                                           nil];
    int width = videoTrack.naturalSize.width;
    int height = videoTrack.naturalSize.height;
    NSDictionary *writerOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                          AVVideoCodecH264, AVVideoCodecKey,
                                          [NSNumber numberWithInt:width], AVVideoWidthKey,
                                          [NSNumber numberWithInt:height], AVVideoHeightKey,
                                          videoCompressionProps, AVVideoCompressionPropertiesKey,
                                          nil];
    AVAssetWriterInput *writerInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo
                                                                     outputSettings:writerOutputSettings
                                                                   sourceFormatHint:(__bridge CMFormatDescriptionRef)[videoTrack.formatDescriptions lastObject]];
    [writerInput setExpectsMediaDataInRealTime:NO];
    self.writerInput = writerInput;
    self.writerAdaptor = [[AVAssetWriterInputPixelBufferAdaptor alloc] initWithAssetWriterInput:writerInput
                                                                    sourcePixelBufferAttributes:nil];
    [self.writer addInput:self.writerInput];
}

- (void)generateSamplesWithTrack:(AVAsset *)asset
{
    // Initialize the reader
    AVAssetReader *reader = [[AVAssetReader alloc] initWithAsset:asset error:nil];
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] lastObject];

    NSDictionary *readerOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                          [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarFullRange], kCVPixelBufferPixelFormatTypeKey,
                                          nil];
    AVAssetReaderTrackOutput *readerOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack
                                                                                        outputSettings:readerOutputSettings];
    [reader addOutput:readerOutput];
    [reader startReading];

    // Read in the samples
    _samples = [[NSMutableArray alloc] init];
    CMSampleBufferRef sample;
    while ((sample = [readerOutput copyNextSampleBuffer])) {
        [_samples addObject:(__bridge id)sample];
        NSLog(@"count = %lu", (unsigned long)_samples.count);
        CFRelease(sample);
    }
    if (_samples.count > 0) {
        self.intervalTime = CMTimeMakeWithSeconds(CMTimeGetSeconds(self.segDuration) / (float)_samples.count,
                                                  _asset.duration.timescale);
    }
}

- (void)encodeSampleBuffer
{
    for (NSInteger i = 0; i < _samples.count; i++) {
        // Get the presentation time for the frame
        CMTime presentationTime = CMSampleBufferGetPresentationTimeStamp((__bridge CMSampleBufferRef)_samples[i]);
        presentationTime = CMTimeAdd(_offsetTime, self.intervalTime);
        size_t index = _samples.count - i - 1;
        if (0 == _frame_count) {
            presentationTime = kCMTimeZero;
            index = _samples.count - i - 2; // the first decoded frame is black; discard it
        }
        CMTimeShow(presentationTime);
        CVPixelBufferRef imageBufferRef = CMSampleBufferGetImageBuffer((__bridge CMSampleBufferRef)_samples[index]);
        while (!_writerInput.readyForMoreMediaData) {
            [NSThread sleepForTimeInterval:0.1];
        }
        _offsetTime = presentationTime;
        BOOL success = [self.writerAdaptor appendPixelBuffer:imageBufferRef
                                        withPresentationTime:presentationTime];
        _frame_count++;
        if (!success) {
            NSLog(@"status = %ld", (long)self.writer.status);
            NSLog(@"error = %@", self.writer.error);
        }
    }
}

@end

On iOS, this code can reverse a video of any length. However, the per-frame timestamps it produces still need improvement.


