Reversing Video on iOS


Reversing a video means playing it from the end back to the beginning. This only makes sense for the picture: reversed audio is just noise, so the audio track is dropped during reversal.

Implementing Reverse Playback

H.264 video is normally decoded from front to back, because the stream contains I-, P-, and B-frames: a P-frame depends on the nearest preceding I-frame or on the previous P-frame, and a B-frame depends on both earlier buffered data and data that comes after it. The decoder therefore cannot simply run in reverse; the only option is to split the video into segments small enough to process one at a time. The approach is as follows: seek to the first frame (an I-frame) of the n-th GOP from the end, decode everything from that point to the end of the video, and store the decoded images in an array. How much is decoded at once is dictated by the size of the decoded data: if too much is decoded in one go, memory usage balloons and the app gets killed. Here the video is split into one-second segments; the segments are processed from last to first, and within each segment the decoded images are re-encoded in reverse order. This is straightforward to implement with AVFoundation; the full source (also published on GitHub) is listed below.
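Before the full listing, the segment walk in isolation looks roughly like this (a minimal sketch assuming a hypothetical 3 s clip and the same 1 s segment length the class uses; decoding and writing are elided):

#import <Foundation/Foundation.h>
#import <CoreMedia/CoreMedia.h>

// Walk 1 s segments from the tail of the clip to its head,
// mirroring the loop in -startProcessing below.
static void WalkSegments(void)
{
    CMTime duration = CMTimeMake(3, 1);    // hypothetical 3 s clip
    CMTime segDuration = CMTimeMake(1, 1); // 1 s per segment
    int n = (int)(CMTimeGetSeconds(duration) / CMTimeGetSeconds(segDuration)) + 1;
    for (int i = 1; i < n; i++) {
        CMTime offset = CMTimeMultiply(segDuration, i);
        if (CMTimeCompare(offset, duration) > 0) break;
        CMTime start = CMTimeSubtract(duration, offset);
        // Decode [start, start + segDuration), then re-encode those frames in reverse.
        NSLog(@"segment %d starts at %.1f s", i, CMTimeGetSeconds(start));
    }
    // Prints segments starting at 2.0 s, 1.0 s and 0.0 s: tail first.
}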

//
//  SJReverseUtility.h
//  playback
//
//  Created by Lightning on 2018/7/12.
//

#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>

typedef void(^ReverseCallBack)(AVAssetWriterStatus status, float progress, NSError *error);

@interface SJReverseUtility : NSObject

- (instancetype)initWithAsset:(AVAsset *)asset outputPath:(NSString *)path;

- (void)startProcessing;

- (void)cancelProcessing;

@property (nonatomic, copy) ReverseCallBack callBack;

@property (nonatomic, assign) CMTimeRange timeRange;

@end
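Before the implementation, a minimal usage sketch (the file paths are hypothetical placeholders; since -startProcessing runs its whole loop synchronously, it is dispatched off the main thread here):

#import <AVFoundation/AVFoundation.h>
#import "SJReverseUtility.h"

static void ReverseClipExample(void)
{
    // Hypothetical input: any local H.264 clip readable by AVAsset.
    AVAsset *asset = [AVAsset assetWithURL:[NSURL fileURLWithPath:@"/path/to/input.mp4"]];
    NSString *outputPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"reversed.mp4"];

    SJReverseUtility *utility = [[SJReverseUtility alloc] initWithAsset:asset outputPath:outputPath];
    utility.callBack = ^(AVAssetWriterStatus status, float progress, NSError *error) {
        // progress runs from 0 to 1; the class reports -1 when cancelled
        NSLog(@"status = %ld, progress = %.2f, error = %@", (long)status, progress, error);
    };
    // The loop in -startProcessing blocks until writing finishes.
    dispatch_async(dispatch_get_global_queue(QOS_CLASS_USER_INITIATED, 0), ^{
        [utility startProcessing];
    });
}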
//
//  SJReverseUtility.m
//  playback
//
//  Created by Lightning on 2018/7/12.
//

#import "SJReverseUtility.h"

@interface SJReverseUtility ()

@property (nonatomic, strong) NSMutableArray *samples;
@property (nonatomic, strong) AVAsset *asset;
@property (nonatomic, strong) NSMutableArray *tracks;
@property (nonatomic, strong) AVMutableComposition *composition;
@property (nonatomic, strong) AVAssetWriter *writer;
@property (nonatomic, strong) AVAssetWriterInput *writerInput;
@property (nonatomic, strong) AVAssetWriterInputPixelBufferAdaptor *writerAdaptor;
@property (nonatomic, assign) uint frame_count;
@property (nonatomic, strong) AVMutableCompositionTrack *compositionTrack;
@property (nonatomic, assign) CMTime offsetTime;
@property (nonatomic, assign) CMTime intervalTime;
@property (nonatomic, assign) CMTime segDuration;
@property (nonatomic, assign) BOOL shouldStop;
@property (nonatomic, copy) NSString *path;

@end

@implementation SJReverseUtility

- (instancetype)initWithAsset:(AVAsset *)asset outputPath:(NSString *)path
{
    self = [super init];
    if (self) {
        _asset = asset;
        // Keep the output path so a cancelled run can clean up its partial file.
        _path = [path copy];
        _composition = [AVMutableComposition composition];
        AVMutableCompositionTrack *ctrack = [_composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        _compositionTrack = ctrack;
        _timeRange = kCMTimeRangeInvalid;
        _frame_count = 0;
        _offsetTime = kCMTimeZero;
        _intervalTime = kCMTimeZero;
        [self setupWriterWithPath:path];
    }
    return self;
}

- (void)cancelProcessing
{
    self.shouldStop = YES;
}

- (void)startProcessing
{
    if (CMTIMERANGE_IS_INVALID(_timeRange)) {
        _timeRange = CMTimeRangeMake(kCMTimeZero, _asset.duration);
    }
    CMTime duration = _asset.duration;
    CMTime segDuration = CMTimeMake(1, 1); // process the video one second at a time
    self.segDuration = segDuration;
    NSArray *videoTracks = [_asset tracksWithMediaType:AVMediaTypeVideo];
    AVAssetTrack *track = videoTracks[0];
    // Must be set before -startWriting; fixes the video orientation.
    self.writerInput.transform = track.preferredTransform;

    [self.writer startWriting];
    [self.writer startSessionAtSourceTime:kCMTimeZero]; // start processing

    // Divide the video into n segments.
    int n = (int)(CMTimeGetSeconds(duration) / CMTimeGetSeconds(segDuration)) + 1;
    if (CMTIMERANGE_IS_VALID(_timeRange)) {
        n = (int)(CMTimeGetSeconds(_timeRange.duration) / CMTimeGetSeconds(segDuration)) + 1;
        duration = CMTimeAdd(_timeRange.start, _timeRange.duration);
    }

    __weak typeof(self) weakSelf = self;
    // Walk the segments from the tail of the clip to its head.
    for (int i = 1; i < n; i++) {
        CMTime offset = CMTimeMultiply(segDuration, i);
        if (CMTimeCompare(offset, duration) > 0) {
            break;
        }
        CMTime start = CMTimeSubtract(duration, offset);
        if (CMTimeCompare(start, _timeRange.start) < 0) {
            // Clamp the final (head) segment to the start of the range.
            start = kCMTimeZero;
            segDuration = CMTimeSubtract(duration, CMTimeMultiply(segDuration, i - 1));
        }
        self.compositionTrack = [_composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        [self.compositionTrack insertTimeRange:CMTimeRangeMake(start, segDuration) ofTrack:track atTime:kCMTimeZero error:nil];

        // Decode every frame of this segment, then append them in reverse.
        [self generateSamplesWithTrack:_composition];
        [self encodeSampleBuffer];

        if (self.shouldStop) {
            [self.writer cancelWriting];
            if ([[NSFileManager defaultManager] fileExistsAtPath:_path]) {
                [[NSFileManager defaultManager] removeItemAtPath:_path error:nil];
            }
            if (weakSelf.callBack) {
                weakSelf.callBack(weakSelf.writer.status, -1, weakSelf.writer.error);
            }
            return;
        }

        [self.compositionTrack removeTimeRange:CMTimeRangeMake(start, segDuration)];
        if (weakSelf.callBack) {
            weakSelf.callBack(weakSelf.writer.status, (float)i / n, weakSelf.writer.error);
        }
    }
    [self.writer finishWritingWithCompletionHandler:^{
        if (weakSelf.callBack) {
            weakSelf.callBack(weakSelf.writer.status, 1.0f, weakSelf.writer.error);
        }
    }];
}

- (void)setupWriterWithPath:(NSString *)path
{
    NSURL *outputURL = [NSURL fileURLWithPath:path];
    AVAssetTrack *videoTrack = [[_asset tracksWithMediaType:AVMediaTypeVideo] lastObject];

    // Initialize the writer
    self.writer = [[AVAssetWriter alloc] initWithURL:outputURL
                                            fileType:AVFileTypeMPEG4
                                               error:nil];
    NSDictionary *videoCompressionProps = [NSDictionary dictionaryWithObjectsAndKeys:
                                           @(videoTrack.estimatedDataRate), AVVideoAverageBitRateKey,
                                           nil];
    int width = videoTrack.naturalSize.width;
    int height = videoTrack.naturalSize.height;
    NSDictionary *writerOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                          AVVideoCodecH264, AVVideoCodecKey,
                                          [NSNumber numberWithInt:width], AVVideoWidthKey,
                                          [NSNumber numberWithInt:height], AVVideoHeightKey,
                                          videoCompressionProps, AVVideoCompressionPropertiesKey,
                                          nil];
    AVAssetWriterInput *writerInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo
                                                                     outputSettings:writerOutputSettings
                                                                   sourceFormatHint:(__bridge CMFormatDescriptionRef)[videoTrack.formatDescriptions lastObject]];
    [writerInput setExpectsMediaDataInRealTime:NO];
    self.writerInput = writerInput;
    self.writerAdaptor = [[AVAssetWriterInputPixelBufferAdaptor alloc] initWithAssetWriterInput:writerInput sourcePixelBufferAttributes:nil];
    [self.writer addInput:self.writerInput];
}

- (void)generateSamplesWithTrack:(AVAsset *)asset
{
    // Initialize the reader
    AVAssetReader *reader = [[AVAssetReader alloc] initWithAsset:asset error:nil];
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] lastObject];

    NSDictionary *readerOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                          [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarFullRange],
                                          kCVPixelBufferPixelFormatTypeKey, nil];
    AVAssetReaderTrackOutput *readerOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack
                                                                                        outputSettings:readerOutputSettings];
    [reader addOutput:readerOutput];
    [reader startReading];

    // Read in the samples
    _samples = [[NSMutableArray alloc] init];

    CMSampleBufferRef sample;
    while ((sample = [readerOutput copyNextSampleBuffer])) {
        [_samples addObject:(__bridge id)sample];
        NSLog(@"count = %lu", (unsigned long)_samples.count);
        CFRelease(sample);
    }
    if (_samples.count > 0) {
        // Space the output frames evenly across the segment.
        self.intervalTime = CMTimeMakeWithSeconds(CMTimeGetSeconds(self.segDuration) / (float)_samples.count, _asset.duration.timescale);
    }
}

- (void)encodeSampleBuffer
{
    for (NSInteger i = 0; i < _samples.count; i++) {
        // Get the presentation time for the frame
        CMTime presentationTime = CMSampleBufferGetPresentationTimeStamp((__bridge CMSampleBufferRef)_samples[i]);
        // The source timestamp is discarded: each output frame is stamped
        // one intervalTime after the previous one.
        presentationTime = CMTimeAdd(_offsetTime, self.intervalTime);

        size_t index = _samples.count - i - 1;

        if (0 == _frame_count) {
            presentationTime = kCMTimeZero;
            index = _samples.count - i - 2; // the first frame after reversal is black, so drop it
        }
        CMTimeShow(presentationTime);

        CVPixelBufferRef imageBufferRef = CMSampleBufferGetImageBuffer((__bridge CMSampleBufferRef)_samples[index]);

        // Back-pressure: wait until the writer input can accept more data.
        while (!_writerInput.readyForMoreMediaData) {
            [NSThread sleepForTimeInterval:0.1];
        }
        _offsetTime = presentationTime;

        BOOL success = [self.writerAdaptor appendPixelBuffer:imageBufferRef withPresentationTime:presentationTime];
        _frame_count++;
        if (!success) {
            NSLog(@"status = %ld", (long)self.writer.status);
            NSLog(@"error = %@", self.writer.error);
        }
    }
}

@end

On iOS, this code can reverse a video of any length. The per-frame timestamps, however, still leave room for improvement.
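One way the timestamps could be refined, sketched here only as an idea: the class spaces output frames uniformly at segDuration / sampleCount, which drifts for variable-frame-rate sources. If the original presentation timestamps of each segment's frames were kept while reading, the reversed frames could mirror the original inter-frame gaps instead. The helper below is hypothetical and not part of the source above:

#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>

// Hypothetical helper: given one segment's original presentation
// timestamps in ascending order, compute the timestamp for the j-th
// frame written to the output (frames are written back to front,
// but output timestamps must still ascend).
static CMTime ReversedPTS(NSArray<NSValue *> *ascendingPTS,
                          NSUInteger j,
                          CMTime outputSegmentStart)
{
    CMTime last = [ascendingPTS.lastObject CMTimeValue];
    CMTime orig = [ascendingPTS[ascendingPTS.count - 1 - j] CMTimeValue];
    // The j-th written frame sits as far after the segment's output start
    // as its source frame sat before the segment's end, so the original
    // gaps between frames survive the reversal.
    return CMTimeAdd(outputSegmentStart, CMTimeSubtract(last, orig));
}

Stamping the frame at index count - 1 - j with ReversedPTS(ascendingPTS, j, segmentStart) would then replace the uniform _offsetTime + intervalTime stepping in -encodeSampleBuffer.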
