I use AVAssetReader to pull frame buffers one by one from a video file, do some processing on each frame, and then save the new frames to a temporary file with AVAssetWriter. So I now have a temporary file path to which new frames are continuously being appended. Is there a way to play the video while frames are still being added to the temporary file?
Here is the code that plays the video from the temporary path (while the number of frames keeps growing):
- (void)loadAssetFromFile {
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:[(mMediaReader.mCameraRecorder) tempVideoFilePath]] options:nil];
    NSString *tracksKey = @"tracks";
    [asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:tracksKey] completionHandler:^{
        // Completion handler block.
        dispatch_async(dispatch_get_main_queue(), ^{
            NSError *error = nil;
            AVKeyValueStatus status = [asset statusOfValueForKey:tracksKey error:&error];
            if (status == AVKeyValueStatusLoaded) {
                self.mPlayerItem = [AVPlayerItem playerItemWithAsset:asset];
                [mPlayerItem addObserver:self forKeyPath:@"status"
                                 options:0 context:&ItemStatusContext];
                [[NSNotificationCenter defaultCenter] addObserver:self
                                                         selector:@selector(playerItemDidReachEnd:)
                                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                                           object:mPlayerItem];
                self.mPlayer = [AVPlayer playerWithPlayerItem:mPlayerItem];
                [mPlayerView setPlayer:mPlayer];
                [self play:nil];
            }
            else {
                // You should deal with the error appropriately.
                NSLog(@"The asset's tracks were not loaded:\n%@", [error localizedDescription]);
            }
        });
    }];
}

- (IBAction)play:(id)sender {
    [mPlayer play];
}
The code inside the completion block never runs.
You can convert each CMSampleBuffer returned by AVAssetReader into a CGImage, then a UIImage, and display it in a UIImageView — that lets you render the frames you pull from the original video file as you process them, instead of waiting on the partially written temporary file.
The AV Foundation Programming Guide contains sample code showing how to perform this conversion.
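As a rough sketch of that conversion (adapted from the pattern in Apple's guide, and assuming your AVAssetReader output is configured for 32-bit BGRA pixel buffers — a different pixel format would need a different bitmap context setup):

- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    // Get the pixel buffer backing the sample buffer and lock it for CPU access.
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Wrap the raw BGRA pixels in a bitmap context and snapshot it as a CGImage.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
                                                 colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    UIImage *image = [UIImage imageWithCGImage:quartzImage];
    CGImageRelease(quartzImage);
    return image;
}

You would then set the result on your image view from the main thread, e.g. mImageView.image = [self imageFromSampleBuffer:sampleBuffer]; (mImageView being whatever UIImageView you display frames in).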