Building a 64-bit native OSX (not iOS) application targeting 10.7+. I'm fairly new to handling video files in the Cocoa world.
I want to be able to open a video file and display its output in an OpenGL render (i.e. I want to be able to efficiently get at the video's frame buffers and turn each frame into an OpenGL texture).
Conceptually this seems simple, but I'm having a hard time wading through the various (old and deprecated) examples and options, all of which seem to have been deprecated in favor of AVFoundation. It's possible I'm missing something, but examples of using AVFoundation with OpenGL seem thin on the ground. To clarify a little further, this sample application (QTCoreVideo101 from Apple) does more or less exactly what I want, except that it's built around the deprecated QTKit, so it won't even compile for 64-bit.
I'm now reading through the AVFoundation documentation, but I'm still not sure whether it makes sense to try to get a glTexture out of AVFoundation, or whether I should be looking elsewhere.
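For reference, the kind of entry point I have in mind is simply opening the file with AVFoundation and checking for a video track; a minimal sketch only (movieURL is a placeholder, and I'm not yet committed to AVFoundation for the frame extraction itself):

    #import <AVFoundation/AVFoundation.h>

    // Sketch: load the asset and confirm there is a video track before
    // deciding how to pull frames out of it. movieURL is a placeholder.
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:movieURL options:nil];
    [asset loadValuesAsynchronouslyForKeys:@[@"tracks"] completionHandler:^{
        NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
        if ([videoTracks count] == 0) {
            NSLog(@"No video track in %@", movieURL);
            return;
        }
        // ...hand the asset off to whatever frame-extraction mechanism fits...
    }];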
This is the solution I ended up going with. "thisLayer.layerSource.videoPlayerOutput" is an AVPlayerItemVideoOutput object.
    if ([thisLayer.layerSource.videoPlayerOutput hasNewPixelBufferForItemTime:playerTime]) {
        // Pull the latest frame from the AVPlayerItemVideoOutput.
        frameBuffer = [thisLayer.layerSource.videoPlayerOutput copyPixelBufferForItemTime:playerTime itemTimeForDisplay:NULL];

        // Wrap the pixel buffer in an OpenGL texture via the texture cache.
        CVReturn result = CVOpenGLTextureCacheCreateTextureFromImage(NULL,
                                                                     textureCache,
                                                                     frameBuffer,
                                                                     NULL,
                                                                     &textureRef);
        if (result == kCVReturnSuccess) {
            // These appear to be GL_TEXTURE_RECTANGLE_ARB
            thisLayer.layerSource.vid_glTextureTarget = CVOpenGLTextureGetTarget(textureRef);
            thisLayer.layerSource.vid_glTexture = CVOpenGLTextureGetName(textureRef);
            thisLayer.layerSource.vid_glTextureSize = NSMakeSize(CVPixelBufferGetWidth(frameBuffer), CVPixelBufferGetHeight(frameBuffer));
            thisLayer.layerSource.vid_ciimage = [CIImage imageWithCVImageBuffer:frameBuffer];

            CFRelease(textureRef);
            CVOpenGLTextureCacheFlush(textureCache, 0);
        } else {
            NSLog(@"INTERNAL ERROR FAILED WITH CODE: %i", result);
        }
        CVBufferRelease(frameBuffer);
    }
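The snippet above assumes an AVPlayerItemVideoOutput already attached to the player item and an existing CVOpenGLTextureCache. A minimal sketch of that setup, with movieURL, cglContext and cglPixelFormat as placeholders for your own URL and the CGL objects of the GL context you render into:

    #import <AVFoundation/AVFoundation.h>
    #import <CoreVideo/CoreVideo.h>

    // Ask the video output for BGRA frames (any CoreVideo format you can upload works).
    NSDictionary *attributes = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    AVPlayerItemVideoOutput *videoPlayerOutput =
        [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attributes];

    AVPlayerItem *item = [AVPlayerItem playerItemWithURL:movieURL];   // movieURL is a placeholder
    [item addOutput:videoPlayerOutput];
    AVPlayer *player = [AVPlayer playerWithPlayerItem:item];

    // The texture cache is created against the CGL context / pixel format of the target GL context.
    CVOpenGLTextureCacheRef textureCache = NULL;
    CVOpenGLTextureCacheCreate(kCFAllocatorDefault,
                               NULL,
                               cglContext,       // e.g. [[glView openGLContext] CGLContextObj]
                               cglPixelFormat,   // e.g. [[glView pixelFormat] CGLPixelFormatObj]
                               NULL,
                               &textureCache);

    [player play];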
An AVAssetReaderTrackOutput (added to your AVAssetReader) will vend CVPixelBufferRefs, for which you can specify your preferred pixel format, and which you can then upload to OpenGL via glTexImage or glTexSubImage.
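A rough sketch of that reader-based route, assuming asset is an already-loaded AVAsset and that 32BGRA is an acceptable output format (error handling abbreviated):

    #import <AVFoundation/AVFoundation.h>

    NSError *error = nil;
    AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:asset error:&error];   // asset is a placeholder

    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    NSDictionary *settings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    AVAssetReaderTrackOutput *trackOutput =
        [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:settings];
    [reader addOutput:trackOutput];
    [reader startReading];

    CMSampleBufferRef sampleBuffer = NULL;
    while ((sampleBuffer = [trackOutput copyNextSampleBuffer])) {
        CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        // ...lock the buffer and upload it with glTexImage2D / glTexSubImage2D...
        CFRelease(sampleBuffer);
    }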
I've been looking at this too, and this is my current solution:
    - (BOOL) renderWithCVPixelBufferForTime: (NSTimeInterval) time
    {
        CMTime vTime = [self.playeroutput itemTimeForHostTime:CACurrentMediaTime()];

        if ([self.playeroutput hasNewPixelBufferForItemTime:vTime]) {
            if (_cvPixelBufferRef) {
                CVPixelBufferUnlockBaseAddress(_cvPixelBufferRef, kCVPixelBufferLock_ReadOnly);
                CVPixelBufferRelease(_cvPixelBufferRef);
            }
            _cvPixelBufferRef = [self.playeroutput copyPixelBufferForItemTime:vTime itemTimeForDisplay:NULL];

            CVPixelBufferLockBaseAddress(_cvPixelBufferRef, kCVPixelBufferLock_ReadOnly);
            GLsizei texWidth = CVPixelBufferGetWidth(_cvPixelBufferRef);
            GLsizei texHeight = CVPixelBufferGetHeight(_cvPixelBufferRef);
            GLvoid *baseAddress = CVPixelBufferGetBaseAddress(_cvPixelBufferRef);

            glBindTexture(GL_TEXTURE_RECTANGLE_ARB, self.textureName);
            glTexParameteri(GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_STORAGE_HINT_APPLE, GL_STORAGE_CACHED_APPLE);
            glTexImage2D(GL_TEXTURE_RECTANGLE_ARB, 0, GL_RGB, texWidth, texHeight, 0, GL_YCBCR_422_APPLE, GL_UNSIGNED_SHORT_8_8_APPLE, baseAddress);
            glBindTexture(GL_TEXTURE_RECTANGLE_ARB, 0);
        }
        return YES;
    }
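A method like this is typically called once per display refresh, e.g. from a CVDisplayLink. A sketch of that wiring, assuming ARC and a hypothetical MyVideoRenderer class standing in for whatever object implements the method above:

    #import <QuartzCore/QuartzCore.h>
    #import <CoreVideo/CoreVideo.h>

    // Called on the display link's thread once per refresh.
    static CVReturn DisplayLinkCallback(CVDisplayLinkRef displayLink,
                                        const CVTimeStamp *now,
                                        const CVTimeStamp *outputTime,
                                        CVOptionFlags flagsIn,
                                        CVOptionFlags *flagsOut,
                                        void *context)
    {
        MyVideoRenderer *renderer = (__bridge MyVideoRenderer *)context;   // hypothetical owner of the method above
        [renderer renderWithCVPixelBufferForTime:CACurrentMediaTime()];
        return kCVReturnSuccess;
    }

    // Setup, e.g. in the renderer's init:
    CVDisplayLinkRef displayLink;
    CVDisplayLinkCreateWithActiveCGDisplays(&displayLink);
    CVDisplayLinkSetOutputCallback(displayLink, &DisplayLinkCallback, (__bridge void *)self);
    CVDisplayLinkStart(displayLink);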
However, I'm wondering whether there is a more efficient solution. I also have a question on the same topic that covers a few of the approaches:
Best path from AVPlayerItemVideoOutput to openGL Texture
The lock base address call seems to be a hog, and I'm not sure it's really needed.