I am trying to render I420 (planar YCbCr) frames with MetalKit.
Most of the examples use a CMSampleBuffer coming from the camera,
but my goal is to render raw I420 bytes that I already have.
I do it like this:
let data = NSMutableData(contentsOfURL: NSBundle.mainBundle().URLForResource("yuv_640_360", withExtension: "yuv")!)

// Texture cache used to turn the pixel buffer planes into Metal textures
CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, self.device!, nil, &videoTextureCache)

// Wrap the raw I420 bytes in a CVPixelBuffer
var pixelBuffer: CVPixelBuffer?
CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
    Int(size.width), Int(size.height),
    kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
    data.mutableBytes, Int(size.width),
    nil, nil,
    [
        "kCVPixelBufferMetalCompatibilityKey": true,
        "kCVPixelBufferOpenGLCompatibilityKey": true,
        "kCVPixelBufferIOSurfacePropertiesKey": []
    ],
    &pixelBuffer)

// Texture for the Y plane
var yTextureRef: Unmanaged<CVMetalTexture>?
let yWidth = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0)
let yHeight = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0)
let result = CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
    (videoTextureCache?.takeUnretainedValue())!,
    pixelBuffer, nil,
    MTLPixelFormat.R8Unorm,
    yWidth, yHeight, 0,
    &yTextureRef)
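The MTLTexture itself is then taken out of the CVMetalTexture roughly like this (a minimal sketch of the step that produces the yTexture mentioned below; the exact extraction code is not shown above and just follows the usual CVMetalTextureGetTexture pattern):

// Sketch: pull the MTLTexture out of the CVMetalTexture created above
var yTexture: MTLTexture?
if result == kCVReturnSuccess {
    if let yCVTexture = yTextureRef?.takeRetainedValue() {
        yTexture = CVMetalTextureGetTexture(yCVTexture)
    }
}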
Basically the code is almost the same as the other samples, except that I create the CVPixelBuffer myself.
I get no errors when creating the CVPixelBuffer and the CVMetalTexture,
but the resulting yTexture is always null.
How do I create a proper CVPixelBuffer and use it for rendering?
The IOSurface is the important part. I found that the IOSurface is always null if you create the CVPixelBuffer with CVPixelBufferCreateWithBytes or CVPixelBufferCreateWithPlanarBytes.
Once you use a CVPixelBuffer whose IOSurface is null, the Metal texture will always be null.
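A quick way to see this (a minimal sketch, assuming pixelBuffer is the buffer created with CVPixelBufferCreateWithBytes above) is to ask the buffer for its backing surface:

// CVPixelBufferGetIOSurface returns nil for buffers created with
// CVPixelBufferCreateWithBytes / CVPixelBufferCreateWithPlanarBytes,
// which is why the Metal texture creation keeps failing.
let surface = CVPixelBufferGetIOSurface(pixelBuffer)
print("IOSurface backing: \(surface)")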
You should do it like this instead:
let result = CVPixelBufferCreate(kCFAllocatorDefault, width, height,
    kCVPixelFormatType_420YpCbCr8Planar,
    [
        String(kCVPixelBufferIOSurfacePropertiesKey): [
            "IOSurfaceOpenGLESFBOCompatibility": true,
            "IOSurfaceOpenGLESTextureCompatibility": true,
            "IOSurfaceCoreAnimationCompatibility": true,
        ]
    ], &self.pixelBuffer)

CVPixelBufferLockBaseAddress(self.pixelBuffer!, 0)

// Copy the three I420 planes (Y, Cb, Cr) into the IOSurface-backed buffer.
// planesAddress / planesWidth / planesHeight describe the source I420 data.
// This assumes the destination planes are tightly packed; if
// CVPixelBufferGetBytesPerRowOfPlane differs from planesWidth, copy row by row instead.
for index in 0...2 {
    memcpy(CVPixelBufferGetBaseAddressOfPlane(self.pixelBuffer!, index),
           planesAddress[index],
           planesWidth[index] * planesHeight[index])
}

CVPixelBufferUnlockBaseAddress(self.pixelBuffer!, 0)
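From there the per-plane Metal textures can be created the same way as in the question. Roughly like this (a sketch, reusing the videoTextureCache from above and assuming one R8Unorm texture per plane of the 3-plane format):

// Sketch: one R8Unorm texture per plane (Y, Cb, Cr) of the IOSurface-backed buffer
var planeTextures = [MTLTexture]()
for plane in 0..<3 {
    let width = CVPixelBufferGetWidthOfPlane(self.pixelBuffer!, plane)
    let height = CVPixelBufferGetHeightOfPlane(self.pixelBuffer!, plane)
    var textureRef: Unmanaged<CVMetalTexture>?
    let status = CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
        (videoTextureCache?.takeUnretainedValue())!,
        self.pixelBuffer!, nil,
        MTLPixelFormat.R8Unorm,
        width, height, plane,
        &textureRef)
    if status == kCVReturnSuccess {
        if let texture = CVMetalTextureGetTexture(textureRef!.takeRetainedValue()) {
            planeTextures.append(texture)
        }
    }
}

With an IOSurface-backed buffer, CVPixelBufferGetIOSurface no longer returns nil and the textures come back non-null.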