Creating a CVPixelBufferRef from a CIImage for writing to a file

I am writing a custom movie-recording app and have set up an AVAssetWriter together with an AVAssetWriterInputPixelBufferAdaptor to write frames to a file. In the DataOutputDelegate callback I am trying to apply a CIFilter to the sampleBuffer. First I get a CVPixelBufferRef and create a CIImage from it, then I apply the CIFilter and grab the resulting CIImage:

    CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *image = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CIFilter *hueAdjust = [CIFilter filterWithName:@"CIHueAdjust"];
    [hueAdjust setDefaults];
    [hueAdjust setValue: image forKey: @"inputImage"];
    [hueAdjust setValue: [NSNumber numberWithFloat: 2.094]
                 forKey: @"inputAngle"];
    CIImage *result = [hueAdjust valueForKey: @"outputImage"];
    CVPixelBufferRef newBuffer = //Convert CIImage...

How do I convert the resulting CIImage so that I can call:

    [self.filteredImageWriter appendPixelBuffer:newBuffer withPresentationTime:lastSampleTime];
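For reference, a minimal sketch of the writer setup described above, assuming `filteredImageWriter` is the pixel buffer adaptor; the output URL, dimensions, codec settings, and the `firstSampleTime` variable are illustrative, not from the original post:

    NSError *error = nil;
    AVAssetWriter *assetWriter = [AVAssetWriter assetWriterWithURL:outputURL // hypothetical file URL
                                                          fileType:AVFileTypeQuickTimeMovie
                                                             error:&error];

    NSDictionary *videoSettings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                                     AVVideoWidthKey  : @1280,
                                     AVVideoHeightKey : @720 };
    AVAssetWriterInput *videoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                        outputSettings:videoSettings];
    videoInput.expectsMediaDataInRealTime = YES; // frames arrive live from the capture callback

    // Ask the adaptor for BGRA buffers so the pixel format matches the filtered output.
    NSDictionary *sourceAttributes = @{ (__bridge id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    self.filteredImageWriter = [AVAssetWriterInputPixelBufferAdaptor
        assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoInput
                                   sourcePixelBufferAttributes:sourceAttributes];

    [assetWriter addInput:videoInput];
    [assetWriter startWriting];
    [assetWriter startSessionAtSourceTime:firstSampleTime]; // the PTS of the first captured sample buffer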

UPDATED ANSWER

Although my previous approach worked, I was able to tweak it to simplify the code. It also seems to run slightly faster.

    samplePixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(samplePixelBuffer, 0); // NOT SURE IF NEEDED // NO PERFORMANCE IMPROVEMENTS IF REMOVED
    NSDictionary *options = [NSDictionary dictionaryWithObject:(__bridge id)rgbColorSpace forKey:kCIImageColorSpace];
    outputImage = [CIImage imageWithCVPixelBuffer:samplePixelBuffer options:options];
    //-----------------
    // FILTER OUTPUT IMAGE
    @autoreleasepool {
        outputImage = [self applyEffectToCIImage:outputImage dict:dict];
    }
    CVPixelBufferUnlockBaseAddress(samplePixelBuffer, 0); // NOT SURE IF NEEDED // NO PERFORMANCE IMPROVEMENTS IF REMOVED
    //-----------------
    // RENDER OUTPUT IMAGE BACK TO PIXEL BUFFER
    CGColorSpaceRef deviceRGB = CGColorSpaceCreateDeviceRGB();
    [self.filterContext render:outputImage toCVPixelBuffer:samplePixelBuffer bounds:[outputImage extent] colorSpace:deviceRGB]; // DOES NOT SEEM TO WORK USING rgbColorSpace
    CGColorSpaceRelease(deviceRGB); // release the color space so it is not leaked on every frame
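The snippet above relies on `self.filterContext`, `rgbColorSpace`, and `applyEffectToCIImage:dict:`, none of which are shown. The following is only a guess at what they might look like, with the one-time setup sketched in comments and the filter keys borrowed from the question:

    // Assumed one-time setup elsewhere (e.g. when the capture session is configured):
    //   self.filterContext = [CIContext contextWithOptions:nil];
    //   rgbColorSpace = CGColorSpaceCreateDeviceRGB();

    // A hypothetical filter helper in the spirit of applyEffectToCIImage:dict:,
    // reusing the CIHueAdjust filter from the question (the @"inputAngle" entry in
    // dict is an assumption):
    - (CIImage *)applyEffectToCIImage:(CIImage *)image dict:(NSDictionary *)dict {
        CIFilter *hueAdjust = [CIFilter filterWithName:@"CIHueAdjust"];
        [hueAdjust setDefaults];
        [hueAdjust setValue:image forKey:@"inputImage"];
        [hueAdjust setValue:[dict objectForKey:@"inputAngle"] forKey:@"inputAngle"];
        return [hueAdjust valueForKey:@"outputImage"];
    }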

PREVIOUS ANSWER

I just implemented the following to get a pixel buffer from the CIImage. Make sure your pixel formats are consistent, otherwise you will run into color problems. The CFDictionary is also very important. http://allmybrain.com/2011/12/08/rendering-to-a-texture-with-ios-5-texture-cache-api/

    CFDictionaryRef empty = CFDictionaryCreate(kCFAllocatorDefault, // EMPTY IOSURFACE DICT
                                               NULL,
                                               NULL,
                                               0,
                                               &kCFTypeDictionaryKeyCallBacks,
                                               &kCFTypeDictionaryValueCallBacks);
    CFMutableDictionaryRef attributes = CFDictionaryCreateMutable(kCFAllocatorDefault,
                                                                  1,
                                                                  &kCFTypeDictionaryKeyCallBacks,
                                                                  &kCFTypeDictionaryValueCallBacks);
    CFDictionarySetValue(attributes, kCVPixelBufferIOSurfacePropertiesKey, empty);
    CVPixelBufferRef pixelBuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                          outputImage.extent.size.width,
                                          outputImage.extent.size.height,
                                          kCVPixelFormatType_32BGRA,
                                          attributes,
                                          &pixelBuffer);
    CFRelease(attributes); // the buffer keeps what it needs, so release the CF dictionaries
    CFRelease(empty);
    if (status == kCVReturnSuccess && pixelBuffer != NULL) {
        CVPixelBufferLockBaseAddress(pixelBuffer, 0); // NOT SURE IF NEEDED // KEPT JUST IN CASE
        CGColorSpaceRef deviceRGB = CGColorSpaceCreateDeviceRGB();
        [self.filterContext render:outputImage toCVPixelBuffer:pixelBuffer bounds:[outputImage extent] colorSpace:deviceRGB];
        CGColorSpaceRelease(deviceRGB); // release the color space so it is not leaked per frame
        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0); // NOT SURE IF NEEDED // KEPT JUST IN CASE
    }
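To close the loop with the question, the freshly rendered buffer would then be appended through the adaptor and released; a short sketch, using the writer and timestamp names from the question:

    if (status == kCVReturnSuccess && pixelBuffer != NULL) {
        [self.filteredImageWriter appendPixelBuffer:pixelBuffer
                               withPresentationTime:lastSampleTime];
        CVPixelBufferRelease(pixelBuffer); // pixelBuffer came from CVPixelBufferCreate, so release it after appending
    }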
