AVAssetWriterInputPixelBufferAdaptor returns a null pixel buffer pool



I'm sure something is wrong with my buffer attributes, but it's not clear to me what — it isn't well documented what's supposed to go there, so I'm guessing based on CVPixelBufferPoolCreate, and Core Foundation is pretty much a closed book to me.

    // "width" and "height" are const ints
    CFNumberRef cfWidth = CFNumberCreate(kCFAllocatorDefault, kCFNumberIntType, &width);
    CFNumberRef cfHeight = CFNumberCreate(kCFAllocatorDefault, kCFNumberIntType, &height);
    CFStringRef keys[] = {
        kCVPixelBufferWidthKey,
        kCVPixelBufferHeightKey,
        kCVPixelBufferCGImageCompatibilityKey
    };
    CFTypeRef values[] = {
        cfWidth,
        cfHeight,
        kCFBooleanTrue
    };
    int numValues = sizeof(keys) / sizeof(keys[0]);
    CFDictionaryRef bufferAttributes = CFDictionaryCreate(kCFAllocatorDefault, 
                                                          (const void **)&keys, 
                                                          (const void **)&values,
                                                          numValues,
                                                          &kCFTypeDictionaryKeyCallBacks,
                                                          &kCFTypeDictionaryValueCallBacks
                                                          );
    AVAssetWriterInputPixelBufferAdaptor *adaptor = [[AVAssetWriterInputPixelBufferAdaptor 
                                                      assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                      sourcePixelBufferAttributes:(NSDictionary*)bufferAttributes] retain];
    CVPixelBufferPoolRef bufferPool = adaptor.pixelBufferPool;
    NSParameterAssert(bufferPool != NULL); // fails

When pixelBufferPool returns NULL, check the following (a short ordering sketch follows the list):

    1. The AVAssetWriter's output file does not already exist.
    2. The pixel buffer is only used after calling startSessionAtSourceTime: on the AVAssetWriter.
    3. The AVAssetWriterInput and AVAssetWriterInputPixelBufferAdaptor are set up correctly.
    4. The number of appendPixelBuffer: calls is not what you expect.
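
For points 1 and 2, the order of calls is what matters. A minimal ordering sketch — the names `videoWriter`, `writerInput`, and `adaptor` are placeholders for a writer, input, and adaptor configured as in the answers below:

    // The pool only becomes non-NULL once the writer has actually started writing.
    [videoWriter addInput:writerInput];
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];
    NSParameterAssert(adaptor.pixelBufferPool != NULL); // if this still fails, check videoWriter.status and videoWriter.error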

I had the same problem, and I think it may be because you haven't configured your AVAssetWriterInput correctly. My pool started working once I did the following. In particular, the pool would not give me pixel buffers until I supplied data for AVVideoCompressionPropertiesKey. First, create and fully configure the AVAssetWriter (look in /Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS4.3.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVVideoSettings.h for the keys and values to use in outputSettings and compressionSettings):

NSError * err = 0;
AVAssetWriter * outputWriter = [AVAssetWriter
    assetWriterWithURL: [NSURL fileURLWithPath:outputPath]
              fileType: AVFileTypeAppleM4V
                 error: & err];
NSMutableDictionary * outputSettings
    = [[NSMutableDictionary alloc] init];
[outputSettings setObject: AVVideoCodecH264
                   forKey: AVVideoCodecKey];
[outputSettings setObject: [NSNumber numberWithInt: width_]
                   forKey: AVVideoWidthKey];
[outputSettings setObject: [NSNumber numberWithInt: height_]
                   forKey: AVVideoHeightKey];
NSMutableDictionary * compressionProperties
    = [[NSMutableDictionary alloc] init];
[compressionProperties setObject: [NSNumber numberWithInt: 1000000]
                          forKey: AVVideoAverageBitRateKey];
[compressionProperties setObject: [NSNumber numberWithInt: 16]
                          forKey: AVVideoMaxKeyFrameIntervalKey];
[compressionProperties setObject: AVVideoProfileLevelH264Main31
                          forKey: AVVideoProfileLevelKey];
[outputSettings setObject: compressionProperties
                   forKey: AVVideoCompressionPropertiesKey];
AVAssetWriterInput * writerInput = [AVAssetWriterInput
    assetWriterInputWithMediaType: AVMediaTypeVideo
                   outputSettings: outputSettings];
[compressionProperties release];
[outputSettings release];

Create the pixel buffer adaptor:

NSMutableDictionary * pixBufSettings = [[NSMutableDictionary alloc] init];
[pixBufSettings setObject: [NSNumber numberWithInt: kCVPixelFormatType_32BGRA]
                   forKey: (NSString *) kCVPixelBufferPixelFormatTypeKey];
[pixBufSettings setObject: [NSNumber numberWithInt: width_]
                   forKey: (NSString *) kCVPixelBufferWidthKey];
[pixBufSettings setObject: [NSNumber numberWithInt: height_]
                   forKey: (NSString *) kCVPixelBufferHeightKey];
AVAssetWriterInputPixelBufferAdaptor * outputPBA =
    [AVAssetWriterInputPixelBufferAdaptor
    assetWriterInputPixelBufferAdaptorWithAssetWriterInput: writerInput
                               sourcePixelBufferAttributes: pixBufSettings];
[pixBufSettings release];

Then, when you need a frame, retrieve a pixel buffer from the adaptor's pool:

CVPixelBufferRef outputFrame = NULL;
CVReturn res = CVPixelBufferPoolCreatePixelBuffer(NULL,
    [outputPBA pixelBufferPool],
    &outputFrame);
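
Once you have a buffer from the pool, you would typically fill it and hand it back to the adaptor. A hedged sketch of that step — the `frameNumber` counter and the 30 fps timescale are assumptions, not part of the original answer:

if (res == kCVReturnSuccess && outputFrame != NULL) {
    // ... lock the buffer and draw/copy your frame data into it here ...
    if (![outputPBA appendPixelBuffer:outputFrame
                 withPresentationTime:CMTimeMake(frameNumber, 30)]) {
        NSLog(@"appendPixelBuffer failed, writer status %ld, error %@",
              (long)outputWriter.status, outputWriter.error);
    }
    CVPixelBufferRelease(outputFrame); // the pool hands back a +1 reference
}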

According to the documentation:

"This property will be NULL before the first call to startSessionAtSourceTime: on the associated AVAssetWriter object."

So if you try to access the pool too early, it will be NULL. I'm still learning this stuff myself, so I can't elaborate much further at this point.

For everyone still looking for a solution: first, make sure your AVAssetWriter is actually working by checking its status. I ran into this problem, and after checking the status I found that although I had called the start methods, the writer had never actually started. (In my case, I had pointed the output path at an existing file; after deleting it, everything worked like a charm.)
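
A short sketch of both checks; `outputPath` and `videoWriter` are placeholder names for your own output path and writer:

// 1. Make sure nothing already exists at the output path before creating the writer.
[[NSFileManager defaultManager] removeItemAtPath:outputPath error:NULL];

// ... create the writer, add the input, startWriting, startSessionAtSourceTime: ...

// 2. If the pool is still NULL, inspect the writer's status and error.
if (videoWriter.status == AVAssetWriterStatusFailed) {
    NSLog(@"writer failed: %@", videoWriter.error);
}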

I got it working! Setting the compatibility keys in the options dictionary, as they suggest, is what lets you use the buffer pool. Here is a working example; it creates its buffers directly rather than from the pool, but it's a good start.

Here is the link to the sample code.

And here is the code you need:

- (void)testCompressionSession
{
    CGSize size = CGSizeMake(480, 320);

    NSString *betaCompressionDirectory = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.m4v"];
    NSError *error = nil;
    unlink([betaCompressionDirectory UTF8String]);

    //----initialize compression engine
    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:betaCompressionDirectory]
                                                           fileType:AVFileTypeQuickTimeMovie
                                                              error:&error];
    NSParameterAssert(videoWriter);
    if (error)
        NSLog(@"error = %@", [error localizedDescription]);

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:size.width], AVVideoWidthKey,
                                   [NSNumber numberWithInt:size.height], AVVideoHeightKey, nil];
    AVAssetWriterInput *writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];

    NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
                                                           [NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey, nil];
    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                                                                                     sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];
    NSParameterAssert(writerInput);
    NSParameterAssert([videoWriter canAddInput:writerInput]);
    if ([videoWriter canAddInput:writerInput])
        NSLog(@"I can add this input");
    else
        NSLog(@"I can't add this input");

    [videoWriter addInput:writerInput];
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];

    //---
    // insert demo debugging code to write the same image repeated as a movie
    CGImageRef theImage = [[UIImage imageNamed:@"Lotus.png"] CGImage];

    dispatch_queue_t dispatchQueue = dispatch_queue_create("mediaInputQueue", NULL);
    int __block frame = 0;

    [writerInput requestMediaDataWhenReadyOnQueue:dispatchQueue usingBlock:^{
        while ([writerInput isReadyForMoreMediaData])
        {
            if (++frame >= 120)
            {
                [writerInput markAsFinished];
                [videoWriter finishWriting];
                [videoWriter release];
                break;
            }

            CVPixelBufferRef buffer = (CVPixelBufferRef)[self pixelBufferFromCGImage:theImage size:size];
            if (buffer)
            {
                if (![adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(frame, 20)])
                    NSLog(@"FAIL");
                else
                    NSLog(@"Success:%d", frame);
                CFRelease(buffer);
            }
        }
    }];
    NSLog(@"outside for loop");
}

- (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image size:(CGSize)size
{
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey, nil];
    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width, size.height, kCVPixelFormatType_32ARGB, (CFDictionaryRef)options, &pxbuffer);
    // CVReturn status = CVPixelBufferPoolCreatePixelBuffer(NULL, adaptor.pixelBufferPool, &pxbuffer);
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    // use the buffer's actual bytes-per-row, which may be padded beyond 4 * width
    CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height, 8,
                                                 CVPixelBufferGetBytesPerRow(pxbuffer),
                                                 rgbColorSpace, kCGImageAlphaPremultipliedFirst);
    NSParameterAssert(context);
    CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)), image);

    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);
    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
    return pxbuffer;
}

AVAssetWriter needs there to be no file at its outputURL, so remove any existing file first:

extension FileManager {
    func removeItemIfExist(at url: URL) {
        do {
            if FileManager.default.fileExists(atPath: url.path) {
                try FileManager.default.removeItem(at: url)
            }
        } catch {
            fatalError("(error)")
        }
    }
}

Usage:

let assetWriter = try? AVAssetWriter(outputURL: outputURL, fileType: .mov)
FileManager.default.removeItemIfExist(at: outputURL)
// do something
