I'm trying to create an H.264 compression session with the data from my screen. I've created a CGDisplayStreamRef instance like so:
displayStream = CGDisplayStreamCreateWithDispatchQueue(0, 100, 100, k32BGRAPixelFormat, nil, self.screenCaptureQueue, ^(CGDisplayStreamFrameStatus status, uint64_t displayTime, IOSurfaceRef frameSurface, CGDisplayStreamUpdateRef updateRef) {
//Call encoding session here
});
Here is how I currently have my encode function set up:
- (void)encode:(CMSampleBufferRef)sampleBuffer {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CMTime presentationTimeStamp = CMTimeMake(frameID++, 1000);
    VTEncodeInfoFlags flags;
    OSStatus statusCode = VTCompressionSessionEncodeFrame(EncodingSession,
                                                          imageBuffer,
                                                          presentationTimeStamp,
                                                          kCMTimeInvalid,
                                                          NULL, NULL, &flags);
    if (statusCode != noErr) {
        NSLog(@"H264: VTCompressionSessionEncodeFrame failed with %d", (int)statusCode);
        VTCompressionSessionInvalidate(EncodingSession);
        CFRelease(EncodingSession);
        EncodingSession = NULL;
        return;
    }
    NSLog(@"H264: VTCompressionSessionEncodeFrame Success");
}
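For context, the EncodingSession referenced above needs to be created before any frames are submitted. A minimal sketch of that setup, assuming a 1920x1080 display, an output callback named didCompressH264, and a frameID ivar (all assumptions, not from the original code):

// Hedged sketch: creates the VTCompressionSessionRef that encode: uses.
// Dimensions and the callback name are assumed for illustration.
- (void)setupEncoder {
    OSStatus status = VTCompressionSessionCreate(NULL,       // allocator
                                                 1920, 1080, // assumed dimensions
                                                 kCMVideoCodecType_H264,
                                                 NULL, NULL, NULL,
                                                 didCompressH264,          // assumed output callback
                                                 (__bridge void *)self,
                                                 &EncodingSession);
    if (status != noErr) {
        NSLog(@"H264: VTCompressionSessionCreate failed with %d", (int)status);
        return;
    }
    VTSessionSetProperty(EncodingSession, kVTCompressionPropertyKey_RealTime, kCFBooleanTrue);
    VTCompressionSessionPrepareToEncodeFrames(EncodingSession);
    frameID = 0;
}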
I'm trying to understand how to convert the data from my screen into a CMSampleBufferRef so that I can properly call my encode function. So far I haven't been able to determine whether this is possible, or whether it's even the right approach for what I'm trying to do. Does anyone have any suggestions?
Edit: I've converted the IOSurface into a CMBlockBuffer, but haven't yet figured out how to convert that into a CMSampleBufferRef:
void *mem = IOSurfaceGetBaseAddress(frameSurface);
size_t bytesPerRow = IOSurfaceGetBytesPerRow(frameSurface);
size_t height = IOSurfaceGetHeight(frameSurface);
size_t totalBytes = bytesPerRow * height;
CMBlockBufferRef blockBuffer;
CMBlockBufferCreateWithMemoryBlock(kCFAllocatorNull, mem, totalBytes, kCFAllocatorNull, NULL, 0, totalBytes, 0, &blockBuffer);
Edit 2:
More progress:
CMSampleBufferRef sampleBuffer; // declare the buffer itself, not an uninitialized pointer
OSStatus sampleStatus = CMSampleBufferCreate(
                            NULL, blockBuffer, TRUE, NULL, NULL,
                            NULL, 1, 1, NULL,
                            0, NULL, &sampleBuffer); // pass its address as the out-parameter
[self encode:sampleBuffer];
I may be a bit late, but it may nevertheless help others:
CGDisplayStreamCreateWithDispatchQueue(CGMainDisplayID(), 100, 100, k32BGRAPixelFormat, nil, self.screenCaptureQueue, ^(CGDisplayStreamFrameStatus status, uint64_t displayTime, IOSurfaceRef frameSurface, CGDisplayStreamUpdateRef updateRef) {
    // Only complete frames carry a surface; skip the other statuses.
    if (status != kCGDisplayStreamFrameStatusFrameComplete) {
        return;
    }
    // The created pixel buffer retains the surface object.
    CVPixelBufferRef pixelBuffer;
    CVPixelBufferCreateWithIOSurface(NULL, frameSurface, NULL, &pixelBuffer);
    // Create the video-type-specific description for the pixel buffer.
    CMVideoFormatDescriptionRef videoFormatDescription;
    CMVideoFormatDescriptionCreateForImageBuffer(NULL, pixelBuffer, &videoFormatDescription);
    // All the necessary parts for creating a `CMSampleBuffer` are ready.
    // `timingInfo` must be initialized before use; fill it from `displayTime`
    // if you need real timestamps, or use kCMTimingInfoInvalid as a placeholder.
    CMSampleBufferRef sampleBuffer;
    CMSampleTimingInfo timingInfo = kCMTimingInfoInvalid;
    CMSampleBufferCreateReadyWithImageBuffer(NULL, pixelBuffer, videoFormatDescription, &timingInfo, &sampleBuffer);
    // Do the stuff
    // Release the resources to let the frame surface be reused in the queue
    // `kCGDisplayStreamQueueDepth` is responsible for the size of the queue
    CFRelease(sampleBuffer);
    CFRelease(videoFormatDescription);
    CFRelease(pixelBuffer);
});
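One detail worth adding (not part of the original answer): the stream delivers no frames until it is explicitly started, and it should be stopped and released when capture ends. A short sketch, assuming the stream was stored in a displayStream ivar:

// Hedged sketch: starting and tearing down the display stream.
CGError err = CGDisplayStreamStart(displayStream);
if (err != kCGErrorSuccess) {
    NSLog(@"Could not start the display stream: %d", err);
}
// ... later, when capture should end:
CGDisplayStreamStop(displayStream);
CFRelease(displayStream);
displayStream = NULL;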