Memory warning in an OpenGL iOS application



I am developing a graphics-intensive iOS application. In one instance, our app's memory footprint sits at 250 MB. I grab each frame from the camera, process it with OpenGL shaders, and extract some data. Each time I use the camera to capture frames for processing, I see the memory climb to 280 MB. When I stop capturing frames, memory drops back to 250 MB. If I repeat the process of starting the camera and exiting, say, 10 times, I receive a memory warning (even though no memory leak is observed). I am not using ARC here. I maintain an autorelease pool that wraps the entire processing of a frame, and I see no leaks while profiling. After the 10 iterations the memory still appears to settle at 250 MB, so I am not sure what is causing the memory warning. Any insights? I am happy to provide any other information you need. OpenGL version - ES 2.0, iOS version - 7.0
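
For illustration, the per-frame wrapping described above would look roughly like the sketch below; the delegate method, the queue it runs on, and the processFrame: helper are placeholders, not the actual code:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    @autoreleasepool {
        // Wrap the entire processing of one frame so that autoreleased
        // temporaries are drained before the next frame arrives.
        CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        [self processFrame:pixelBuffer]; // hypothetical helper: run the shaders, extract the data
    }
}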

You should use ARC; it automatically releases memory that is no longer needed and helps keep your application optimized.

Per some other questions, such as this one (Crash running OpenGL on iOS after memory warning) and this one (iOS Instruments: Why does Memory Monitor disagree with Allocations?), the problem is probably that you are not deleting your OpenGL resources (VBOs, textures, renderbuffers, etc.) once you are done with them.
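
As a rough illustration of what deleting those resources looks like (the _texture, _vbo, _depthRenderbuffer and _framebuffer ivars here are placeholders for whatever your renderer actually owns), a teardown along these lines would run on the thread that owns the EAGLContext whenever you stop capturing:

- (void)teardownGL
{
    [EAGLContext setCurrentContext:_context]; // the deletes only apply to the current context
    if (_texture) {
        glDeleteTextures(1, &_texture);
        _texture = 0;
    }
    if (_vbo) {
        glDeleteBuffers(1, &_vbo);
        _vbo = 0;
    }
    if (_depthRenderbuffer) {
        glDeleteRenderbuffers(1, &_depthRenderbuffer);
        _depthRenderbuffer = 0;
    }
    if (_framebuffer) {
        glDeleteFramebuffers(1, &_framebuffer);
        _framebuffer = 0;
    }
    glFinish(); // give the driver a chance to reclaim the memory right away
}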

Without seeing the code, who knows? Are you doing nothing more than rendering the framebuffer with EAGLContext's presentRenderbuffer method? Then, what are you doing with the pixelBuffer you pass to CVOpenGLESTextureCacheCreateTextureFromImage? In the typical usage scenario, the pixel buffer is the single substantial source of memory.
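
For what it's worth, when camera frames are wrapped in GL textures through a CVOpenGLESTextureCache, the usual per-frame pattern is to release the texture and flush the cache as soon as the frame has been processed. A minimal sketch, assuming a _videoTextureCache created with CVOpenGLESTextureCacheCreate and a pixelBuffer taken from the current sample buffer:

CVOpenGLESTextureRef texture = NULL;
CVReturn err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                            _videoTextureCache,
                                                            pixelBuffer,
                                                            NULL,
                                                            GL_TEXTURE_2D,
                                                            GL_RGBA,
                                                            (GLsizei)CVPixelBufferGetWidth(pixelBuffer),
                                                            (GLsizei)CVPixelBufferGetHeight(pixelBuffer),
                                                            GL_BGRA,
                                                            GL_UNSIGNED_BYTE,
                                                            0,
                                                            &texture);
if (err == kCVReturnSuccess) {
    glBindTexture(CVOpenGLESTextureGetTarget(texture), CVOpenGLESTextureGetName(texture));
    // ... run the shaders and read back the extracted data here ...
    CFRelease(texture);                                  // release the per-frame texture wrapper
    CVOpenGLESTextureCacheFlush(_videoTextureCache, 0);  // let the cache recycle its backing stores
}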

However, if you are swapping the data in the render buffer to another buffer with, say, glReadPixels, then you have introduced one of several potential memory hogs. If the buffer you are swapping to is a CoreGraphics buffer via, say, a CGDataProvider, did you include a data-release callback, or did you pass nil as that parameter when you created the provider? Did you glFlush after swapping buffers?

I could determine the answers to these questions if you provided the code; if you think you can solve the problem without doing that, but would like to see working code that successfully manages memory in the most arduous use-case scenario there could possibly be, then here it is:

https://demonicactivity.blogspot.com/2016/11/tech-serious-ios-developers-use-every.html

For your convenience, I've provided some code below. Place it after any call to the presentRenderbuffer method, commenting out that call if you do not want to render the buffer to the display in the CAEAGLLayer (as I did in the example below):

//[_context presentRenderbuffer:GL_RENDERBUFFER];

dispatch_async(dispatch_get_main_queue(), ^{
    @autoreleasepool {
        // To capture the output to an OpenGL render buffer...
        NSInteger myDataLength = _backingWidth * _backingHeight * 4;
        GLubyte *buffer = (GLubyte *)malloc(myDataLength);
        glPixelStorei(GL_PACK_ALIGNMENT, 4); // pack alignment governs glReadPixels; 4 matches the tightly packed RGBA rows
        glReadPixels(0, 0, _backingWidth, _backingHeight, GL_RGBA, GL_UNSIGNED_BYTE, buffer);

        // To swap the pixel buffer to a CoreGraphics context (as a CGImage)
        CGDataProviderRef provider = NULL;
        CGColorSpaceRef colorSpaceRef = NULL;
        CGImageRef imageRef = NULL;
        CVPixelBufferRef pixelBuffer = NULL;
        @try {
            provider = CGDataProviderCreateWithData(NULL, buffer, myDataLength, &releaseDataCallback);
            int bitsPerComponent = 8;
            int bitsPerPixel = 32;
            int bytesPerRow = 4 * _backingWidth;
            colorSpaceRef = CGColorSpaceCreateDeviceRGB();
            CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault;
            CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;
            imageRef = CGImageCreate(_backingWidth, _backingHeight, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpaceRef, bitmapInfo, provider, NULL, NO, renderingIntent);
        } @catch (NSException *exception) {
            NSLog(@"Exception: %@", [exception reason]);
        } @finally {
            if (imageRef) {
                // To convert the CGImage to a pixel buffer (for writing to a file using AVAssetWriter)
                pixelBuffer = [CVCGImageUtil pixelBufferFromCGImage:imageRef];
                // To verify the integrity of the pixel buffer (by converting it back to a CGImage, and then displaying it in a layer)
                CGImageRef verifyImage = [CVCGImageUtil cgImageFromPixelBuffer:pixelBuffer context:_ciContext];
                imageLayer.contents = (__bridge id)verifyImage; // the layer retains its contents
                CGImageRelease(verifyImage);
                CVPixelBufferRelease(pixelBuffer);
            }
            CGDataProviderRelease(provider);
            CGColorSpaceRelease(colorSpaceRef);
            CGImageRelease(imageRef);
        }
    }
});


The callback that frees the data held by the CGDataProvider instance:

static void releaseDataCallback (void *info, const void *data, size_t size) {
    free((void *)data);
}

The CVCGImageUtil class interface and implementation files, respectively:

@import Foundation;
@import CoreMedia;
@import CoreGraphics;
@import QuartzCore;
@import CoreImage;
@import UIKit;
@interface CVCGImageUtil : NSObject
+ (CGImageRef)cgImageFromPixelBuffer:(CVPixelBufferRef)pixelBuffer context:(CIContext *)context;
+ (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image;
+ (CMSampleBufferRef)sampleBufferFromCGImage:(CGImageRef)image;
@end
#import "CVCGImageUtil.h"
@implementation CVCGImageUtil
+ (CGImageRef)cgImageFromPixelBuffer:(CVPixelBufferRef)pixelBuffer context:(CIContext *)context
{
    // CVPixelBuffer to CoreImage
    CIImage *image = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    image = [image imageByApplyingTransform:CGAffineTransformMakeRotation(M_PI)];
    CGPoint origin = [image extent].origin;
    image = [image imageByApplyingTransform:CGAffineTransformMakeTranslation(-origin.x, -origin.y)];
    // CoreImage to CGImage via CoreImage context
    // (createCGImage:fromRect: follows the Create Rule; the caller must release the returned CGImage)
    CGImageRef cgImage = [context createCGImage:image fromRect:[image extent]];
    // CGImage to UIImage (OPTIONAL)
    //UIImage *uiImage = [UIImage imageWithCGImage:cgImage];
    //return (CGImageRef)uiImage.CGImage;
    return cgImage;
}
+ (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image
{
    CGSize frameSize = CGSizeMake(CGImageGetWidth(image),
                                  CGImageGetHeight(image));
    NSDictionary *options =
    [NSDictionary dictionaryWithObjectsAndKeys:
     [NSNumber numberWithBool:YES],
     kCVPixelBufferCGImageCompatibilityKey,
     [NSNumber numberWithBool:YES],
     kCVPixelBufferCGBitmapContextCompatibilityKey,
     nil];
    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status =
    CVPixelBufferCreate(
                        kCFAllocatorDefault, frameSize.width, frameSize.height,
                        kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef)options,
                        &pxbuffer);
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);
    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(
                                                 pxdata, frameSize.width, frameSize.height,
                                                 8, CVPixelBufferGetBytesPerRow(pxbuffer),
                                                 rgbColorSpace,
                                                 (CGBitmapInfo)kCGBitmapByteOrder32Little |
                                                 kCGImageAlphaPremultipliedFirst);
    CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
                                           CGImageGetHeight(image)), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);
    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
    // The returned pixel buffer is +1 (Create Rule); the caller must release it.
    return pxbuffer;
}
+ (CMSampleBufferRef)sampleBufferFromCGImage:(CGImageRef)image
{
    CVPixelBufferRef pixelBuffer = [CVCGImageUtil pixelBufferFromCGImage:image];
    CMSampleBufferRef newSampleBuffer = NULL;
    CMSampleTimingInfo timingInfo = kCMTimingInfoInvalid;
    CMVideoFormatDescriptionRef videoInfo = NULL;
    CMVideoFormatDescriptionCreateForImageBuffer(
                                                 NULL, pixelBuffer, &videoInfo);
    CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault,
                                       pixelBuffer,
                                       true,
                                       NULL,
                                       NULL,
                                       videoInfo,
                                       &timingInfo,
                                       &newSampleBuffer);
    // The sample buffer retains what it needs; balance the Create calls above
    // so the pixel buffer and the format description do not leak.
    if (videoInfo) {
        CFRelease(videoInfo);
    }
    CVPixelBufferRelease(pixelBuffer);
    return newSampleBuffer;
}
@end
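
One thing to keep in mind when using the class above: pixelBufferFromCGImage and sampleBufferFromCGImage return +1 Core Foundation objects (as does cgImageFromPixelBuffer), so the caller must release them when finished. A usage sketch, assuming an already-configured AVAssetWriterInputPixelBufferAdaptor named _pixelBufferAdaptor and a presentation time frameTime computed elsewhere:

CVPixelBufferRef pxbuffer = [CVCGImageUtil pixelBufferFromCGImage:imageRef];
if (pxbuffer) {
    [_pixelBufferAdaptor appendPixelBuffer:pxbuffer withPresentationTime:frameTime];
    CVPixelBufferRelease(pxbuffer); // balance the +1 from CVPixelBufferCreate
}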
