Screenshot of a screen with three embedded controllers, one of which is a UIImagePicker in video mode



I need to perform a live screen capture of the entire iPhone screen. The screen has three container views embedded in it, and one of those containers holds a UIImagePickerController. Everything else on the screen captures beautifully, but the container with the UIImagePickerController comes out black. I need a screenshot of the whole screen so that the flow of the operation appears seamless. Is there a way to capture what the UIImagePickerController is currently displaying on screen? Below is the code I use to capture the screen image.

I have also tried Apple's Technical Q&A QA1703.

UIGraphicsBeginImageContextWithOptions(myView.bounds.size, YES, 0.0f);
[myView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
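
(For comparison, a variant of the same snapshot using UIView's drawViewHierarchyInRect:afterScreenUpdates: is sketched below. It is available from iOS 7 and renders what is actually displayed rather than the layer tree, though it may still leave the camera preview black on some versions.)

// Sketch of an alternative snapshot using -drawViewHierarchyInRect:afterScreenUpdates: (iOS 7+).
// It may still not capture the live camera preview, depending on the iOS version.
UIGraphicsBeginImageContextWithOptions(myView.bounds.size, YES, 0.0f);
[myView drawViewHierarchyInRect:myView.bounds afterScreenUpdates:NO];
UIImage *altImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();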

Thanks in advance for your help!

I ran into a similar problem a while back when trying to capture a screenshot of a screen containing both a GLKView and a UIImagePickerController. Sometimes I would get a black screen, and other times I would get complaints about an invalid context (using code similar to yours). I never found a solution, so I implemented an AVFoundation camera instead and have never looked back. Here is some quick source code to help you out.

CameraViewController.h

// Frameworks
#import <CoreVideo/CoreVideo.h>
#import <CoreMedia/CoreMedia.h>
#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>
@interface CameraViewController : UIViewController <AVCaptureVideoDataOutputSampleBufferDelegate>
// Camera
@property (strong, nonatomic) AVCaptureSession* captureSession;
@property (strong, nonatomic) AVCaptureVideoPreviewLayer* previewLayer;
@property (strong, nonatomic) UIImage* cameraImage;
@end

CameraViewController.m

#import "CameraViewController.h"
@implementation CameraViewController
- (void)viewDidLoad
{
    [super viewDidLoad];
    [self setupCamera];
}
- (void)setupCamera
{
    // Use the default video capture device (the camera) as the session input
    AVCaptureDeviceInput* input = [AVCaptureDeviceInput deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] error:nil];

    // Video data output delivers raw frames to the sample buffer delegate
    AVCaptureVideoDataOutput* output = [[AVCaptureVideoDataOutput alloc] init];
    output.alwaysDiscardsLateVideoFrames = YES;

    // Deliver frames on a dedicated serial queue
    dispatch_queue_t queue = dispatch_queue_create("cameraQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];

    // Request 32BGRA pixel buffers so they can be wrapped in a CGBitmapContext later
    NSString* key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [output setVideoSettings:videoSettings];

    self.captureSession = [[AVCaptureSession alloc] init];
    [self.captureSession addInput:input];
    [self.captureSession addOutput:output];
    [self.captureSession setSessionPreset:AVCaptureSessionPresetPhoto];

    // Preview layer shows the live camera feed in the view
    self.previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;

    // CHECK FOR YOUR APP: adjust the frame and orientation to match your own layout
    self.previewLayer.frame = CGRectMake(0, 0, self.view.frame.size.height, self.view.frame.size.width);
    self.previewLayer.orientation = AVCaptureVideoOrientationLandscapeRight;
    // CHECK FOR YOUR APP

    [self.view.layer insertSublayer:self.previewLayer atIndex:0];
    [self.captureSession startRunning];
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    // Wrap the BGRA pixel buffer in a bitmap context and convert it to a UIImage.
    // Note: this is called on the capture queue, not the main thread.
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef newImage = CGBitmapContextCreateImage(newContext);
    CGContextRelease(newContext);
    CGColorSpaceRelease(colorSpace);

    // Keep the most recent frame around so -snapshot can return it on demand
    self.cameraImage = [UIImage imageWithCGImage:newImage];
    CGImageRelease(newImage);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
}
// Call whenever you need a snapshot
- (UIImage *)snapshot
{
    NSLog(@"SNAPSHOT");
    return self.cameraImage;
}
@end
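
If it helps, here is a minimal sketch of how this controller might be embedded in place of the UIImagePickerController container; cameraContainerView is an assumed outlet on your parent controller, not something defined above.

// Minimal embedding sketch (assumed name: cameraContainerView is your container view).
CameraViewController *cameraVC = [[CameraViewController alloc] init];
[self addChildViewController:cameraVC];
cameraVC.view.frame = self.cameraContainerView.bounds;
[self.cameraContainerView addSubview:cameraVC.view];
[cameraVC didMoveToParentViewController:self];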

This code captures the input image at the selected session preset (Photo in this case: 852x640), so if you want to capture it along with your view, I would suggest one of the following options:

  1. Scale, crop, and translate the captured image after the fact. Pro: the camera keeps running smoothly. Con: more code (see the sketch after this list).
  2. Add a UIImageView in place of the previewLayer and update its image inside the captureOutput delegate. Pro: what you see is what you get. Con: it may make your camera run more slowly.
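
For option 1, here is a rough sketch of the crop-and-scale step; cropRect and targetSize are placeholder values you would compute from your own layout, not values taken from the code above.

// Rough sketch for option 1: crop the latest camera frame, then scale it to the size
// the preview occupies on screen. cropRect and targetSize are placeholders.
CGRect cropRect = CGRectMake(0.0f, 0.0f, 640.0f, 640.0f);
CGImageRef croppedRef = CGImageCreateWithImageInRect(self.cameraImage.CGImage, cropRect);
UIImage *cropped = [UIImage imageWithCGImage:croppedRef];
CGImageRelease(croppedRef);
CGSize targetSize = CGSizeMake(320.0f, 320.0f);
UIGraphicsBeginImageContextWithOptions(targetSize, YES, 0.0f);
[cropped drawInRect:CGRectMake(0.0f, 0.0f, targetSize.width, targetSize.height)];
UIImage *scaled = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();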

In either case, you will need to merge the resulting capture with your other images after taking the screenshot (which is not as hard as it sounds).
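
A minimal sketch of that merge, assuming viewSnapshot is the renderInContext capture of the rest of the screen, cameraFrame is the cropped/scaled camera image from above, and containerFrame is the on-screen rect of the camera container (all assumed names):

// Merge sketch: draw the view snapshot, then composite the camera frame on top at the
// container's rect. viewSnapshot, cameraFrame, and containerFrame are assumed to exist.
UIGraphicsBeginImageContextWithOptions(viewSnapshot.size, YES, 0.0f);
[viewSnapshot drawInRect:CGRectMake(0.0f, 0.0f, viewSnapshot.size.width, viewSnapshot.size.height)];
[cameraFrame drawInRect:containerFrame];
UIImage *composite = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();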

AVFoundation and its associated frameworks can be quite intimidating, so this is a very stripped-down implementation that gets you what you are after. If you want more detail, take a look at the following examples:

  • iOS 4 and direct access to the camera
  • Screenshot - a legitimate way to take a screenshot

Hope that helps!
