iOS/Objective-C: Show a live camera preview without an image picker controller



> When asking for a profile photo, I would like the screen to open onto a live camera view rather than showing a static photo image. After that, I don't mind using UIImagePickerController to actually capture the photo. However, I want the user to see something live right away, not a static image.

Do I need to use AVFoundation, as the Swift answers do, or what is the simplest way to do this in Objective-C?

Here is some code from that SO question that uses AVFoundation in Swift; however, my Swift is weak and I would like to do this in Objective-C:

extension SelfieViewController: AVCaptureVideoDataOutputSampleBufferDelegate {

    func setupAVCapture() {
        session.sessionPreset = AVCaptureSessionPreset640x480
        let devices = AVCaptureDevice.devices()
        // Loop through all the capture devices on this phone
        for device in devices {
            // Make sure this particular device supports video
            if (device.hasMediaType(AVMediaTypeVideo)) {
                // Finally check the position and confirm we've got the front camera
                if (device.position == AVCaptureDevicePosition.Front) {
                    captureDevice = device as? AVCaptureDevice
                    if captureDevice != nil {
                        beginSession()
                        break
                    }
                }
            }
        }
    }

    func beginSession() {
        var err: NSError? = nil
        var deviceInput: AVCaptureDeviceInput = AVCaptureDeviceInput(device: captureDevice, error: &err)
        if err != nil {
            println("error: \(err?.localizedDescription)")
        }
        if self.session.canAddInput(deviceInput) {
            self.session.addInput(deviceInput)
        }

        self.videoDataOutput = AVCaptureVideoDataOutput()
        var rgbOutputSettings = [NSNumber(integer: kCMPixelFormat_32BGRA): kCVPixelBufferPixelFormatTypeKey]
        self.videoDataOutput.alwaysDiscardsLateVideoFrames = true
        self.videoDataOutputQueue = dispatch_queue_create("VideoDataOutputQueue", DISPATCH_QUEUE_SERIAL)
        self.videoDataOutput.setSampleBufferDelegate(self, queue: self.videoDataOutputQueue)
        if session.canAddOutput(self.videoDataOutput) {
            session.addOutput(self.videoDataOutput)
        }
        self.videoDataOutput.connectionWithMediaType(AVMediaTypeVideo).enabled = true

        self.previewLayer = AVCaptureVideoPreviewLayer(session: self.session)
        self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill

        var rootLayer: CALayer = self.cameraView.layer
        rootLayer.masksToBounds = true
        self.previewLayer.frame = rootLayer.bounds
        rootLayer.addSublayer(self.previewLayer)
        session.startRunning()
    }

    func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
        // do stuff here
    }

    // clean up AVCapture
    func stopCamera() {
        session.stopRunning()
    }
}

Thanks in advance for any suggestions.

I simply translated the sample code into Objective-C. If you want to dig deeper, you can take a look at my project FaceDetectionDemo. Hope it helps.

- (void)setupAVCapture {
    NSError *error = nil;

    // Select device
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone) {
        [session setSessionPreset:AVCaptureSessionPreset640x480];
    } else {
        [session setSessionPreset:AVCaptureSessionPresetPhoto];
    }

    AVCaptureDevice *device = [self findFrontCamera];
    if (nil == device) {
        self.isUsingFrontFacingCamera = NO;
        device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    }

    // get the input device
    AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (error) {
        session = nil;
        [self teardownAVCapture];
        if ([_delegate respondsToSelector:@selector(FaceDetectionComponentError:error:)]) {
            __weak typeof(self) weakSelf = self;
            dispatch_async(dispatch_get_main_queue(), ^{
                [weakSelf.delegate FaceDetectionComponentError:weakSelf error:error];
            });
        }
        return;
    }

    // add the input to the session
    if ([session canAddInput:deviceInput]) {
        [session addInput:deviceInput];
    }

    // Make a video data output
    self.videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];

    // We want BGRA, both CoreGraphics and OpenGL work well with 'BGRA'
    NSDictionary *rgbOutputSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCMPixelFormat_32BGRA]
                                                                  forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    [self.videoDataOutput setVideoSettings:rgbOutputSettings];
    [self.videoDataOutput setAlwaysDiscardsLateVideoFrames:YES]; // discard if the data output queue is blocked

    self.videoDataOutputQueue = dispatch_queue_create("VideoDataOutputQueue", DISPATCH_QUEUE_SERIAL);
    [self.videoDataOutput setSampleBufferDelegate:self queue:self.videoDataOutputQueue];
    if ([session canAddOutput:self.videoDataOutput]) {
        [session addOutput:self.videoDataOutput];
    }
    [[self.videoDataOutput connectionWithMediaType:AVMediaTypeVideo] setEnabled:YES];

    // Attach the live preview layer to the host view
    self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    self.previewLayer.backgroundColor = [[UIColor blackColor] CGColor];
    self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspect;

    CALayer *rootLayer = [self.previewView layer];
    [rootLayer setMasksToBounds:YES];
    [self.previewLayer setFrame:[rootLayer bounds]];
    [rootLayer addSublayer:self.previewLayer];

    [session startRunning];
}

- (AVCaptureDevice *)findFrontCamera {
    AVCaptureDevicePosition desiredPosition = AVCaptureDevicePositionFront;
    for (AVCaptureDevice *d in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if ([d position] == desiredPosition) {
            self.isUsingFrontFacingCamera = YES;
            return d;
        }
    }
    return nil;
}

// AVCaptureVideoDataOutputSampleBufferDelegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    // Each video frame lands here; do per-frame processing (e.g. face detection) if needed
}
