iOS Swift - WebRTC: change from the front camera to the back camera



By default, WebRTC video uses the front camera, and that works fine. However, I need to switch it to the back camera, and I can't find any code to do that. Which part do I need to edit? Is it the `localView`, the `localVideoTrack`, or the capturer?

Swift 3.0

The peer connection can have only one `RTCVideoTrack` for sending the video stream.

First, to change between the front and the back camera, you have to remove the current video track from the peer connection. After that, you create a new `RTCVideoTrack` on the camera you need and set it on the peer connection again.

I used these methods.

func swapCameraToFront() {
    let localStream = peerConnection?.localStreams.first as? RTCMediaStream
    // Remove the existing video track before adding the new one.
    if let videoTrack = localStream?.videoTracks.first as? RTCVideoTrack {
        localStream?.removeVideoTrack(videoTrack)
    }
    if let localVideoTrack = createLocalVideoTrack() {
        localStream?.addVideoTrack(localVideoTrack)
        delegate?.appClient(self, didReceiveLocalVideoTrack: localVideoTrack)
    }
    // Re-adding the stream makes the peer connection pick up the new track.
    if let localStream = localStream {
        peerConnection?.remove(localStream)
        peerConnection?.add(localStream)
    }
}

func swapCameraToBack() {
    let localStream = peerConnection?.localStreams.first as? RTCMediaStream
    if let videoTrack = localStream?.videoTracks.first as? RTCVideoTrack {
        localStream?.removeVideoTrack(videoTrack)
    }
    if let localVideoTrack = createLocalVideoTrackBackCamera() {
        localStream?.addVideoTrack(localVideoTrack)
        delegate?.appClient(self, didReceiveLocalVideoTrack: localVideoTrack)
    }
    if let localStream = localStream {
        peerConnection?.remove(localStream)
        peerConnection?.add(localStream)
    }
}

So far I only have the answer in Objective-C, from ankit's comment below; I will convert it to Swift after some time.

You can check the code below:

- (RTCVideoTrack *)createLocalVideoTrack {
    NSString *cameraID = nil;
    for (AVCaptureDevice *captureDevice in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if (captureDevice.position == AVCaptureDevicePositionFront) {
            cameraID = [captureDevice localizedName];
            break;
        }
    }
    RTCVideoCapturer *capturer = [RTCVideoCapturer capturerWithDeviceName:cameraID];
    RTCMediaConstraints *mediaConstraints = [self defaultMediaStreamConstraints];
    RTCVideoSource *videoSource = [_factory videoSourceWithCapturer:capturer constraints:mediaConstraints];
    return [_factory videoTrackWithID:@"ARDAMSv0" source:videoSource];
}

- (RTCVideoTrack *)createLocalVideoTrackBackCamera {
    NSString *cameraID = nil;
    for (AVCaptureDevice *captureDevice in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if (captureDevice.position == AVCaptureDevicePositionBack) {
            cameraID = [captureDevice localizedName];
            break;
        }
    }
    RTCVideoCapturer *capturer = [RTCVideoCapturer capturerWithDeviceName:cameraID];
    RTCMediaConstraints *mediaConstraints = [self defaultMediaStreamConstraints];
    RTCVideoSource *videoSource = [_factory videoSourceWithCapturer:capturer constraints:mediaConstraints];
    return [_factory videoTrackWithID:@"ARDAMSv0" source:videoSource];
}

If you decided to use the official Google build, here is what you should do:

First, you have to configure the camera before the call starts; the best place for this is the `didCreateLocalCapturer` method of `ARDVideoCallViewDelegate`:
- (void)startCapture:(void (^)(BOOL succeeded))completionHandler {
    AVCaptureDevicePosition position =
        _usingFrontCamera ? AVCaptureDevicePositionFront : AVCaptureDevicePositionBack;
    AVCaptureDevice *device = [self findDeviceForPosition:position];
    if ([device lockForConfiguration:nil]) {
        if ([device isFocusPointOfInterestSupported]) {
            [device setFocusModeLockedWithLensPosition:0.9 completionHandler:nil];
        }
        // Balance the lock once configuration is done.
        [device unlockForConfiguration];
    }
    AVCaptureDeviceFormat *format = [self selectFormatForDevice:device];
    if (format == nil) {
        RTCLogError(@"No valid formats for device %@", device);
        NSAssert(NO, @"");
        return;
    }
    NSInteger fps = [self selectFpsForFormat:format];
    [_capturer startCaptureWithDevice:device
                               format:format
                                  fps:fps
                    completionHandler:^(NSError *error) {
                        NSLog(@"%@", error);
                        if (error == nil) {
                            completionHandler(YES);
                        }
                    }];
}

Do not forget that starting the capture device is asynchronous; it is often best to use the completion handler to make sure everything runs as expected.
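For instance, when toggling the camera you can chain any follow-up work off that completion, so the UI only updates once the new session is actually running (a Swift sketch; `usingFrontCamera`, `startCapture`, and the completion wiring are assumed names, not part of the code above):

```swift
// Hypothetical wrapper around the startCapture method shown above.
func toggleCamera(completion: @escaping (Bool) -> Void) {
    usingFrontCamera = !usingFrontCamera
    startCapture { succeeded in
        // Frames from the new camera are only guaranteed after this fires.
        DispatchQueue.main.async {
            completion(succeeded)
        }
    }
}
```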

I am not sure which Chrome version you are using for WebRTC, but from v54 onward the `RTCAVFoundationVideoSource` class has a `BOOL` property called `useBackCamera`. You can use this property to switch between the front and the back camera.

Swift 4.0 & 'GoogleWebRTC' : '1.1.20913'

The `RTCAVFoundationVideoSource` class has a property named `useBackCamera` that can be used for switching the camera being used.

@interface RTCAVFoundationVideoSource : RTCVideoSource
- (instancetype)init NS_UNAVAILABLE;
/**
* Calling this function will cause frames to be scaled down to the
* requested resolution. Also, frames will be cropped to match the
* requested aspect ratio, and frames will be dropped to match the
* requested fps. The requested aspect ratio is orientation agnostic and
* will be adjusted to maintain the input orientation, so it doesn't
* matter if e.g. 1280x720 or 720x1280 is requested.
*/
- (void)adaptOutputFormatToWidth:(int)width height:(int)height fps:(int)fps;
/** Returns whether rear-facing camera is available for use. */
@property(nonatomic, readonly) BOOL canUseBackCamera;
/** Switches the camera being used (either front or back). */
@property(nonatomic, assign) BOOL useBackCamera;
/** Returns the active capture session. */
@property(nonatomic, readonly) AVCaptureSession *captureSession;

Here is the implementation for switching the camera.

var useBackCamera: Bool = false
func switchCamera() {
    useBackCamera = !useBackCamera
    self.switchCamera(useBackCamera: useBackCamera)
}
private func switchCamera(useBackCamera: Bool) -> Void {
    let localStream = peerConnection?.localStreams.first
    if let videoTrack = localStream?.videoTracks.first {
        localStream?.removeVideoTrack(videoTrack)
    }
    let localVideoTrack = createLocalVideoTrack(useBackCamera: useBackCamera)
    localStream?.addVideoTrack(localVideoTrack)
    self.delegate?.webRTCClientDidAddLocal(videoTrack: localVideoTrack)
    if let ls = localStream {
        peerConnection?.remove(ls)
        peerConnection?.add(ls)
    }
}
func createLocalVideoTrack(useBackCamera: Bool) -> RTCVideoTrack {
    let videoSource = self.factory.avFoundationVideoSource(with: self.constraints)
    videoSource.useBackCamera = useBackCamera
    let videoTrack = self.factory.videoTrack(with: videoSource, trackId: "video")
    return videoTrack
}

In the current version of WebRTC, `RTCAVFoundationVideoSource` has been deprecated and replaced by a generic `RTCVideoSource` combined with an `RTCVideoCapturer` implementation.

In order to switch the camera, I am doing this:

- (void)switchCameraToPosition:(AVCaptureDevicePosition)position completionHandler:(void (^)(void))completionHandler {
    if (self.cameraPosition != position) {
      RTCMediaStream *localStream = self.peerConnection.localStreams.firstObject;
      [localStream removeVideoTrack:self.localVideoTrack];
      //[self.peerConnection removeStream:localStream];
      self.localVideoTrack = [self createVideoTrack];
      [self startCaptureLocalVideoWithPosition:position completionHandler:^{
        [localStream addVideoTrack:self.localVideoTrack];
        //[self.peerConnection addStream:localStream];
        if (completionHandler) {
            completionHandler();
        }
      }];
      self.cameraPosition = position;
    }
}

Take a look at the commented lines: if you start adding/removing streams from the peer connection, it will lead to a delay in the video connection.

I am using GoogleWebRTC-1.1.25102.
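A Swift sketch of the same idea with the modern `RTCCameraVideoCapturer` API might look like the following. Names such as `capturer` and the format/fps selection strategy are assumptions, not part of the answer above; the key point is that only the capturer is restarted, so the track stays on the peer connection and no renegotiation happens:

```swift
import WebRTC

func switchCamera(to position: AVCaptureDevice.Position,
                  completion: @escaping () -> Void) {
    // Find a capture device at the requested position.
    guard let device = RTCCameraVideoCapturer.captureDevices()
            .first(where: { $0.position == position }),
          // Pick the first supported format and its highest frame rate
          // (a real app would select a format matching its target resolution).
          let format = RTCCameraVideoCapturer.supportedFormats(for: device).first,
          let fps = format.videoSupportedFrameRateRanges
            .map({ $0.maxFrameRate }).max() else {
        return
    }
    // Restarting the capturer is enough: the RTCVideoSource (and therefore
    // the RTCVideoTrack) stays attached to the peer connection.
    capturer.startCapture(with: device, format: format, fps: Int(fps)) { _ in
        completion()
    }
}
```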
