When the required frame rate is at most 30 fps, resolution and frame rate are configured independently: one method sets the frame rate, another sets the resolution, and the two are not coupled.
Setting the Resolution
Use this method to set the camera resolution. The available presets are listed in the API documentation; the largest currently supported is 3840x2160. If you don't need a frame rate above 30 fps, this method is suitable for you.
- (void)setCameraResolutionByPresetWithHeight:(int)height session:(AVCaptureSession *)session {
    /*
     Note: this method only supports frame rates <= 30, because we must use `activeFormat`
     when the frame rate is > 30, and `activeFormat` and `sessionPreset` are mutually exclusive.
     */
    AVCaptureSessionPreset preset = [self getSessionPresetByResolutionHeight:height];
    if ([session.sessionPreset isEqualToString:preset]) {
        NSLog(@"No need to set the camera resolution again!");
        return;
    }
    
    if (![session canSetSessionPreset:preset]) {
        NSLog(@"Can't set the sessionPreset!");
        return;
    }
    
    [session beginConfiguration];
    session.sessionPreset = preset;
    [session commitConfiguration];
}
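The method above relies on a height-to-preset mapping helper that isn't shown in this excerpt. A minimal sketch of it, with an assumed preset table:

- (AVCaptureSessionPreset)getSessionPresetByResolutionHeight:(int)height {
    // Hypothetical mapping; extend it with the presets your project needs.
    switch (height) {
        case 2160: return AVCaptureSessionPreset3840x2160;
        case 1080: return AVCaptureSessionPreset1920x1080;
        case 720:  return AVCaptureSessionPreset1280x720;
        case 480:  return AVCaptureSessionPreset640x480;
        default:   return AVCaptureSessionPreset1280x720;
    }
}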
Setting the Frame Rate
Use this method to set the camera frame rate; it only supports frame rates up to 30 fps.
- (void)setCameraForLFRWithFrameRate:(int)frameRate {
    // Only for frame rate <= 30
    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([captureDevice lockForConfiguration:NULL]) {
        [captureDevice setActiveVideoMinFrameDuration:CMTimeMake(1, frameRate)];
        [captureDevice setActiveVideoMaxFrameDuration:CMTimeMake(1, frameRate)];
        [captureDevice unlockForConfiguration];
    }
}
If you need high frame rates at a given resolution, such as 50, 60, or 120 fps, calling setActiveVideoMinFrameDuration and setActiveVideoMaxFrameDuration on their own cannot achieve this: Apple requires that these frame-duration setters be used together with the new resolution API, activeFormat.
The new resolution API activeFormat and sessionPreset are mutually exclusive; once you use one, the other becomes ineffective. It is recommended to use the high-frame-rate configuration method exclusively and drop the low-frame-rate one, to avoid compatibility problems.
With this change Apple merged the previously separate resolution and frame-rate setters into one. Resolution and frame rate used to be configured independently; now they must be configured together: each resolution has a corresponding range of supported frame rates, and each frame rate has its supported resolutions, which we have to discover by enumerating the device's formats. So the old separate setters are effectively deprecated in high-frame-rate mode. Choose based on your project's needs: if you are certain the project will never need a high frame rate (fps > 30), the old methods remain simple and effective.
Note: once the activeFormat method is used, the resolution previously set through sessionPreset automatically becomes AVCaptureSessionPresetInputPriority, so any existing if statements that compare via canSetSessionPreset will no longer work. If the project must support high frame rates, it is recommended to abandon the sessionPreset approach entirely.
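To see which resolution and frame-rate combinations a device actually offers, you can enumerate its formats. A small diagnostic sketch (the method name is ours):

- (void)logSupportedFormatsForDevice:(AVCaptureDevice *)device {
    for (AVCaptureDeviceFormat *format in device.formats) {
        CMVideoDimensions dims = CMVideoFormatDescriptionGetDimensions(format.formatDescription);
        for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {
            // e.g. "3840x2160 : 1-60 fps"
            NSLog(@"%dx%d : %.0f-%.0f fps", dims.width, dims.height, range.minFrameRate, range.maxFrameRate);
        }
    }
}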
+ (BOOL)setCameraFrameRateAndResolutionWithFrameRate:(int)frameRate andResolutionHeight:(CGFloat)resolutionHeight bySession:(AVCaptureSession *)session position:(AVCaptureDevicePosition)position videoFormat:(OSType)videoFormat {
    AVCaptureDevice *captureDevice = [self getCaptureDevicePosition:position];
    
    for (AVCaptureDeviceFormat *vFormat in [captureDevice formats]) {
        CMFormatDescriptionRef description = vFormat.formatDescription;
        float maxRate = ((AVFrameRateRange *)[vFormat.videoSupportedFrameRateRanges objectAtIndex:0]).maxFrameRate;
        if (maxRate >= frameRate && CMFormatDescriptionGetMediaSubType(description) == videoFormat) {
            // Compare the resolution supported by this format against the requested one.
            CMVideoDimensions dims = CMVideoFormatDescriptionGetDimensions(description);
            if (dims.height == resolutionHeight && dims.width == [self getResolutionWidthByHeight:resolutionHeight]) {
                [session beginConfiguration];
                if ([captureDevice lockForConfiguration:NULL]) {
                    captureDevice.activeFormat = vFormat;
                    [captureDevice setActiveVideoMinFrameDuration:CMTimeMake(1, frameRate)];
                    [captureDevice setActiveVideoMaxFrameDuration:CMTimeMake(1, frameRate)];
                    [captureDevice unlockForConfiguration];
                    [session commitConfiguration];
                    NSLog(@"%s: Set frame rate = %d, resolution height = %f succeeded.", __func__, frameRate, resolutionHeight);
                    return YES;
                }
                [session commitConfiguration];
                NSLog(@"%s: lock failed!", __func__);
                return NO;
            }
        }
    }
    
    NSLog(@"%s: No format supports frame rate = %d with resolution height = %f.", __func__, frameRate, resolutionHeight);
    return NO;
}
+ (AVCaptureDevice *)getCaptureDevicePosition:(AVCaptureDevicePosition)position {
    NSArray *devices = nil;
    if (@available(iOS 10.0, *)) {
        AVCaptureDeviceDiscoverySession *deviceDiscoverySession = [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera] mediaType:AVMediaTypeVideo position:position];
        devices = deviceDiscoverySession.devices;
    } else {
#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wdeprecated-declarations"
        devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
#pragma clang diagnostic pop
    }
    
    for (AVCaptureDevice *device in devices) {
        if (position == device.position) {
            return device;
        }
    }
    return nil;
}
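The resolution check above also calls getResolutionWidthByHeight:, which isn't shown in this excerpt. A minimal sketch, assuming 16:9 output (4:3 for 480):

+ (int)getResolutionWidthByHeight:(CGFloat)resolutionHeight {
    // Hypothetical mapping: 640x480 is 4:3, everything else is treated as 16:9.
    if (resolutionHeight == 480) {
        return 640;
    }
    return (int)(resolutionHeight * 16 / 9);
}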
Switching between the front and back cameras looks simple, but in practice it causes many problems, because the front and back cameras of the same device support different resolutions and frame rates. Switching from a supported configuration to an unsupported one will fail. A concrete case:
On an iPhone X, the back camera supports up to (4K, 60 fps) while the front camera supports up to (2K, 30 fps). If you are capturing at (4K, 60 fps) on the back camera and switch to the front camera without handling this, the switch fails and the program misbehaves.
Note
In the code below, the line session.sessionPreset = AVCaptureSessionPresetLow; exists because after switching from the back to the front camera we need to recompute the maximum resolution and frame rate the new input device supports, and we can't compute that until the input has been added to the session. So we first set an arbitrary low preset that is guaranteed to be acceptable so the input can be added, then query the device's maximum supported resolution and frame rate and apply them.
- (void)setCameraPosition:(AVCaptureDevicePosition)position session:(AVCaptureSession *)session input:(AVCaptureDeviceInput *)input videoFormat:(OSType)videoFormat resolutionHeight:(CGFloat)resolutionHeight frameRate:(int)frameRate {
    if (input) {
        [session beginConfiguration];
        [session removeInput:input];
        
        AVCaptureDevice *device = [self.class getCaptureDevicePosition:position];
        NSError *error = nil;
        AVCaptureDeviceInput *newInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
        if (error) {
            NSLog(@"%s: error:%@", __func__, error.localizedDescription);
            [session commitConfiguration];
            return;
        }
        
        // e.g. the back camera is at 4K while the front camera supports at most 2K, so the switch
        // must downgrade; we can't compute the new camera's maximum supported resolution until
        // its input has been added to the session.
        session.sessionPreset = AVCaptureSessionPresetLow;
        if ([session canAddInput:newInput]) {
            self.input = newInput;
            [session addInput:newInput];
        } else {
            NSLog(@"%s: add input failed.", __func__);
            [session commitConfiguration];
            return;
        }
        
        int maxResolutionHeight = [self getMaxSupportResolutionByPreset];
        if (resolutionHeight > maxResolutionHeight) {
            resolutionHeight = maxResolutionHeight;
            self.cameraModel.resolutionHeight = resolutionHeight;
            NSLog(@"%s: Current maximum supported resolution height = %d", __func__, maxResolutionHeight);
        }
        
        int maxFrameRate = [self getMaxFrameRateByCurrentResolution];
        if (frameRate > maxFrameRate) {
            frameRate = maxFrameRate;
            self.cameraModel.frameRate = frameRate;
            NSLog(@"%s: Current maximum supported frame rate = %d", __func__, maxFrameRate);
        }
        
        BOOL isSuccess = [self.class setCameraFrameRateAndResolutionWithFrameRate:frameRate
                                                              andResolutionHeight:resolutionHeight
                                                                        bySession:session
                                                                         position:position
                                                                      videoFormat:videoFormat];
        if (!isSuccess) {
            NSLog(@"%s: Set resolution and frame rate failed.", __func__);
        }
        
        [session commitConfiguration];
    }
}
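The method above also references two query helpers not shown in this excerpt. Minimal sketches, assuming the session, input, and cameraModel properties seen elsewhere in this code:

- (int)getMaxSupportResolutionByPreset {
    // Probe presets from largest to smallest; the first one the session accepts wins.
    if ([self.session canSetSessionPreset:AVCaptureSessionPreset3840x2160]) return 2160;
    if ([self.session canSetSessionPreset:AVCaptureSessionPreset1920x1080]) return 1080;
    if ([self.session canSetSessionPreset:AVCaptureSessionPreset1280x720])  return 720;
    if ([self.session canSetSessionPreset:AVCaptureSessionPreset640x480])   return 480;
    return 0;
}

- (int)getMaxFrameRateByCurrentResolution {
    // Scan the active device's formats matching the current resolution height
    // and return the highest frame rate among them.
    float maxFrameRate = 0;
    for (AVCaptureDeviceFormat *format in self.input.device.formats) {
        CMVideoDimensions dims = CMVideoFormatDescriptionGetDimensions(format.formatDescription);
        if (dims.height == self.cameraModel.resolutionHeight) {
            for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {
                if (range.maxFrameRate > maxFrameRate) {
                    maxFrameRate = range.maxFrameRate;
                }
            }
        }
    }
    return (int)maxFrameRate;
}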
Here we first need to distinguish screen orientation from video orientation: one describes the device's orientation (UIDeviceOrientation), the other the video's orientation (AVCaptureVideoOrientation). If our AVCaptureSession is to support screen rotation, we need to rotate the video frames to match whenever the screen rotates.
Screen rotation can be observed through the UIDeviceOrientationDidChangeNotification notification, which is not covered in detail here.
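Still, a minimal registration sketch (self.previewLayer and self.videoOutput are assumed properties):

// In your capture setup (e.g. viewDidLoad):
[[UIDevice currentDevice] beginGeneratingDeviceOrientationNotifications];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(deviceOrientationDidChange:)
                                             name:UIDeviceOrientationDidChangeNotification
                                           object:nil];

- (void)deviceOrientationDidChange:(NSNotification *)notification {
    // Forward the new device orientation to the adjustment method below.
    [self adjustVideoOrientationByScreenOrientation:[UIDevice currentDevice].orientation
                                       previewFrame:self.view.bounds
                                       previewLayer:self.previewLayer
                                        videoOutput:self.videoOutput];
}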
- (void)adjustVideoOrientationByScreenOrientation:(UIDeviceOrientation)orientation previewFrame:(CGRect)previewFrame previewLayer:(AVCaptureVideoPreviewLayer *)previewLayer videoOutput:(AVCaptureVideoDataOutput *)videoOutput {
    [previewLayer setFrame:previewFrame];
    
    switch (orientation) {
        case UIDeviceOrientationPortrait:
            [self adjustAVOutputDataOrientation:AVCaptureVideoOrientationPortrait
                                    videoOutput:videoOutput];
            break;
        case UIDeviceOrientationPortraitUpsideDown:
            [self adjustAVOutputDataOrientation:AVCaptureVideoOrientationPortraitUpsideDown
                                    videoOutput:videoOutput];
            break;
        case UIDeviceOrientationLandscapeLeft:
            // Device landscape-left (home button on the right) corresponds to video landscape-right.
            [[previewLayer connection] setVideoOrientation:AVCaptureVideoOrientationLandscapeRight];
            [self adjustAVOutputDataOrientation:AVCaptureVideoOrientationLandscapeRight
                                    videoOutput:videoOutput];
            break;
        case UIDeviceOrientationLandscapeRight:
            // And device landscape-right corresponds to video landscape-left.
            [[previewLayer connection] setVideoOrientation:AVCaptureVideoOrientationLandscapeLeft];
            [self adjustAVOutputDataOrientation:AVCaptureVideoOrientationLandscapeLeft
                                    videoOutput:videoOutput];
            break;
        default:
            break;
    }
}
- (void)adjustAVOutputDataOrientation:(AVCaptureVideoOrientation)orientation videoOutput:(AVCaptureVideoDataOutput *)videoOutput {
    for (AVCaptureConnection *connection in videoOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                if ([connection isVideoOrientationSupported]) {
                    [connection setVideoOrientation:orientation];
                }
            }
        }
    }
}
For focus, manually setting the focus point deserves special attention: the focus API only accepts a coordinate system whose top-left corner is (0,0) and bottom-right corner is (1,1), so we need to convert from UIView coordinates, and the conversion must account for several cases, such as mirroring, the preview layer's video gravity, and the video orientation.
If we render directly with AVCaptureSession's AVCaptureVideoPreviewLayer, we can use the captureDevicePointOfInterestForPoint: method to compute the point automatically; its result accounts for all of these cases. But if we render the frames ourselves, we have to compute the focus point by hand and consider every case ourselves. Both the automatic and the manual computation are shown below.
- (void)autoFocusAtPoint:(CGPoint)point {
    AVCaptureDevice *device = self.input.device;
    if ([device isFocusPointOfInterestSupported] && [device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            // Also move the exposure point, so a tap refocuses and re-meters together.
            [device setExposurePointOfInterest:point];
            [device setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
            [device setFocusPointOfInterest:point];
            [device setFocusMode:AVCaptureFocusModeAutoFocus];
            [device unlockForConfiguration];
        }
    }
}
- (CGPoint)convertToPointOfInterestFromViewCoordinates:(CGPoint)viewCoordinates captureVideoPreviewLayer:(AVCaptureVideoPreviewLayer *)captureVideoPreviewLayer {
    CGPoint pointOfInterest = CGPointMake(.5f, .5f);
    CGSize frameSize = [captureVideoPreviewLayer frame].size;
    
    if ([captureVideoPreviewLayer.connection isVideoMirrored]) {
        viewCoordinates.x = frameSize.width - viewCoordinates.x;
    }
    
    // Convert the UIKit coordinate to a focus point in the (0,0)-(1,1) space.
    pointOfInterest = [captureVideoPreviewLayer captureDevicePointOfInterestForPoint:viewCoordinates];
    return pointOfInterest;
}
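Tying it together: when rendering through the preview layer, a tap-to-focus handler might convert the tapped view coordinate and pass the result to autoFocusAtPoint: (the gesture wiring and the self.previewLayer property are assumptions; with custom rendering you would call the manual conversion below instead):

- (void)handleFocusTap:(UITapGestureRecognizer *)gesture {
    CGPoint viewPoint = [gesture locationInView:gesture.view];
    CGPoint focusPoint = [self convertToPointOfInterestFromViewCoordinates:viewPoint
                                                  captureVideoPreviewLayer:self.previewLayer];
    [self autoFocusAtPoint:focusPoint];
}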
- (CGPoint)manualConvertFocusPoint:(CGPoint)point frameSize:(CGSize)frameSize captureVideoPreviewLayer:(AVCaptureVideoPreviewLayer *)captureVideoPreviewLayer position:(AVCaptureDevicePosition)position videoDataOutput:(AVCaptureVideoDataOutput *)videoDataOutput input:(AVCaptureDeviceInput *)input {
    CGPoint pointOfInterest = CGPointMake(.5f, .5f);
    
    if ([[videoDataOutput connectionWithMediaType:AVMediaTypeVideo] isVideoMirrored]) {
        point.x = frameSize.width - point.x;
    }
    
    for (AVCaptureInputPort *port in [input ports]) {
        if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
            CGRect cleanAperture = CMVideoFormatDescriptionGetCleanAperture([port formatDescription], YES);
            CGSize resolutionSize = cleanAperture.size;
            
            CGFloat resolutionRatio = resolutionSize.width / resolutionSize.height;
            CGFloat screenSizeRatio = frameSize.width / frameSize.height;
            CGFloat xc = .5f;
            CGFloat yc = .5f;
            
            if (resolutionRatio == screenSizeRatio) {
                // The video exactly fills the view: simple normalization is enough.
                xc = point.x / frameSize.width;
                yc = point.y / frameSize.height;
            } else if (resolutionRatio > screenSizeRatio) {
                // The video is wider than the view.
                if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspectFill]) {
                    // Aspect-fill crops the left/right edges.
                    CGFloat needScreenWidth = resolutionRatio * frameSize.height;
                    CGFloat cropWidth = (needScreenWidth - frameSize.width) / 2;
                    xc = (cropWidth + point.x) / needScreenWidth;
                    yc = point.y / frameSize.height;
                } else if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspect]) {
                    // Aspect-fit letterboxes the top/bottom.
                    CGFloat needScreenHeight = frameSize.width * (1 / resolutionRatio);
                    CGFloat blackBarLength = (frameSize.height - needScreenHeight) / 2;
                    xc = point.x / frameSize.width;
                    yc = (point.y - blackBarLength) / needScreenHeight;
                } else if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResize]) {
                    xc = point.x / frameSize.width;
                    yc = point.y / frameSize.height;
                }
            } else {
                // The video is taller than the view.
                if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspectFill]) {
                    // Aspect-fill crops the top/bottom edges.
                    CGFloat needScreenHeight = (1 / resolutionRatio) * frameSize.width;
                    CGFloat cropHeight = (needScreenHeight - frameSize.height) / 2;
                    xc = point.x / frameSize.width;
                    yc = (cropHeight + point.y) / needScreenHeight;
                } else if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspect]) {
                    // Aspect-fit pillarboxes the left/right.
                    CGFloat needScreenWidth = frameSize.height * resolutionRatio;
                    CGFloat blackBarLength = (frameSize.width - needScreenWidth) / 2;
                    xc = (point.x - blackBarLength) / needScreenWidth;
                    yc = point.y / frameSize.height;
                } else if ([[captureVideoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResize]) {
                    xc = point.x / frameSize.width;
                    yc = point.y / frameSize.height;
                }
            }
            
            pointOfInterest = CGPointMake(xc, yc);
        }
    }
    
    if (position == AVCaptureDevicePositionBack) {
        if (captureVideoPreviewLayer.connection.videoOrientation == AVCaptureVideoOrientationLandscapeLeft) {
            pointOfInterest = CGPointMake(1 - pointOfInterest.x, 1 - pointOfInterest.y);
        }
    } else {
        pointOfInterest = CGPointMake(pointOfInterest.x, 1 - pointOfInterest.y);
    }
    
    return pointOfInterest;
}
If we use a UISlider as the adjustment control, the simplest approach is to give it the same range as the exposure bias, i.e. (-8 to 8), so the value can be passed straight through without conversion. With a gesture or another control, adjust the mapping to your needs; it's straightforward, so no further detail here. A wiring sketch follows the method below.
- (void)setExposureWithNewValue:(CGFloat)newExposureValue device:(AVCaptureDevice *)device {
    NSError *error;
    if ([device lockForConfiguration:&error]) {
        [device setExposureTargetBias:newExposureValue completionHandler:nil];
        [device unlockForConfiguration];
    }
}
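A minimal wiring sketch for the slider approach, reading the actual bias range from the device instead of hard-coding (-8, 8); the selector name and self.input property are assumptions:

// In your control setup:
UISlider *slider = [[UISlider alloc] init];
AVCaptureDevice *device = self.input.device;
slider.minimumValue = device.minExposureTargetBias;   // typically -8
slider.maximumValue = device.maxExposureTargetBias;   // typically +8
slider.value = device.exposureTargetBias;
[slider addTarget:self action:@selector(exposureSliderChanged:) forControlEvents:UIControlEventValueChanged];

- (void)exposureSliderChanged:(UISlider *)slider {
    [self setExposureWithNewValue:slider.value device:self.input.device];
}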
- (void)setTorchState:(BOOL)isOpen device:(AVCaptureDevice *)device {
    if ([device hasTorch]) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            device.torchMode = isOpen ? AVCaptureTorchModeOn : AVCaptureTorchModeOff;
            [device unlockForConfiguration];
        }
    } else {
        NSLog(@"The device does not support torch!");
    }
}
Note: on some models and at some resolutions, rendering with this property enabled can cause problems (e.g. iPhone XS with custom rendering).
- (void)adjustVideoStabilizationWithOutput:(AVCaptureVideoDataOutput *)output {
    NSArray *devices = nil;
    if (@available(iOS 10.0, *)) {
        AVCaptureDeviceDiscoverySession *deviceDiscoverySession = [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera] mediaType:AVMediaTypeVideo position:self.cameraModel.position];
        devices = deviceDiscoverySession.devices;
    } else {
#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wdeprecated-declarations"
        devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
#pragma clang diagnostic pop
    }
    
    for (AVCaptureDevice *device in devices) {
        if ([device hasMediaType:AVMediaTypeVideo]) {
            if ([device.activeFormat isVideoStabilizationModeSupported:AVCaptureVideoStabilizationModeAuto]) {
                for (AVCaptureConnection *connection in output.connections) {
                    for (AVCaptureInputPort *port in [connection inputPorts]) {
                        if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                            if (connection.supportsVideoStabilization) {
                                connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeStandard;
                                NSLog(@"activeVideoStabilizationMode = %ld", (long)connection.activeVideoStabilizationMode);
                            } else {
                                NSLog(@"connection doesn't support video stabilization");
                            }
                        }
                    }
                }
            } else {
                NSLog(@"device doesn't support video stabilization");
            }
        }
    }
}
Note: when using the setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains method, you must check that the AVCaptureWhiteBalanceGains values are within the valid range (each gain between 1 and the device's maxWhiteBalanceGain).
- (AVCaptureWhiteBalanceGains)clampGains:(AVCaptureWhiteBalanceGains)gains toMinVal:(CGFloat)minVal andMaxVal:(CGFloat)maxVal {
    AVCaptureWhiteBalanceGains tmpGains = gains;
    tmpGains.blueGain  = MAX(MIN(tmpGains.blueGain,  maxVal), minVal);
    tmpGains.redGain   = MAX(MIN(tmpGains.redGain,   maxVal), minVal);
    tmpGains.greenGain = MAX(MIN(tmpGains.greenGain, maxVal), minVal);
    return tmpGains;
}

- (void)setWhiteBlanceValueByTemperature:(CGFloat)temperature device:(AVCaptureDevice *)device {
    if ([device isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeLocked]) {
        if ([device lockForConfiguration:nil]) {
            // Keep the current tint and only change the temperature.
            AVCaptureWhiteBalanceGains currentGains = device.deviceWhiteBalanceGains;
            CGFloat currentTint = [device temperatureAndTintValuesForDeviceWhiteBalanceGains:currentGains].tint;
            AVCaptureWhiteBalanceTemperatureAndTintValues tempAndTintValues = {
                .temperature = temperature,
                .tint = currentTint,
            };
            
            AVCaptureWhiteBalanceGains deviceGains = [device deviceWhiteBalanceGainsForTemperatureAndTintValues:tempAndTintValues];
            CGFloat maxWhiteBalanceGain = device.maxWhiteBalanceGain;
            deviceGains = [self clampGains:deviceGains toMinVal:1 andMaxVal:maxWhiteBalanceGain];
            
            [device setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:deviceGains completionHandler:nil];
            [device unlockForConfiguration];
        }
    }
}

- (void)setWhiteBlanceValueByTint:(CGFloat)tint device:(AVCaptureDevice *)device {
    if ([device isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeLocked]) {
        if ([device lockForConfiguration:nil]) {
            // Keep the current temperature and only change the tint.
            CGFloat maxWhiteBalanceGain = device.maxWhiteBalanceGain;
            AVCaptureWhiteBalanceGains currentGains = device.deviceWhiteBalanceGains;
            currentGains = [self clampGains:currentGains toMinVal:1 andMaxVal:maxWhiteBalanceGain];
            CGFloat currentTemperature = [device temperatureAndTintValuesForDeviceWhiteBalanceGains:currentGains].temperature;
            AVCaptureWhiteBalanceTemperatureAndTintValues tempAndTintValues = {
                .temperature = currentTemperature,
                .tint = tint,
            };
            
            AVCaptureWhiteBalanceGains deviceGains = [device deviceWhiteBalanceGainsForTemperatureAndTintValues:tempAndTintValues];
            deviceGains = [self clampGains:deviceGains toMinVal:1 andMaxVal:maxWhiteBalanceGain];
            
            [device setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:deviceGains completionHandler:nil];
            [device unlockForConfiguration];
        }
    }
}
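As a usage sketch, a temperature slider could drive the method above; the 3000 K to 8000 K range and the selector name are assumed examples:

- (void)temperatureSliderChanged:(UISlider *)slider {
    // Slider configured elsewhere with minimumValue = 3000, maximumValue = 8000 (Kelvin).
    [self setWhiteBlanceValueByTemperature:slider.value device:self.input.device];
}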
- (void)setVideoGravity:(AVLayerVideoGravity)videoGravity previewLayer:(AVCaptureVideoPreviewLayer *)previewLayer session:(AVCaptureSession *)session {
    [session beginConfiguration];
    [previewLayer setVideoGravity:videoGravity];
    [session commitConfiguration];
}