iOS Live-Streaming App Development: The GPUImageBeautifyFilter Beauty Filter

Published by natasa 8 years ago | 22K reads | GPUImage, iOS development, mobile development

With the explosion of live-streaming apps, demand for real-time beauty filters keeps growing. This article covers the principles and approach behind implementing one; for GPUImage internals, see the GPUImage documentation — here we focus on the GPUImageBeautifyFilter implementation. A beauty effect is simply several filters combined: it is itself a filter, one that chains together whatever component filters the effect needs, such as skin smoothing, whitening, saturation boost, and brightening.

GPUImageBeautifyFilter

GPUImageBeautifyFilter is a real-time beauty filter built on GPUImage. It combines four filters:

GPUImageBilateralFilter, GPUImageCannyEdgeDetectionFilter, GPUImageCombinationFilter, and GPUImageHSBFilter

GPUImageBeautifyFilter.h declares these filters:

@interface GPUImageBeautifyFilter : GPUImageFilterGroup {
    GPUImageBilateralFilter *bilateralFilter;
    GPUImageCannyEdgeDetectionFilter *cannyEdgeFilter;
    GPUImageCombinationFilter *combinationFilter;
    GPUImageHSBFilter *hsbFilter;
}

Rendering proceeds in three steps:

1. Prepare the textures

2. Draw the textures

3. Display the processed texture

1. Preparing the textures (classes involved)

[GPUImageVideoCamera]

[GPUImageBeautifyFilter]

[GPUImageBilateralFilter]

[GPUImageCombinationFilter]

[GPUImageCannyEdgeDetectionFilter]

Preparation happens in three stages, one per input texture:

First texture:

1. GPUImageVideoCamera captures a camera frame and calls newFrameReadyAtTime:atIndex: to notify GPUImageBeautifyFilter;

2. GPUImageBeautifyFilter calls newFrameReadyAtTime:atIndex: to notify GPUImageBilateralFilter that its input texture is ready;

Second texture:

3. After GPUImageBilateralFilter has rendered its image, informTargetsAboutNewFrameAtTime: calls setInputFramebufferForTarget:atIndex: to set the rendered image as an input texture of GPUImageCombinationFilter, and notifies GPUImageCombinationFilter that the texture has been rendered;

4. GPUImageBeautifyFilter calls newFrameReadyAtTime:atIndex: to notify GPUImageCannyEdgeDetectionFilter that its input texture is ready;

Third texture:

5. After GPUImageCannyEdgeDetectionFilter has rendered its image, the image is set as an input texture of GPUImageCombinationFilter;

6. GPUImageBeautifyFilter calls newFrameReadyAtTime:atIndex: to notify GPUImageCombinationFilter that its input texture is ready;

(Figure: texture preparation)

2. Drawing the textures

7. Checking the texture count:

GPUImageCombinationFilter checks whether it has all three textures. Once all three are ready, it calls GPUImageThreeInputFilter's rendering method renderToTextureWithVertices:textureCoordinates:. After the image is rendered, it is set as GPUImageHSBFilter's input texture, and GPUImageHSBFilter is notified that the texture has been rendered;

8. Drawing the texture:

GPUImageHSBFilter calls renderToTextureWithVertices:textureCoordinates: to render the image, then sets the result as GPUImageView's input texture and notifies GPUImageView that its input texture has been rendered;

3. Displaying the texture

9. GPUImageView draws the input texture into its own framebuffer, then displays it on the UIView via [self.context presentRenderbuffer:GL_RENDERBUFFER];.

GPUImageBeautifyFilter.m:

@interface GPUImageCombinationFilter : GPUImageThreeInputFilter
{
    GLint smoothDegreeUniform;
}

@property (nonatomic, assign) CGFloat intensity;

@end

NSString *const kGPUImageBeautifyFragmentShaderString = SHADER_STRING
(
varying highp vec2 textureCoordinate;
varying highp vec2 textureCoordinate2;
varying highp vec2 textureCoordinate3;

uniform sampler2D inputImageTexture;
uniform sampler2D inputImageTexture2;
uniform sampler2D inputImageTexture3;
uniform mediump float smoothDegree;

void main()
{
 highp vec4 bilateral = texture2D(inputImageTexture, textureCoordinate);
 highp vec4 canny = texture2D(inputImageTexture2, textureCoordinate2);
 highp vec4 origin = texture2D(inputImageTexture3, textureCoordinate3);
 highp vec4 smooth;
 lowp float r = origin.r;
 lowp float g = origin.g;
 lowp float b = origin.b;
 if (canny.r < 0.2 && r > 0.3725 && g > 0.1568 && b > 0.0784 && r > b && (max(max(r, g), b) - min(min(r, g), b)) > 0.0588 && abs(r-g) > 0.0588) {
     smooth = (1.0 - smoothDegree) * (origin - bilateral) + bilateral;
 }
 else {
     smooth = origin;
 }
 smooth.r = log(1.0 + 0.2 * smooth.r)/log(1.2);
 smooth.g = log(1.0 + 0.2 * smooth.g)/log(1.2);
 smooth.b = log(1.0 + 0.2 * smooth.b)/log(1.2);
 gl_FragColor = smooth;
}
);

@implementation GPUImageCombinationFilter

- (id)init {
    if (self = [super initWithFragmentShaderFromString:kGPUImageBeautifyFragmentShaderString]) {
        smoothDegreeUniform = [filterProgram uniformIndex:@"smoothDegree"];
    }
    self.intensity = 0.5;
    return self;
}

- (void)setIntensity:(CGFloat)intensity {
    _intensity = intensity;
    [self setFloat:intensity forUniform:smoothDegreeUniform program:filterProgram];
}

@end

@implementation GPUImageBeautifyFilter

- (id)init
{
    if (!(self = [super init]))
    {
        return nil;
    }

    // First pass: face smoothing filter
    bilateralFilter = [[GPUImageBilateralFilter alloc] init];
    bilateralFilter.distanceNormalizationFactor = 4.0;
    [self addFilter:bilateralFilter];

    // Second pass: edge detection
    cannyEdgeFilter = [[GPUImageCannyEdgeDetectionFilter alloc] init];
    [self addFilter:cannyEdgeFilter];

    // Third pass: combine bilateral, edge detection and original
    combinationFilter = [[GPUImageCombinationFilter alloc] init];
    [self addFilter:combinationFilter];

    // Adjust HSB
    hsbFilter = [[GPUImageHSBFilter alloc] init];
    [hsbFilter adjustBrightness:1.1];
    [hsbFilter adjustSaturation:1.1];

    [bilateralFilter addTarget:combinationFilter];
    [cannyEdgeFilter addTarget:combinationFilter];

    [combinationFilter addTarget:hsbFilter];

    self.initialFilters = [NSArray arrayWithObjects:bilateralFilter, cannyEdgeFilter, combinationFilter, nil];
    self.terminalFilter = hsbFilter;

    return self;
}

#pragma mark - GPUImageInput protocol

Drawing the texture:

- (void)newFrameReadyAtTime:(CMTime)frameTime atIndex:(NSInteger)textureIndex
{
    for (GPUImageOutput<GPUImageInput> *currentFilter in self.initialFilters)
    {
        if (currentFilter != self.inputFilterToIgnoreForUpdates)
        {
            if (currentFilter == combinationFilter) {
                textureIndex = 2;
            }
            [currentFilter newFrameReadyAtTime:frameTime atIndex:textureIndex];
        }
    }
}

Setting the input texture for rendering:

- (void)setInputFramebuffer:(GPUImageFramebuffer *)newInputFramebuffer atIndex:(NSInteger)textureIndex
{
    for (GPUImageOutput<GPUImageInput> *currentFilter in self.initialFilters)
    {
        if (currentFilter == combinationFilter) {
            textureIndex = 2;
        }
        [currentFilter setInputFramebuffer:newInputFramebuffer atIndex:textureIndex];
    }
}

GPUImage integration steps

Option 1: a custom filter-group beauty effect

  1. Import GPUImage via CocoaPods;
  2. Create the video source: GPUImageVideoCamera;
  3. Create the final destination: GPUImageView;
  4. Create a GPUImageFilterGroup; combining a brightness filter (GPUImageBrightnessFilter) with a bilateral filter (GPUImageBilateralFilter) produces the beauty effect;
  5. Chain the filters inside the group;
  6. Set up the GPUImage processing chain: source -> filters -> on-screen view;
  7. Start capturing video.
- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view.
    self.view.backgroundColor = [UIColor whiteColor];
    self.title = @"GPUImage Beauty";

    [self initBottomView];

    // 1. Create the camera video source
    // sessionPreset: capture resolution; AVCaptureSessionPresetHigh adapts to the device
    // cameraPosition: which camera to use
    // Prefer AVCaptureSessionPresetHigh: it picks a supported resolution automatically;
    // forcing a resolution the device does not support raises an error
    GPUImageVideoCamera *videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPresetHigh cameraPosition:AVCaptureDevicePositionFront];

    // 2. Set the orientation of the camera output
    videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
    _videoCamera = videoCamera;

    // 3. Create the GPUImageView that displays the video
    GPUImageView *captureVideoPreview = [[GPUImageView alloc] initWithFrame:self.view.bounds];
    [self.view insertSubview:captureVideoPreview atIndex:0];

    // 4. Create the smoothing + whitening filter group
    GPUImageFilterGroup *groupFilter = [[GPUImageFilterGroup alloc] init];

    // 5. Smoothing filter
    GPUImageBilateralFilter *bilateralFilter = [[GPUImageBilateralFilter alloc] init];
    [groupFilter addFilter:bilateralFilter];
    _bilateralFilter = bilateralFilter;

    // 6. Whitening filter
    GPUImageBrightnessFilter *brightnessFilter = [[GPUImageBrightnessFilter alloc] init];
    [groupFilter addFilter:brightnessFilter];
    _brightnessFilter = brightnessFilter;

    // 7. Chain the filters inside the group
    [bilateralFilter addTarget:brightnessFilter];
    [groupFilter setInitialFilters:@[bilateralFilter]];
    groupFilter.terminalFilter = brightnessFilter;

    // 8. Set up the GPUImage processing chain: source -> filters -> on-screen view
    [videoCamera addTarget:groupFilter];
    [groupFilter addTarget:captureVideoPreview];

    // 9. Start capturing; GPUImage renders captured frames into the GPUImageView for display
    [videoCamera startCameraCapture];
}

ps:

  1. GPUImageVideoCamera must be strongly referenced, or it will be deallocated while capturing;
  2. startCameraCapture must be called, or nothing is rendered into the GPUImageView and nothing is displayed;
  3. The smaller GPUImageBilateralFilter's distanceNormalizationFactor, the stronger the smoothing; valid values are greater than 1.

Option 2: using the GPUImageBeautifyFilter beauty filter

  1. Import GPUImage via CocoaPods;
  2. Add the GPUImageBeautifyFilter folder to the project;
  3. Create the video source: GPUImageVideoCamera;
  4. Create the final destination: GPUImageView;
  5. Create the beauty filter: GPUImageBeautifyFilter;
  6. Set up the GPUImage processing chain: source -> filter -> on-screen view.

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view.
    self.view.backgroundColor = [UIColor whiteColor];
    self.title = @"Beautify";

    UISwitch *switcher = [[UISwitch alloc] initWithFrame:CGRectMake(140, 80, 70, 30)];
    [switcher addTarget:self action:@selector(changeBeautyFilter:) forControlEvents:UIControlEventValueChanged];

    [self.view addSubview:switcher];

    // 1. Create the camera video source
    // sessionPreset: capture resolution; AVCaptureSessionPresetHigh adapts to the device
    // cameraPosition: which camera to use
    // Prefer AVCaptureSessionPresetHigh: it picks a supported resolution automatically;
    // forcing a resolution the device does not support raises an error
    GPUImageVideoCamera *videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPresetHigh cameraPosition:AVCaptureDevicePositionFront];

    // 2. Set the orientation of the camera output
    videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
    _videoCamera = videoCamera;

    // 3. Create the GPUImageView that displays the video
    GPUImageView *captureVideoPreview = [[GPUImageView alloc] initWithFrame:self.view.bounds];
    [self.view insertSubview:captureVideoPreview atIndex:0];
    _captureVideoPreview = captureVideoPreview;

    // 4. Set up the processing chain (no filter yet)
    [_videoCamera addTarget:_captureVideoPreview];

    // 5. Start capturing; GPUImage renders captured frames into the GPUImageView for display
    [videoCamera startCameraCapture];
}

When toggling the beauty effect, the processing chain has to be rebuilt:

// Remove all existing targets from the chain
[_videoCamera removeAllTargets];

// Create the beauty filter
GPUImageBeautifyFilter *beautifyFilter = [[GPUImageBeautifyFilter alloc] init];

// Set up the GPUImage processing chain: source => filter => on-screen view
[_videoCamera addTarget:beautifyFilter];
[beautifyFilter addTarget:_captureVideoPreview];

References

http://www.jianshu.com/p/2ce9b63ecfef

http://www.jianshu.com/p/4646894245ba

 

Source: http://www.jianshu.com/p/6bdb4cb50f14

 
