Custom Camera (3) -- Video Frame Data, Preview, Capture, and Zoom

Features implemented:

  1. Access to the video stream (per-frame data)

  2. Zoom for both the preview and the captured photo, so what you see is what you get.

Test environment:

  1.  Xcode 5.1.1

  2.  Device (iPhone 5, iOS 6.1.4)

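//
//  MCViewController.h
//  MyCamera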
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>       // AVFoundation provides the capture ("video stream") APIs

@interface MCViewController : UIViewController<AVCaptureVideoDataOutputSampleBufferDelegate>

@property (strong, nonatomic) AVCaptureSession * captureSession;            // The capture session
@property (strong, nonatomic) AVCaptureDeviceInput * videoInput;            // Holds the video input

@property (strong, nonatomic) AVCaptureStillImageOutput * stillImageOutput;  // Holds the still image output

@end

//
//  MCViewController.m
//  MyCamera

#import "MCViewController.h"

@interface MCViewController ()

@end

@implementation MCViewController

#pragma mark - life cycle

- (void)viewWillAppear:(BOOL)animated
{
    [super viewWillAppear:animated];

    // Start the capture session
    [self.captureSession startRunning];
}

- (void)viewWillDisappear:(BOOL)animated
{
    [super viewWillDisappear:animated];

    // Stop the capture session
    [self.captureSession stopRunning];
}

- (void)viewDidLoad
{
    [super viewDidLoad];

    // Set up the capture pipeline (video stream)
    [self initAv];

    // Add the capture button
    [self addCaptureButton];
}

#pragma mark - Capture setup

- (void) initAv
{
    // 1.1 Create the AVCaptureSession.
    self.captureSession = [[AVCaptureSession alloc] init];
    // 1.2 Pick the input device. The default video device is the back camera.
    AVCaptureDevice * device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    // 1.3 Wrap the device in an AVCaptureDeviceInput and use it as the session's input.
    //     Creating the input can fail, so check it before adding it to the session.
    NSError * error = nil;
    self.videoInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (self.videoInput) {
        [self.captureSession addInput:self.videoInput];
    }
    else
    {
        NSLog(@"input error : %@", error);
    }

    // 2. Video data output: delivers raw frames to the sample buffer delegate.
    AVCaptureVideoDataOutput * output = [[AVCaptureVideoDataOutput alloc] init];
    [self.captureSession addOutput:output];

    // Deliver frames on a dedicated serial queue (never the main queue).
    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    //dispatch_release(queue);    // not needed under ARC with an iOS 6+ deployment target

    //    output.videoSettings = [NSDictionary dictionaryWithObject:
    //                                                       [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
    //                                                       forKey:(id)kCVPixelBufferPixelFormatTypeKey];

    output.videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                            [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], (id)kCVPixelBufferPixelFormatTypeKey,
                            [NSNumber numberWithInt: 320], (id)kCVPixelBufferWidthKey,
                            [NSNumber numberWithInt: 240], (id)kCVPixelBufferHeightKey,
                            nil];
    //output.minFrameDuration = CMTimeMake(1, 15);

    // 3. Create the preview layer.
    AVCaptureVideoPreviewLayer * previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    UIView * aView = self.view;
    previewLayer.frame = CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height);
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [aView.layer addSublayer:previewLayer];
    [previewLayer setAffineTransform:CGAffineTransformMakeScale(5.0, 5.0)];    // zoom the preview 5x

    // 4. Still image output (photo capture).
    self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary * stillImageOutputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG,AVVideoCodecKey, nil];
    [self.stillImageOutput setOutputSettings:stillImageOutputSettings];
    [self.captureSession addOutput:self.stillImageOutput];

//    // 5. Capture zoom: make the still image match the zoomed preview.
//    AVCaptureConnection * stillImageConnection = [self connectionWithMediaType:AVMediaTypeVideo fromConnections:[self.stillImageOutput connections]];
//
//    [stillImageConnection setVideoScaleAndCropFactor:5.0];

}
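
// The commented-out block above only hints at the capture-side zoom. A minimal
// sketch (an addition, not from the original post): clamp the factor to the
// connection's videoMaxScaleAndCropFactor and apply it to the still image
// connection, so the captured photo matches the 5x preview transform. It could
// be called once after initAv, or at the start of takePicture.
- (void)applyStillImageZoom:(CGFloat)factor
{
    AVCaptureConnection * connection = [self connectionWithMediaType:AVMediaTypeVideo
                                                      fromConnections:[self.stillImageOutput connections]];
    if (connection == nil) {
        return;
    }

    // The device enforces an upper bound on the scale-and-crop factor.
    CGFloat maxFactor = connection.videoMaxScaleAndCropFactor;
    connection.videoScaleAndCropFactor = MIN(factor, maxFactor);
}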

#pragma mark - Helpers
- (AVCaptureConnection *)connectionWithMediaType:(NSString *)mediaType fromConnections:(NSArray *)connections
{
    for ( AVCaptureConnection *connection in connections )
    {
        for ( AVCaptureInputPort *port in [connection inputPorts] )
        {
            if ( [[port mediaType] isEqual:mediaType] )
            {
                return connection;
            }
        }
    }
    return nil;
}

#pragma mark - AVCaptureVideoDataOutputSampleBufferDelegate

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // One frame of the video stream, as a UIImage
    UIImage * image = [self imageFromSampleBuffer:sampleBuffer];

    NSLog(@"Video frame, width: %f  height: %f", image.size.width, image.size.height);
    // iPhone 4: 720  x 1280
    // iPhone 5: 1080 x 1920

    // Or: one frame of the video stream, as NSData
    //NSData * imageData = UIImageJPEGRepresentation(image, 0.5);

    //Code...

}
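
// Note (an addition, not from the original post): the delegate callback above runs
// on the serial "myQueue" set up in initAv, so any UIKit work with a frame must be
// dispatched back to the main queue. A minimal sketch, assuming a hypothetical
// frameImageView property for displaying frames:
//
//    dispatch_async(dispatch_get_main_queue(), ^{
//        self.frameImageView.image = image;
//    });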

// Create a UIImage from sample buffer data
- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer
{
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                 bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer,0);

    // Free up the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image.
    // UIImageOrientationRight compensates for the camera's landscape-native buffers
    // when the device is held in portrait.
    //UIImage *image = [UIImage imageWithCGImage:quartzImage];
    UIImage *image = [UIImage imageWithCGImage:quartzImage scale:1.0f orientation:UIImageOrientationRight];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return (image);
}
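
// Note (an addition, not from the original post): the bitmap context above assumes
// the kCVPixelFormatType_32BGRA format requested in initAv. If only the raw bytes
// are needed, the UIImage round trip can be skipped. A minimal sketch:
- (NSData *) dataFromSampleBuffer:(CMSampleBufferRef) sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Copy the pixel data (including any per-row padding) into an NSData object.
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t dataSize = CVPixelBufferGetDataSize(imageBuffer);
    NSData *data = [NSData dataWithBytes:baseAddress length:dataSize];

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    return data;
}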

#pragma mark - Capture button and its action

- (void)addCaptureButton
{
    CGRect frame = CGRectMake(0, 0, 100, 100);
    UIButton * btn = [UIButton buttonWithType:UIButtonTypeRoundedRect];
    btn.frame = frame;
    [btn setTitle:@"Capture" forState:UIControlStateNormal];
    btn.backgroundColor = [UIColor clearColor];
    btn.tag = 1111;
    [btn addTarget:self action:@selector(onClickCaptureButton:) forControlEvents:UIControlEventTouchUpInside];
    [self.view addSubview:btn];
}

- (IBAction)onClickCaptureButton:(id)sender
{
    [self takePicture];
}

#pragma mark - Saving the image to the photo album

- (void)saveImageToPhotos:(UIImage*)savedImage
{
    UIImageWriteToSavedPhotosAlbum(savedImage, self, @selector(image:didFinishSavingWithError:contextInfo:), NULL);
}
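
// Note (an addition, not from the original post): on iOS 6 and later, the first call
// to UIImageWriteToSavedPhotosAlbum prompts the user for Photos access; if access is
// denied, the completion callback below receives an error.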

// Completion callback for UIImageWriteToSavedPhotosAlbum
- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo
{
    NSString * msg = nil;
    if (error != nil)
    {
        msg = @"Failed to save the image";
    }
    else
    {
        msg = @"Image saved successfully";
    }

    UIAlertView * alert = [[UIAlertView alloc] initWithTitle:@"Save Result"
                                                     message:msg
                                                    delegate:self
                                           cancelButtonTitle:@"OK"
                                           otherButtonTitles:nil];
    [alert show];
}

#pragma mark - Taking a picture

// Capture a still image
- (void) takePicture
{
    // 1. Get the video connection from the still image output and lock it to portrait.
    AVCaptureConnection * stillImageConnection = [self.stillImageOutput.connections objectAtIndex:0];

    if ([stillImageConnection isVideoOrientationSupported]) {
        [stillImageConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
    }

    // 2. Capture a still image asynchronously; on success, convert and save it.
    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:stillImageConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {

        if (imageDataSampleBuffer != NULL)
        {
            // Convert the JPEG sample buffer into NSData, then into a UIImage
            NSData * imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage * image = [[UIImage alloc] initWithData:imageData];

            // Save the image to the photo album
            [self saveImageToPhotos:image];

        }
        else
        {
            NSLog(@"Error capturing still image %@", error);
        }
    }];

}

@end
