Camera: video stream data -- preview, photo capture, zoom

Reposted from http://www.cnblogs.com/iCodePhone/p/3785283.html

Features:

  1. Video stream frame data

  2. Zoom for both the preview and the captured photo -- what you see is what you get.

Environment:

  1.  Xcode 5.1.1

  2.  Real device (iPhone 5, iOS 6.1.4)

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>       // Import for the video stream APIs

@interface MCViewController : UIViewController<AVCaptureVideoDataOutputSampleBufferDelegate>

@property (strong, nonatomic) AVCaptureSession * captureSession;            // The capture session
@property (strong, nonatomic) AVCaptureDeviceInput * videoInput;            // Holds the video input

@property (strong, nonatomic) AVCaptureStillImageOutput * stillImageOutput;  // Holds the still image output

@end

//
//  MCViewController.m
//  MyCamera

#import "MCViewController.h"

@interface MCViewController ()

@end

@implementation MCViewController

#pragma mark - life cycle

- (void)viewWillAppear:(BOOL)animated
{
    [super viewWillAppear:animated];

    // Start the capture session
    [self.captureSession startRunning];
}

- (void)viewWillDisappear:(BOOL)animated
{
    [super viewWillDisappear:animated];

    // Stop the capture session
    [self.captureSession stopRunning];
}

- (void)viewDidLoad
{
    [super viewDidLoad];

    // Initialize the video stream
    [self initAv];

    // Add the capture button
    [self addCaptureButton];
}

#pragma mark - Video stream setup

- (void) initAv
{
    // 1. Create the AVCaptureSession.
    self.captureSession = [[AVCaptureSession alloc] init];
    // 2. Choose the input device. Here we use the default (back) camera.
    AVCaptureDevice * device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    // 3. Create an AVCaptureDeviceInput wrapping the chosen device as the session's input.
    //    Create the input before adding it to the session, and check that it succeeded.
    NSError * error = nil;
    self.videoInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (self.videoInput) {
        [self.captureSession addInput:self.videoInput];
    }
    else
    {
        NSLog(@"input error : %@", error);
    }

    // 4. Video frame data output.
    AVCaptureVideoDataOutput * output = [[AVCaptureVideoDataOutput alloc] init];
    [self.captureSession addOutput:output];

    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    //dispatch_release(queue);   // not needed when dispatch objects are ARC-managed (iOS 6+)

    //    output.videoSettings = [NSDictionary dictionaryWithObject:
    //                                                       [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
    //                                                       forKey:(id)kCVPixelBufferPixelFormatTypeKey];

    output.videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                            [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], (id)kCVPixelBufferPixelFormatTypeKey,
                            [NSNumber numberWithInt: 320], (id)kCVPixelBufferWidthKey,
                            [NSNumber numberWithInt: 240], (id)kCVPixelBufferHeightKey,
                            nil];
    //output.minFrameDuration = CMTimeMake(1, 15);

    // 5. Create the preview layer.
    AVCaptureVideoPreviewLayer * previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    UIView * aView = self.view;
    previewLayer.frame = CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height);
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [aView.layer addSublayer:previewLayer];
    // Zoom the preview 5x. The photo-side crop factor should use the same value for WYSIWYG.
    [previewLayer setAffineTransform:CGAffineTransformMakeScale(5.0, 5.0)];

    // 6. Still image capture.
    self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary * stillImageOutputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [self.stillImageOutput setOutputSettings:stillImageOutputSettings];
    [self.captureSession addOutput:self.stillImageOutput];

//    // 7. Photo zoom: apply the same 5.0 scale-and-crop factor to the still image connection.
//    AVCaptureConnection * stillImageConnection = [self connectionWithMediaType:AVMediaTypeVideo fromConnections:[self.stillImageOutput connections]];
//
//    [stillImageConnection setVideoScaleAndCropFactor:5.0];

}
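The 5.0 zoom factor above is hard-coded. `AVCaptureConnection` exposes `videoMaxScaleAndCropFactor`, the largest factor the device accepts, so a requested zoom can be clamped before it is applied. A minimal sketch, reusing the `connectionWithMediaType:fromConnections:` helper defined below; the method name `applyZoomFactor:` is hypothetical, not part of the original post:

```objectivec
// Hypothetical helper: clamp a requested zoom factor to what the still
// image connection actually supports, then apply it.
- (void)applyZoomFactor:(CGFloat)factor
{
    AVCaptureConnection * connection =
        [self connectionWithMediaType:AVMediaTypeVideo
                      fromConnections:[self.stillImageOutput connections]];
    if (connection == nil) {
        return;
    }
    // videoMaxScaleAndCropFactor is the largest value the device accepts.
    CGFloat clamped = MIN(factor, connection.videoMaxScaleAndCropFactor);

    // Keep this in sync with the preview layer's transform so the
    // saved photo matches what the user sees on screen.
    [connection setVideoScaleAndCropFactor:clamped];
}
```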

#pragma mark - Connection lookup helper
- (AVCaptureConnection *)connectionWithMediaType:(NSString *)mediaType fromConnections:(NSArray *)connections
{
    for ( AVCaptureConnection *connection in connections )
    {
        for ( AVCaptureInputPort *port in [connection inputPorts] )
        {
            if ( [[port mediaType] isEqual:mediaType] )
            {
                return connection;
            }
        }
    }
    return nil;
}

#pragma mark - AVCaptureVideoDataOutputSampleBufferDelegate

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // One frame of the video stream, as a UIImage
    UIImage * image = [self imageFromSampleBuffer:sampleBuffer];

    NSLog(@"video frame, width: %f  height: %f", image.size.width, image.size.height);
    //iPhone 4:  720 x 1280
    //iPhone 5: 1080 x 1920

    // Or, the same frame as NSData:
    //NSData * imageData = UIImageJPEGRepresentation(image, 0.5);

    //Code...

}
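Note that this delegate method runs on the serial queue handed to `setSampleBufferDelegate:queue:`, not on the main thread, so any UIKit work with a frame must hop back to the main queue. A minimal sketch of a delegate body that displays frames itself instead of relying on the preview layer; `self.imageView` is a hypothetical `UIImageView` property, not part of the original code:

```objectivec
// Called on the background queue created in initAv. UIKit is not
// thread-safe, so hand the frame to the main queue before touching views.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    UIImage * frame = [self imageFromSampleBuffer:sampleBuffer];
    dispatch_async(dispatch_get_main_queue(), ^{
        // self.imageView is a hypothetical property for illustration.
        self.imageView.image = frame;
    });
}
```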

// Create a UIImage from sample buffer data
- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer
{
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                 bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer,0);

    // Free up the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    //UIImage *image = [UIImage imageWithCGImage:quartzImage];
    UIImage *image = [UIImage imageWithCGImage:quartzImage scale:1.0f orientation:UIImageOrientationRight];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return (image);
}

#pragma mark - Capture button and its tap handler

- (void)addCaptureButton
{
    CGRect frame = CGRectMake(0, 0, 100, 100);
    UIButton * btn = [UIButton buttonWithType:UIButtonTypeRoundedRect];
    btn.frame = frame;
    [btn setTitle:@"Capture" forState:UIControlStateNormal];
    btn.backgroundColor = [UIColor clearColor];
    btn.tag = 1111;
    [btn addTarget:self action:@selector(onClickCaptureButton:) forControlEvents:UIControlEventTouchUpInside];
    [self.view addSubview:btn];
}

-(IBAction)onClickCaptureButton:(id)sender
{
    [self takePicture];
}

#pragma mark - Saving the image to the photo album

- (void)saveImageToPhotos:(UIImage*)savedImage
{
    UIImageWriteToSavedPhotosAlbum(savedImage, self, @selector(image:didFinishSavingWithError:contextInfo:), NULL);
}

// Completion callback for UIImageWriteToSavedPhotosAlbum
- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo
{
    NSString * msg = nil;
    if (error != nil)
    {
        msg = @"Failed to save the image";
    }
    else
    {
        msg = @"Image saved successfully";
    }

    UIAlertView * alert = [[UIAlertView alloc] initWithTitle:@"Save Result"
                                                    message:msg
                                                   delegate:self
                                          cancelButtonTitle:@"OK"
                                          otherButtonTitles:nil];
    [alert show];
}

#pragma mark - Taking the photo

// Capture a still image
- (void) takePicture
{
    // 1. Get the still image connection and fix its orientation.
    AVCaptureConnection * stillImageConnection = [self.stillImageOutput.connections objectAtIndex:0];

    if ([stillImageConnection isVideoOrientationSupported]) {
        [stillImageConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
    }

    // 2. Capture asynchronously; the handler receives a JPEG sample buffer.
    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:stillImageConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {

        if (imageDataSampleBuffer != NULL)
        {
            // Convert the sample buffer to NSData, then to UIImage
            NSData * imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage * image = [[UIImage alloc] initWithData:imageData];

            // Save the image to the photo album
            [self saveImageToPhotos:image];

        }
        else
        {
            NSLog(@"Error capturing still image %@", error);
        }
    }];

}
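To get the WYSIWYG zoom the article promises, the commented-out crop factor from initAv can be applied to the connection just before capture, so the saved photo matches the 5x preview. A sketch combining it with the capture flow above; `takeZoomedPicture` is a hypothetical variant, not part of the original post:

```objectivec
// Hypothetical variant of takePicture: apply the same 5.0 crop factor
// the preview layer uses, so the saved photo matches the zoomed preview.
- (void)takeZoomedPicture
{
    AVCaptureConnection * connection =
        [self connectionWithMediaType:AVMediaTypeVideo
                      fromConnections:[self.stillImageOutput connections]];
    if ([connection isVideoOrientationSupported]) {
        [connection setVideoOrientation:AVCaptureVideoOrientationPortrait];
    }
    // Same value as the preview layer's CGAffineTransformMakeScale(5.0, 5.0).
    [connection setVideoScaleAndCropFactor:5.0];

    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
        completionHandler:^(CMSampleBufferRef buffer, NSError *error) {
            if (buffer != NULL) {
                NSData * data = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:buffer];
                [self saveImageToPhotos:[[UIImage alloc] initWithData:data]];
            }
            else {
                NSLog(@"Error capturing still image %@", error);
            }
        }];
}
```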

@end
Posted: 2024-10-30 17:03:05
