Obtaining the iPhone Camera Device

Reposted from http://blog.csdn.net/linzhiji/article/details/6730693

Goal: open and close the front camera, draw the preview image, and obtain the camera's raw binary frame data. Required frameworks: AVFoundation.framework, CoreVideo.framework, CoreMedia.framework, and QuartzCore.framework. Camera capture only builds and runs on a real device; it does not work in the simulator.
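One caveat not in the original 2011 post: on current iOS the app must also declare NSCameraUsageDescription in Info.plist and obtain camera permission before the session will deliver frames. A minimal hedged sketch:

// Hedged modern addition (iOS 7+), not part of the original post:
// request camera access before calling startVideoCapture.
[AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo
                         completionHandler:^(BOOL granted) {
    if (granted) {
        dispatch_async(dispatch_get_main_queue(), ^{
            [self startVideoCapture];
        });
    }
}];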

Method overview

- (void)createControl; — creates the UI controls.

- (AVCaptureDevice *)getFrontCamera; — returns the front-facing camera device.
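The implementation below iterates devicesWithMediaType:, which was the API at the time; on iOS 10 and later that call is deprecated. A hedged modern equivalent, for reference only:

// Hedged modern equivalent (iOS 10+), not in the original post:
AVCaptureDevice *front =
    [AVCaptureDevice defaultDeviceWithDeviceType:AVCaptureDeviceTypeBuiltInWideAngleCamera
                                       mediaType:AVMediaTypeVideo
                                        position:AVCaptureDevicePositionFront];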

- (void)startVideoCapture; — opens the camera and starts capturing. The key part:

AVCaptureVideoPreviewLayer* previewLayer = [AVCaptureVideoPreviewLayer layerWithSession: self->avCaptureSession];

previewLayer.frame = localView.bounds;

previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;

[self->localView.layer addSublayer: previewLayer];

This attaches the camera preview to the UIView.

- (void)stopVideoCapture:(id)arg; — closes the camera and stops capturing. The key part:

for (UIView *view in self->localView.subviews) { [view removeFromSuperview]; }

This removes the views inside localView (but see the caveat below). The full code follows and can be copied and used directly.
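Caveat: the preview is attached with addSublayer:, i.e. as a CALayer, not a subview, so the subview loop above does not actually detach it. A hedged fix is to clear the sublayers instead (assuming the preview layer is the only sublayer you added):

// Remove the preview layer from the layer tree; removing subviews
// does not touch layers added with addSublayer:.
NSArray *sublayers = [self->localView.layer.sublayers copy];
for (CALayer *layer in sublayers) {
    [layer removeFromSuperlayer];
}
[sublayers release]; // manual-reference-counting era code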

Code — the header file:

//
//  AVCallController.h
//  Pxlinstall
//
//  Created by Lin Charlie C. on 11-3-24.
//  Copyright 2011  xxxx. All rights reserved.
//  

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>  

@interface AVCallController : UIViewController <AVCaptureVideoDataOutputSampleBufferDelegate>
{
    // UI
    UILabel *labelState;
    UIButton *btnStartVideo;
    UIView *localView;

    AVCaptureSession *avCaptureSession;
    AVCaptureDevice *avCaptureDevice;
    BOOL firstFrame;   // whether this is the first frame
    int producerFps;
}

@property (nonatomic, retain) AVCaptureSession *avCaptureSession;
@property (nonatomic, retain) UILabel *labelState;

- (void)createControl;
- (AVCaptureDevice *)getFrontCamera;
- (void)startVideoCapture;
- (void)stopVideoCapture:(id)arg;  

@end  

Implementation file:

//
//  AVCallController.m
//  Pxlinstall
//
//  Created by Lin Charlie C. on 11-3-24.
//  Copyright 2011 高鸿移通. All rights reserved.
//

#import "AVCallController.h"

@implementation AVCallController

@synthesize avCaptureSession;
@synthesize labelState;

// The designated initializer. Override if you create the controller programmatically
// and want to perform customization that is not appropriate for viewDidLoad.
/*
- (id)initWithNibName:(NSString *)nibNameOrNil bundle:(NSBundle *)nibBundleOrNil {
    self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil];
    if (self) {
        // Custom initialization.
    }
    return self;
}
*/
- (id)init
{
    if (self = [super init])
    {
        firstFrame = YES;
        producerFps = 50;
    }
    return self;
}

// Implement loadView to create a view hierarchy programmatically, without using a nib.
- (void)loadView {
    [super loadView];
    [self createControl];
}

/*
// Implement viewDidLoad to do additional setup after loading the view, typically from a nib.
- (void)viewDidLoad {
    [super viewDidLoad];
}
*/

/*
// Override to allow orientations other than the default portrait orientation.
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation {
    // Return YES for supported orientations.
    return (interfaceOrientation == UIInterfaceOrientationPortrait);
}
*/

- (void)didReceiveMemoryWarning {
    // Releases the view if it doesn't have a superview.
    [super didReceiveMemoryWarning];

    // Release any cached data, images, etc. that aren't in use.
}

- (void)viewDidUnload {
    [super viewDidUnload];
    // Release any retained subviews of the main view.
    // e.g. self.myOutlet = nil;
}

- (void)dealloc {
    [avCaptureSession release]; // balance the alloc in startVideoCapture (MRC)
    [super dealloc];
}

#pragma mark -
#pragma mark createControl
- (void)createControl
{
    // Build the UI
    self.view.backgroundColor = [UIColor grayColor];
    labelState = [[UILabel alloc] initWithFrame:CGRectMake(10, 20, 220, 30)];
    labelState.backgroundColor = [UIColor clearColor];
    [self.view addSubview:labelState];
    [labelState release];

    btnStartVideo = [[UIButton alloc] initWithFrame:CGRectMake(20, 350, 80, 50)];
    [btnStartVideo setTitle:@"Start" forState:UIControlStateNormal];
    [btnStartVideo setBackgroundImage:[UIImage imageNamed:@"Images/button.png"] forState:UIControlStateNormal];
    [btnStartVideo addTarget:self action:@selector(startVideoCapture) forControlEvents:UIControlEventTouchUpInside];
    [self.view addSubview:btnStartVideo];
    [btnStartVideo release];

    UIButton *stop = [[UIButton alloc] initWithFrame:CGRectMake(120, 350, 80, 50)];
    [stop setTitle:@"Stop" forState:UIControlStateNormal];
    [stop setBackgroundImage:[UIImage imageNamed:@"Images/button.png"] forState:UIControlStateNormal];
    [stop addTarget:self action:@selector(stopVideoCapture:) forControlEvents:UIControlEventTouchUpInside];
    [self.view addSubview:stop];
    [stop release];

    localView = [[UIView alloc] initWithFrame:CGRectMake(40, 50, 200, 300)];
    [self.view addSubview:localView];
    [localView release];
}
#pragma mark -
#pragma mark VideoCapture
- (AVCaptureDevice *)getFrontCamera
{
    // Return the front-facing camera, falling back to the default camera.
    NSArray *cameras = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in cameras)
    {
        if (device.position == AVCaptureDevicePositionFront)
            return device;
    }
    return [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
}
- (void)startVideoCapture
{
    // Open the camera device and start capturing frames.
    [labelState setText:@"Starting Video stream"];
    if (self->avCaptureDevice || self->avCaptureSession)
    {
        [labelState setText:@"Already capturing"];
        return;
    }

    if ((self->avCaptureDevice = [self getFrontCamera]) == nil)
    {
        [labelState setText:@"Failed to get valid capture device"];
        return;
    }

    NSError *error = nil;
    AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:self->avCaptureDevice error:&error];
    if (!videoInput)
    {
        [labelState setText:@"Failed to get video input"];
        self->avCaptureDevice = nil;
        return;
    }

    self->avCaptureSession = [[AVCaptureSession alloc] init];
    self->avCaptureSession.sessionPreset = AVCaptureSessionPresetLow;
    [self->avCaptureSession addInput:videoInput];

    // Currently, the only supported key is kCVPixelBufferPixelFormatTypeKey. Recommended pixel format choices are
    // kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange or kCVPixelFormatType_32BGRA.
    // On iPhone 3G, the recommended pixel format choices are kCVPixelFormatType_422YpCbCr8 or kCVPixelFormatType_32BGRA.
    AVCaptureVideoDataOutput *avCaptureVideoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    NSDictionary *settings = [[NSDictionary alloc] initWithObjectsAndKeys:
        //[NSNumber numberWithUnsignedInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange], (id)kCVPixelBufferPixelFormatTypeKey,
        [NSNumber numberWithInt:240], (id)kCVPixelBufferWidthKey,
        [NSNumber numberWithInt:320], (id)kCVPixelBufferHeightKey,
        nil];
    avCaptureVideoDataOutput.videoSettings = settings;
    [settings release];
    // minFrameDuration was current API in 2011; it is deprecated in later iOS
    // (see activeVideoMinFrameDuration on AVCaptureDevice).
    avCaptureVideoDataOutput.minFrameDuration = CMTimeMake(1, self->producerFps);

    // A serial queue to handle the processing of our frames.
    dispatch_queue_t queue = dispatch_queue_create("org.doubango.idoubs", NULL);
    [avCaptureVideoDataOutput setSampleBufferDelegate:self queue:queue];
    [self->avCaptureSession addOutput:avCaptureVideoDataOutput];
    [avCaptureVideoDataOutput release];
    dispatch_release(queue);

    AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self->avCaptureSession];
    previewLayer.frame = localView.bounds;
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self->localView.layer addSublayer:previewLayer];

    self->firstFrame = YES;
    [self->avCaptureSession startRunning];

    [labelState setText:@"Video capture started"];
}
- (void)stopVideoCapture:(id)arg
{
    // Stop capturing.
    if (self->avCaptureSession) {
        [self->avCaptureSession stopRunning];
        [self->avCaptureSession release]; // balance the alloc in startVideoCapture (MRC)
        self->avCaptureSession = nil;
        [labelState setText:@"Video capture stopped"];
    }
    self->avCaptureDevice = nil;
    // Remove the contents of localView.
    // (Note: the preview layer itself is a sublayer, not a subview — see the caveat above.)
    for (UIView *view in self->localView.subviews) {
        [view removeFromSuperview];
    }
}
#pragma mark -
#pragma mark AVCaptureVideoDataOutputSampleBufferDelegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    // Frame data arrives here; process it however you like.
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    /* Lock the buffer */
    if (CVPixelBufferLockBaseAddress(pixelBuffer, 0) == kCVReturnSuccess)
    {
        // The raw frame bytes live here while the buffer is locked.
        UInt8 *bufferPtr = (UInt8 *)CVPixelBufferGetBaseAddress(pixelBuffer);
        size_t bufferSize = CVPixelBufferGetDataSize(pixelBuffer);

        if (self->firstFrame)
        {
            // On the first frame, record the width, height, and pixel format.
            size_t width = CVPixelBufferGetWidth(pixelBuffer);
            size_t height = CVPixelBufferGetHeight(pixelBuffer);
            NSLog(@"Capture dimensions=%zux%zu", width, height);

            OSType pixelFormat = CVPixelBufferGetPixelFormatType(pixelBuffer);
            switch (pixelFormat) {
                case kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange:
                    //TMEDIA_PRODUCER(producer)->video.chroma = tmedia_nv12; // iPhone 3GS or 4
                    NSLog(@"Capture pixel format=NV12");
                    break;
                case kCVPixelFormatType_422YpCbCr8:
                    //TMEDIA_PRODUCER(producer)->video.chroma = tmedia_uyvy422; // iPhone 3
                    NSLog(@"Capture pixel format=UYVY422");
                    break;
                default:
                    //TMEDIA_PRODUCER(producer)->video.chroma = tmedia_rgb32;
                    NSLog(@"Capture pixel format=RGB32");
                    break;
            }

            self->firstFrame = NO;
        }
        /* Unlock the buffer */
        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    }
    /* We create an autorelease pool because we are not on the main queue, so our
       code is not executed on the main thread. */
    // NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    //
    // CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // /* Lock the image buffer */
    // CVPixelBufferLockBaseAddress(imageBuffer, 0);
    // /* Get information about the image */
    // uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    // size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // size_t width = CVPixelBufferGetWidth(imageBuffer);
    // size_t height = CVPixelBufferGetHeight(imageBuffer);
    //
    // /* Create a CGImageRef from the CVImageBufferRef */
    // CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    // CGImageRef newImage = CGBitmapContextCreateImage(newContext);
    //
    // /* Release some components */
    // CGContextRelease(newContext);
    // CGColorSpaceRelease(colorSpace);
    //
    // /* Display the result on the custom layer. All display work must be done on the
    //    main thread because UIKit is not thread safe, and we are not on the main thread
    //    (remember we didn't use the main queue), so we use performSelectorOnMainThread
    //    to hand the CGImage to our CALayer. */
    // [self.customLayer performSelectorOnMainThread:@selector(setContents:) withObject:(id)newImage waitUntilDone:YES];
    //
    // /* Display the result on the image view (we need to change the orientation of the
    //    image so that the video is displayed correctly). Same as for the CALayer: we are
    //    not on the main thread, so... */
    // UIImage *image = [UIImage imageWithCGImage:newImage scale:1.0 orientation:UIImageOrientationRight];
    //
    // /* Release the CGImageRef */
    // CGImageRelease(newImage);
    //
    // [self.imageView performSelectorOnMainThread:@selector(setImage:) withObject:image waitUntilDone:YES];
    //
    // /* Unlock the image buffer */
    // CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    //
    // [pool drain];
}
        @end 
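The stated goal includes getting the camera's binary data, but the delegate above only locks the buffer and records its properties. As a hedged illustration (the helper name frameDataFromSampleBuffer is hypothetical, and this assumes the NV12/BGRA formats named above), one way to copy the raw bytes into an NSData:

// Hypothetical helper, a sketch only: copy the raw frame bytes out of a
// sample buffer. For planar formats (e.g. NV12) each plane is copied whole,
// so rows may carry trailing padding (bytesPerRow can exceed the visible
// width); strip it if a consumer needs tightly packed data.
static NSData *frameDataFromSampleBuffer(CMSampleBufferRef sampleBuffer)
{
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (CVPixelBufferLockBaseAddress(pixelBuffer, 0) != kCVReturnSuccess)
        return nil;

    NSMutableData *data = [NSMutableData data];
    if (CVPixelBufferIsPlanar(pixelBuffer)) {
        size_t planeCount = CVPixelBufferGetPlaneCount(pixelBuffer);
        for (size_t p = 0; p < planeCount; p++) {
            const void *base = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, p);
            size_t bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, p);
            size_t height = CVPixelBufferGetHeightOfPlane(pixelBuffer, p);
            [data appendBytes:base length:bytesPerRow * height];
        }
    } else {
        // Packed formats (e.g. 32BGRA): one contiguous block.
        [data appendBytes:CVPixelBufferGetBaseAddress(pixelBuffer)
                   length:CVPixelBufferGetDataSize(pixelBuffer)];
    }
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    return data;
}

Called from inside captureOutput:didOutputSampleBuffer:fromConnection:, the returned autoreleased NSData can then be handed to an encoder or a network layer.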