Capturing Video Frames

Video applications typically represent a given video with a thumbnail. Here the thumbnail is generated with AVFoundation's AVAssetImageGenerator (the timestamps themselves use CoreMedia's CMTime).

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <AssetsLibrary/AssetsLibrary.h>

@interface ViewController : UIViewController<AVCaptureFileOutputRecordingDelegate>

@property (strong, nonatomic) AVCaptureSession *captureSession;
@property (strong, nonatomic) AVCaptureDeviceInput *videoInput;
@property (strong, nonatomic) AVCaptureDeviceInput *audioInput;
@property (strong, nonatomic) AVCaptureStillImageOutput *stillImageOutput;
@property (strong, nonatomic) AVCaptureMovieFileOutput *movieOutput;

@property (weak, nonatomic) IBOutlet UIButton *captureButton;
@property (weak, nonatomic) IBOutlet UISegmentedControl *modeControl;
@property (weak, nonatomic) IBOutlet UIImageView *thumbnailImageView;

- (IBAction)capture:(id)sender;
- (IBAction)updateMode:(id)sender;

@end

#import "ViewController.h"

@interface ViewController ()

@end

@implementation ViewController
@synthesize captureButton;
@synthesize modeControl;
@synthesize thumbnailImageView;

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    self.captureSession = [[AVCaptureSession alloc] init];
    //Optional: self.captureSession.sessionPreset = AVCaptureSessionPresetMedium;

    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];

    self.videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:nil];
    self.audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:nil];

    self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *stillImageOutputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:
                                              AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [self.stillImageOutput setOutputSettings:stillImageOutputSettings];

    self.movieOutput = [[AVCaptureMovieFileOutput alloc] init];

    // Setup capture session for taking pictures
    [self.captureSession addInput:self.videoInput];
    [self.captureSession addOutput:self.stillImageOutput];

    AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    UIView *aView = self.view;
    previewLayer.frame = CGRectMake(0, 70, self.view.frame.size.width, self.view.frame.size.height-140);
    [aView.layer addSublayer:previewLayer];
}

- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
{
    return (interfaceOrientation != UIInterfaceOrientationPortraitUpsideDown);
}

- (void) captureStillImage
{
    AVCaptureConnection *stillImageConnection = [self.stillImageOutput.connections objectAtIndex:0];
    if ([stillImageConnection isVideoOrientationSupported])
        [stillImageConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];

    [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:stillImageConnection
                                                         completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error)
     {
         if (imageDataSampleBuffer != NULL)
         {
             NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
             ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
             UIImage *image = [[UIImage alloc] initWithData:imageData];
             [library writeImageToSavedPhotosAlbum:[image CGImage]
                                       orientation:(ALAssetOrientation)[image imageOrientation]
                                   completionBlock:^(NSURL *assetURL, NSError *error)
              {
                  UIAlertView *alert;
                  if (!error)
                  {
                      alert = [[UIAlertView alloc] initWithTitle:@"Photo Saved"
                                                         message:@"The photo was successfully saved to your photos library"
                                                        delegate:nil
                                               cancelButtonTitle:@"OK"
                                               otherButtonTitles:nil, nil];
                  }
                  else
                  {
                      alert = [[UIAlertView alloc] initWithTitle:@"Error Saving Photo"
                                                         message:@"The photo was not saved to your photos library"
                                                        delegate:nil
                                               cancelButtonTitle:@"OK"
                                               otherButtonTitles:nil, nil];
                  }

                  [alert show];
              }
              ];
         }
         else
             NSLog(@"Error capturing still image: %@", error);
     }];
}

- (NSURL *) tempFileURL
{
    // stringByAppendingPathComponent handles the path separator correctly.
    NSString *outputPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"output.mov"];
    NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];
    NSFileManager *manager = [[NSFileManager alloc] init];
    if ([manager fileExistsAtPath:outputPath])
    {
        [manager removeItemAtPath:outputPath error:nil];
    }
    return outputURL;
}

- (IBAction)capture:(id)sender
{
    if (self.modeControl.selectedSegmentIndex == 0)
    {
        // Picture Mode
        [self captureStillImage];
    }
    else
    {
        // Video Mode
        if (self.movieOutput.isRecording == YES)
        {
            [self.captureButton setTitle:@"Capture" forState:UIControlStateNormal];
            [self.movieOutput stopRecording];
        }
        else
        {
            [self.captureButton setTitle:@"Stop" forState:UIControlStateNormal];
            [self.movieOutput startRecordingToOutputFileURL:[self tempFileURL] recordingDelegate:self];
        }
    }
}

- (IBAction)updateMode:(id)sender
{
    [self.captureSession stopRunning];
    if (self.modeControl.selectedSegmentIndex == 0)
    {
        if (self.movieOutput.isRecording == YES)
        {
            [self.movieOutput stopRecording];
        }
        // Still Image Mode
        [self.captureSession removeInput:self.audioInput];
        [self.captureSession removeOutput:self.movieOutput];
        [self.captureSession addOutput:self.stillImageOutput];
    }
    else
    {
        // Video Mode
        [self.captureSession removeOutput:self.stillImageOutput];
        [self.captureSession addInput:self.audioInput];
        [self.captureSession addOutput:self.movieOutput];

        // Set orientation of capture connections to portrait
        NSArray *array = [self.movieOutput connections];
        for (AVCaptureConnection *connection in array)
        {
            connection.videoOrientation = AVCaptureVideoOrientationPortrait;
        }
    }
    [self.captureButton setTitle:@"Capture" forState:UIControlStateNormal];

    [self.captureSession startRunning];
}

- (void)viewWillAppear:(BOOL)animated
{
    [super viewWillAppear:animated];
    [self.captureSession startRunning];
}

- (void)viewWillDisappear:(BOOL)animated
{
    [super viewWillDisappear:animated];
    [self.captureSession stopRunning];
}

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error
{
    BOOL recordedSuccessfully = YES;
    if ([error code] != noErr)
    {
        // A problem occurred: Find out if the recording was successful.
        id value = [[error userInfo] objectForKey:AVErrorRecordingSuccessfullyFinishedKey];
        if (value)
            recordedSuccessfully = [value boolValue];
        // Log the problem regardless:
        NSLog(@"A problem occurred while recording: %@", error);
    }
    if (recordedSuccessfully)
    {
        [self createThumbnailForVideoURL:outputFileURL];
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];

        [library writeVideoAtPathToSavedPhotosAlbum:outputFileURL
                                    completionBlock:^(NSURL *assetURL, NSError *error)
         {
             UIAlertView *alert;
             if (!error)
             {
                 alert = [[UIAlertView alloc] initWithTitle:@"Video Saved"
                                                    message:@"The movie was successfully saved to your photos library"
                                                   delegate:nil
                                          cancelButtonTitle:@"OK"
                                          otherButtonTitles:nil, nil];
             }
             else
             {
                 alert = [[UIAlertView alloc] initWithTitle:@"Error Saving Video"
                                                    message:@"The movie was not saved to your photos library"
                                                   delegate:nil
                                          cancelButtonTitle:@"OK"
                                          otherButtonTitles:nil, nil];
             }

             [alert show];
         }
         ];
    }
}

-(void)createThumbnailForVideoURL:(NSURL *)videoURL
{
    // AVURLAssetPreferPreciseDurationAndTimingKey expects an NSNumber boolean, not the string @"YES".
    NSDictionary *options = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES]
                                                        forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
    AVURLAsset *myAsset = [[AVURLAsset alloc] initWithURL:videoURL options:options];

    AVAssetImageGenerator *imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:myAsset];
    imageGenerator.appliesPreferredTrackTransform = YES; //Makes sure images are correctly rotated.

    Float64 durationSeconds = CMTimeGetSeconds([myAsset duration]);
    CMTime half = CMTimeMakeWithSeconds(durationSeconds/2.0, 600);
    NSArray *times = [NSArray arrayWithObjects: [NSValue valueWithCMTime:half], nil];

    [imageGenerator generateCGImagesAsynchronouslyForTimes:times
                                              completionHandler:^(CMTime requestedTime, CGImageRef image, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error)
    {
        if (result == AVAssetImageGeneratorSucceeded)
        {
            // The completion handler may run on a background thread; build the
            // UIImage here, then update the UI on the main queue.
            UIImage *thumbnail = [UIImage imageWithCGImage:image];
            dispatch_async(dispatch_get_main_queue(), ^{
                self.thumbnailImageView.image = thumbnail;
            });
        }
        else if (result == AVAssetImageGeneratorFailed)
        {
            NSLog(@"Failed with error: %@", [error localizedDescription]);
        }
    }];
}

@end
Date: 2024-08-07 17:38:11
