How to encode picture to H264 use AVFoundation on Mac, not use x264 (continued: VideoCore, the open-source VideoToolbox hardware-encoding and RTMP-streaming project mentioned there)

Source: https://github.com/jgh-/VideoCore


An audio and video manipulation pipeline

  1. C++ 58.0%
  2. Objective-C++ 32.9%
  3. Objective-C 4.2%
  4. C 2.7%
  5. Swift 1.6%
  6. Ruby 0.6%

VideoCore/ (latest commit 823ec7c on 5 Aug: Merge pull request #170 from maxcampolo/autofocus)

README.md

VideoCore

VideoCore is a project intended to be an audio and video manipulation and streaming graph. It currently works with iOS and periodic (live) sources. It is a work in progress and will eventually expand to other platforms such as OS X and Android. Contributors welcome!

Setup

CocoaPods

Create a Podfile with the contents

platform :ios, '6.0'
pod 'VideoCore', '~> 0.2.0'

Next, run pod install and open the xcworkspace file that is created.

Sample Application

The SampleBroadcaster project in the sample folder uses CocoaPods to bring in VideoCore as a dependency:

cd sample/SampleBroadcaster
pod install
open SampleBroadcaster.xcworkspace

... or you can build from the command-line:

xcodebuild -workspace SampleBroadcaster.xcworkspace -scheme SampleBroadcaster build

More on CocoaPods: http://cocoapods.org/

Architecture Overview

VideoCore's architecture is inspired by Microsoft Media Foundation (except with saner naming). Samples start at the source, are passed through a series of transforms, and end up at the output.

e.g. Source (Camera) -> Transform (Composite) -> Transform (H.264 Encode) -> Transform (RTMP Packetize) -> Output (RTMP)

videocore/
  sources/
    videocore::ISource
    videocore::IAudioSource : videocore::ISource
    videocore::IVideoSource : videocore::ISource
    videocore::Watermark : videocore::IVideoSource
    iOS/
      videocore::iOS::CameraSource : videocore::IVideoSource
    Apple/
      videocore::Apple::MicrophoneSource : videocore::IAudioSource
    OSX/
      videocore::OSX::DisplaySource : videocore::IVideoSource
      videocore::OSX::SystemAudioSource : videocore::IAudioSource
  outputs/
    videocore::IOutput
    videocore::ITransform : videocore::IOutput
    iOS/
      videocore::iOS::H264Transform : videocore::ITransform
      videocore::iOS::AACTransform : videocore::ITransform
    OSX/
      videocore::OSX::H264Transform : videocore::ITransform
      videocore::OSX::AACTransform : videocore::ITransform
    RTMP/
      videocore::rtmp::H264Packetizer : videocore::ITransform
      videocore::rtmp::AACPacketizer : videocore::ITransform
  mixers/
    videocore::IMixer
    videocore::IAudioMixer : videocore::IMixer
    videocore::IVideoMixer : videocore::IMixer
    videocore::AudioMixer : videocore::IAudioMixer
    iOS/
      videocore::iOS::GLESVideoMixer : videocore::IVideoMixer
    OSX/
      videocore::OSX::GLVideoMixer : videocore::IVideoMixer
  rtmp/
    videocore::RTMPSession : videocore::IOutput
  stream/
    videocore::IStreamSession
    Apple/
      videocore::Apple::StreamSession : videocore::IStreamSession

Version History

  • 0.3.1

    • Various bugfixes
    • Introduction of pixel buffer sources so you can add images to broadcast.
  • 0.3.0
    • Improvements to audio/video timestamps and synchronization
    • Adds an API change that is incompatible with previous versions: custom graphs must now call IMixer::start() to begin mixing.
  • 0.2.3
    • Add support for image filters
  • 0.2.2
    • Fix video streaking bug when adaptive bitrate is enabled
    • Increase the aggressiveness of the adaptive bitrate algorithm
    • Add internal pixel buffer format
  • 0.2.0
    • Removes deprecated functions
    • Adds Main Profile video
    • Improves adaptive bitrate algorithm
  • 0.1.12
    • Bugfixes
    • Red5 support
    • Improved Adaptive Bitrate algorithm
  • 0.1.10
    • Bugfixes
    • Adaptive Bitrate introduced
  • 0.1.9
    • Bugfixes, memory leak fixes
    • Introduces the ability to choose whether to use interface orientation or device orientation for Camera orientation.
  • 0.1.8
    • Introduces VideoToolbox encoding for iOS 8+ and OS X 10.9+
    • Adds -lc++ for compatibility with Xcode 6
  • 0.1.7
    • Add a simplified iOS API for the common case of streaming camera/microphone
    • Deprecate camera aspect ratio and position
    • Add a matrix transform for Position
    • Add a matrix transform for Aspect Ratio
    • Bugfixes
  • 0.1.6
    • Use device orientation for CameraSource rather than interface orientation
  • 0.1.5
    • Add aspect fill to CameraSource
  • 0.1.4
    • Switch from LGPL 2.1 to MIT licensing.
    • Add Camera preview layer.
    • Add front/back camera toggle.
    • Fix aspect ratio bug in Camera source.
  • 0.1.3
    • Update sample app with a more efficient viewport render
  • 0.1.2
    • Fixes a serious bug in the GenericAudioMixer that was causing 100% cpu usage and audio lag.
  • 0.1.1
    • Fixes Cocoapods namespace conflicts for UriParser-cpp
  • 0.1.0
    • Initial CocoaPods version
Posted: 2024-08-12 21:13:14

Related articles

How to encode picture to H264 use AVFoundation on Mac, not use x264

Source: http://stackoverflow.com/questions/29502563/how-to-encode-picture-to-h264-use-avfoundation-on-mac-not-use-x264 "I'm trying to make a Mac broadcast cl

Live video streaming techniques (2): H264 hardware encoding for real-time video

1. Hard encoding vs. soft encoding. Hard encoding: recording video with Android's built-in Camera actually invokes the underlying HD encoding hardware module (the graphics hardware) rather than the CPU, so it is fast. Soft encoding: encoding on the CPU, e.g. ordinary C/C++ code compiled into a binary; relatively slow. An example is compiling H264 with the Android NDK into a .so library, writing a JNI interface, and calling the library from Java. 2. Hard-encoding process and principle. Process: capture video with MediaRecorder, then map the video stream onto a LocalSocket for sending and receiving. Principle: see [Streaming Media] "H264-MP4 format and, in MP4 fi

Part 60: audio/video capture with hardware encoding (H264 + AAC)

Capture audio and video (YUV) in real time with AVCaptureSession, obtain the audio/video sample buffers via AVCaptureVideoDataOutputSampleBufferDelegate, encode the raw audio and video data separately, then transmit. ViewController // // ViewController.h // H264AACEncode // // Created by ZhangWen on 15/10/14. // Copyright © 2015 Zhangwen. All ri

iOS: hardware-encoding H264 with VideoToolbox

Preface: VideoToolbox is Apple's API, available since iOS 8, for hardware decoding and encoding of H264/H265 (H265 is supported from iOS 11). If you are not yet familiar with H264, read an H264 primer first. Encoding flow: we implement a simple demo that grabs video data from the camera, encodes it to raw H264, and saves it in the sandbox. 1. Create and initialize VideoToolbox. The core code is as follows: - (void)initVideoToolBox { dispatch_sync(encodeQueue , ^{ frameNO = 0; int wid

iOS H264 video-stream hardware encoding: dissecting LFLiveKit

#import "LFHardwareVideoEncoder.h" #import <VideoToolbox/VideoToolbox.h> @interface LFHardwareVideoEncoder (){ VTCompressionSessionRef compressionSession; // the encoder session NSInteger frameCount; // frame count (used to place keyframes) NSData *sps; NSData *pps; FILE *fp; BOOL

Hardware-encoding H.264 with VideoToolbox (repost)

By 落影loyinglin (Jianshu author). Original link: http://www.jianshu.com/p/37784e363b8a. All rights reserved by the author; contact the author for reprint permission and credit the Jianshu author. =========================================== Using VideoToolbox to hardware-encode H.264. Preface: H.264 is currently a very popular video compression format. Our project's protocol layer uses rtmp and http, but the video encoding layer is H.264 in both cases. While getting familiar with H.264, to understand it better, we tri

Android streaming-media hardware encoding [code part]

Reposted from: http://www.apkbus.com/blog-86476-43829.html. The previous article covered the approach and a hex-level file analysis; this one implements the code. It has not yet been tested on a real phone; Android 4.0+ emulators can use a simulated camera, or a so-called webcam (really just the laptop camera). Afterwards, under the app's data/data/edu.ustb.videoencoder/ directory there will be h264.3gp, sps (the SPS data), pps (the PPS data), media.xml (the located mdat position), /sdcar

A live-streaming project walkthrough

Project download link. Project structure: Login: the login page integrates Umeng third-party login (WeChat and QQ); Sina Weibo sign-in goes through Sina's official OAuth request, plus some resources the login flow needs. Main: mainly contains the UITabBarController, the UINavigationController, the request utility class XLLiveTool, the business-logic class XLDealData, delegate classes, the .pch file and singleton headers, all globally usable. Home: the home page, with three parts, Hot, Latest and Following, added to (XLHomeV

Notes on H264 hardware video encoding and decoding on iOS 8

Original article link. Because of a company project I got some exposure to H264 video-stream encoding and decoding. The project previously used the FFmpeg multimedia library, doing video encoding and decoding on the CPU, commonly called soft encoding/decoding. That approach is portable, but it occupies CPU resources and is not very efficient. Systems generally provide a GPU or a dedicated processor for video-stream codec work, that is, hardware encoding and decoding, "hard codec" for short. Before iOS 8.0 Apple did not open up the system's hardware encode/decode capability, although Mac OS had long had it in a framework called Video Toolbox; finally, with iOS 8.0, Apple brought that framework to iOS.