GStreamer

Ogg Vorbis is a completely open, patent-free, professional audio encoding and streaming technology
Copy-on-write (COW), sometimes referred to as implicit sharing[1] or shadowing,[2]
is a resource-management technique used in computer programming to efficiently implement a "duplicate" or "copy" operation on modifiable resources.[3]
If a resource is duplicated but not modified, it is not necessary to create a new resource; the resource can be shared between the copy and the original.
Modifications must still create a copy, hence the technique: the copy operation is deferred to the first write. By sharing resources in this way,
it is possible to significantly reduce the resource consumption of unmodified copies, while adding a small overhead to resource-modifying operations.
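
GStreamer itself applies copy-on-write to buffers and caps: a "copy" is just an extra reference, and gst_buffer_make_writable() performs the real copy only when the data is shared. A minimal sketch (assuming gst_init() has already been called):

#include <string.h>
#include <gst/gst.h>

static void
cow_example (void)
{
  GstBuffer *buf = gst_buffer_new_allocate (NULL, 1024, NULL);
  GstBuffer *shared = gst_buffer_ref (buf);   /* "copy" = new reference only */
  GstMapInfo map;

  /* First write: the buffer is shared, so make_writable copies it now. */
  shared = gst_buffer_make_writable (shared);

  if (gst_buffer_map (shared, &map, GST_MAP_WRITE)) {
    memset (map.data, 0, map.size);           /* modify the private copy */
    gst_buffer_unmap (shared, &map);
  }

  gst_buffer_unref (shared);
  gst_buffer_unref (buf);
}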

Pads are an element's inputs and outputs, where you can connect other elements.
Pads have specific data-handling capabilities. Data types are negotiated between pads using a process called caps negotiation
A bin is a container for a collection of elements
A pipeline is a top-level bin. It provides a bus for the application and manages the synchronization for its children
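
A minimal sketch of looking at an element's pads and their capabilities (the element name audiotestsrc is only an example choice, and gst_init() is assumed to have been called):

#include <gst/gst.h>

static void
inspect_src_pad (void)
{
  GstElement *src = gst_element_factory_make ("audiotestsrc", NULL);
  GstPad *srcpad = gst_element_get_static_pad (src, "src");  /* output pad */

  /* The caps describe what data this pad can handle; when two pads are
   * linked, caps negotiation picks a format both sides agree on. */
  GstCaps *caps = gst_pad_query_caps (srcpad, NULL);
  gchar *desc = gst_caps_to_string (caps);
  g_print ("src pad caps: %s\n", desc);

  g_free (desc);
  gst_caps_unref (caps);
  gst_object_unref (srcpad);
  gst_object_unref (src);
}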

buffers are objects for passing streaming data between elements in the pipeline.
events are objects sent between elements or from the application to elements
messages are objects posted by elements on the pipeline's message bus
queries allow applications to request information such as duration or current playback position from the pipeline (always synchronous)
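
For example, a position/duration query issued by the application (a sketch; pipeline is assumed to be an existing, playing GstElement):

#include <gst/gst.h>

static void
print_progress (GstElement *pipeline)
{
  gint64 pos = 0, dur = 0;

  /* Queries are answered synchronously: the call returns with the result. */
  if (gst_element_query_position (pipeline, GST_FORMAT_TIME, &pos) &&
      gst_element_query_duration (pipeline, GST_FORMAT_TIME, &dur)) {
    g_print ("%" GST_TIME_FORMAT " / %" GST_TIME_FORMAT "\n",
        GST_TIME_ARGS (pos), GST_TIME_ARGS (dur));
  }
}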

gst_init has to be called from the main application. This call will perform the necessary initialization of the library
A GstElement object is created from a factory.
The gst-inspect tool lists the available elements and prints the details of each element factory.
Element factories are the basic types retrieved from the GStreamer registry; they describe all the plugins and elements that GStreamer can create
adding an element to a bin will disconnect any already existing links. Also, you cannot directly link elements that are not in the same bin or pipeline
When you set a bin or pipeline to a certain target state, it will usually propagate the state change to all elements within the bin or pipeline automatically
The bin you add an element to will take ownership of that element. If you destroy the bin, the element will be dereferenced with it.
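
Putting the above together, a minimal sketch of building and running a pipeline (the element names videotestsrc and autovideosink are example choices):

#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  gst_init (&argc, &argv);              /* initialize the library */

  GstElement *pipeline = gst_pipeline_new ("example");
  GstElement *src  = gst_element_factory_make ("videotestsrc", "src");
  GstElement *sink = gst_element_factory_make ("autovideosink", "sink");

  /* The pipeline (a bin) takes ownership of the elements added to it. */
  gst_bin_add_many (GST_BIN (pipeline), src, sink, NULL);

  /* Elements can only be linked once they are in the same bin/pipeline. */
  if (!gst_element_link (src, sink))
    g_printerr ("Failed to link elements\n");

  /* The state change is propagated to all elements in the pipeline. */
  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* ... run a main loop or wait on the bus here ... */

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);          /* also unrefs the owned elements */
  return 0;
}
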
playbin and uridecodebin are convenience elements that automatically build a complete playback or decoding pipeline from a URI.
A bus is a simple system that takes care of forwarding messages from the streaming threads to an application in its own thread context
For a pipeline, GStreamer will create at least one extra thread; the bus is what passes messages from those streaming threads to the application.
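
A minimal sketch of waiting for messages on the bus from the application thread, using playbin (the URI is a placeholder):

#include <gst/gst.h>

static void
play_uri (const gchar *uri)
{
  GstElement *playbin = gst_element_factory_make ("playbin", NULL);
  g_object_set (playbin, "uri", uri, NULL);
  gst_element_set_state (playbin, GST_STATE_PLAYING);

  /* Block until an error or end-of-stream message is posted on the bus. */
  GstBus *bus = gst_element_get_bus (playbin);
  GstMessage *msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      GST_MESSAGE_ERROR | GST_MESSAGE_EOS);

  if (msg != NULL)
    gst_message_unref (msg);
  gst_object_unref (bus);

  gst_element_set_state (playbin, GST_STATE_NULL);
  gst_object_unref (playbin);
}
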
Caps are called simple caps when they contain only one structure, and fixed caps when they contain only one structure and have no variable field types
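
A sketch of building caps by hand: one structure makes them simple caps, and since none of the fields is a range or list they are also fixed:

#include <gst/gst.h>

static GstCaps *
make_fixed_audio_caps (void)
{
  GstCaps *caps = gst_caps_new_simple ("audio/x-raw",
      "format",   G_TYPE_STRING, "S16LE",
      "rate",     G_TYPE_INT,    44100,
      "channels", G_TYPE_INT,    2,
      NULL);

  /* One structure, no variable fields: simple and fixed. */
  g_print ("fixed: %d\n", gst_caps_is_fixed (caps));
  return caps;
}
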
Events are control particles that are sent both upstream and downstream in a pipeline along with buffers. Buffers contain the actual media data.

Seeking is done using the concept of events
Querying is defined as requesting a specific stream property related to progress tracking.

Internally, queries will be sent to the sinks and “dispatched” backwards until one element can handle them.
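
For example, a flushing seek to 30 seconds sent from the application (a sketch; pipeline is assumed to be in PAUSED or PLAYING):

#include <gst/gst.h>

static void
seek_to_30s (GstElement *pipeline)
{
  /* gst_element_seek_simple() builds and sends the seek event for us. */
  if (!gst_element_seek_simple (pipeline, GST_FORMAT_TIME,
          GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT, 30 * GST_SECOND))
    g_printerr ("Seek failed\n");
}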

After the pipeline is prerolled, it will continue to the requested state (PAUSED or PLAYING).
GStreamer makes a distinction between two types of metadata. The first is stream tags, which describe the content of a stream in a non-technical way. Examples include the author of a song, the title of that very same song or the album it is a part of.
The other type of metadata is stream-info, which is a somewhat technical description of the properties of a stream. This can include video size, audio samplerate, codecs used and so on.

Stream information can most easily be read by reading it from a GstPad. This has already been discussed before in Using capabilities for metadata. Therefore, we will skip it here.
Note that this requires access to all pads of which you want stream information.

Tag reading is done through a bus in GStreamer, which has been discussed previously in Bus. You can listen for GST_MESSAGE_TAG messages and handle them as you wish
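
A sketch of a tag handler for such messages (msg is assumed to be a message taken from the bus):

#include <gst/gst.h>

static void
handle_tag_message (GstMessage *msg)
{
  if (GST_MESSAGE_TYPE (msg) != GST_MESSAGE_TAG)
    return;

  GstTagList *tags = NULL;
  gchar *title = NULL;

  gst_message_parse_tag (msg, &tags);
  if (gst_tag_list_get_string (tags, GST_TAG_TITLE, &title)) {
    g_print ("Title: %s\n", title);
    g_free (title);
  }
  gst_tag_list_unref (tags);
}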

GStreamer uses a GstClock object, buffer timestamps and a SEGMENT event to synchronize streams in a pipeline as we will see in the next sections.
Synchronization is now a matter of making sure that a buffer with a certain running-time is played when the clock reaches the same running-time.
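
As a sketch of the bookkeeping involved (segment, buf, clock and element are assumed to exist inside an element that has already received a SEGMENT event):

#include <gst/gst.h>

static void
compare_running_times (GstSegment *segment, GstBuffer *buf,
    GstClock *clock, GstElement *element)
{
  /* Running-time of the buffer, derived from its timestamp and the
   * SEGMENT event that preceded it. */
  GstClockTime buffer_rt = gst_segment_to_running_time (segment,
      GST_FORMAT_TIME, GST_BUFFER_PTS (buf));

  /* Running-time of the clock: absolute clock time minus base-time. */
  GstClockTime clock_rt =
      gst_clock_get_time (clock) - gst_element_get_base_time (element);

  /* The buffer should be rendered when clock_rt reaches buffer_rt. */
  if (clock_rt >= buffer_rt)
    g_print ("buffer is due for rendering\n");
}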

Latency compensation
Before the pipeline goes to the PLAYING state, it will, in addition to selecting a clock and calculating a base-time, calculate the latency in the pipeline.
It does this by doing a LATENCY query on all the sinks in the pipeline. The pipeline then selects the maximum latency in the pipeline and configures this with a LATENCY event.

All sink elements will delay playback by the value in the LATENCY event. Since all sinks delay by the same amount of time, they stay in sync relative to each other.
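
A sketch of inspecting the configured latency from the application with a LATENCY query (pipeline is assumed to be a running, possibly live pipeline):

#include <gst/gst.h>

static void
print_latency (GstElement *pipeline)
{
  GstQuery *query = gst_query_new_latency ();

  if (gst_element_query (pipeline, query)) {
    gboolean live;
    GstClockTime min_latency, max_latency;

    gst_query_parse_latency (query, &live, &min_latency, &max_latency);
    g_print ("live: %d, min latency: %" GST_TIME_FORMAT "\n",
        live, GST_TIME_ARGS (min_latency));
  }
  gst_query_unref (query);
}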

The purpose of buffering is to accumulate enough data in a pipeline so that playback can occur smoothly and without interruptions.
In the buffering state, the application should keep the pipeline in the PAUSED state
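
A sketch of handling the buffering messages that drive this (msg is a GST_MESSAGE_BUFFERING message popped from the bus):

#include <gst/gst.h>

static void
handle_buffering (GstElement *pipeline, GstMessage *msg)
{
  gint percent = 0;

  gst_message_parse_buffering (msg, &percent);

  /* Stay PAUSED while buffering; resume only when the buffer is full. */
  if (percent < 100)
    gst_element_set_state (pipeline, GST_STATE_PAUSED);
  else
    gst_element_set_state (pipeline, GST_STATE_PLAYING);
}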

An element can, for example, choose to start a thread that pulls from its sink pad and/or pushes data out on its source pad.

What will happen in any case is that some elements will start a thread for their data processing, called the “streaming threads”. The streaming threads, or GstTask objects, are created from a GstTaskPool when the element needs to make a streaming thread
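
A sketch of how a plugin element would start such a streaming thread on its source pad (srcpad and the loop function are placeholders for real plugin code):

#include <gst/gst.h>

static void
loop_func (gpointer user_data)
{
  /* Called repeatedly from the streaming thread: produce or pull one
   * piece of data here and push it downstream. */
}

static void
start_streaming (GstPad *srcpad)
{
  /* Creates and starts a GstTask whose thread keeps calling loop_func. */
  gst_pad_start_task (srcpad, loop_func, NULL, NULL);
}

static void
stop_streaming (GstPad *srcpad)
{
  gst_pad_stop_task (srcpad);
}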
