ffmpeg, gstreamer, Raspberry Pi, Windows Desktop streaming
http://blog.pi3g.com/2013/08/ffmpeg-gstreamer-raspberry-pi-windows-desktop-streaming/
This is still a work in progress with unsatisfactory results (image quality, delay, very low frame rate), but here it is for the brave-hearted and those researching in the same direction:
Set up Windows streaming host
This can be a multi-monitor machine. Your left-most monitor will be streamed.
I generally use FullHD resolution for testing.
- Install a DirectShow screen capture filter for Windows. We used the filter provided with “Screen Capturer Recorder” by Roger D Pack. Roger also includes an audio DirectShow capturer. And all free of charge – a real bargain
- A reboot may be necessary at this point
- Install the latest version of ffmpeg from Zeranoe. Opt for the static builds (probably 64 bit if you are running a modern Windows 64 bit OS on a modern computer)
- Extract the download to a safe location
- Open PowerShell and navigate to that location, e.g. as shown below
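A minimal example of the last step, assuming the build was extracted to C:\ffmpeg (the path is an assumption, adjust it to wherever you extracted the archive); the version check simply confirms that the binary runs:
cd C:\ffmpeg\bin
.\ffmpeg -version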
List the available screen filter devices:
This and all following shell commands are issued in PowerShell.
.\ffmpeg -list_devices true -f dshow -i dummy
This will show you the available input devices to capture from. My list looks like this, for instance:
DirectShow video devices
  "Integrated Webcam"
  "screen-capture-recorder"
DirectShow audio devices
  "Microphone (2- High Definition Audio Device)"
  "virtual-audio-capturer"
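If you want to check which resolutions and frame rates a capture device offers before streaming, the dshow input can also list a device’s options (the device name must match one from the list above):
.\ffmpeg -list_options true -f dshow -i video="screen-capture-recorder"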
Start the stream:
.\ffmpeg -f dshow -i video="screen-capture-recorder" -vcodec libx264 -vprofile baseline -preset ultrafast -tune zerolatency -pix_fmt yuv420p -b:v 400k -r 30 -threads 4 -fflags nobuffer -f rtp rtp://192.168.1.14:1234
Since this is started from PowerShell, the .\ prefix is needed to run an application from the current folder.
- libx264 is used as video codec, rather than mpeg4 (for superior quality – the Raspberry Pi is capable of H.264 hardware decoding)
- The baseline profile needs to be used together with -pix_fmt yuv420p – this basically restricts the encoding to a simple subset of the full standard. Leaving out these two options led to the streaming not working, but you may be able to figure out something – please comment!
- -preset ultrafast and -tune zerolatency both speed up the video output. I have a latency of about 1–2 seconds in our lab here
- -b:v 400k sets the target video bitrate to 400 kbit/s (as a variable bitrate, not a hard cap)
- -r 30 sets the frame rate to 30 fps
- -threads 4 gives ffmpeg more encoding threads
- -fflags nobuffer should decrease latency even further. Not sure if it does, though.
- -f rtp specifies the output format. Here we use RTP and stream directly to the Raspberry Pi, which has the IP 192.168.1.14 on our network. You can choose whatever you like for the port; by an odd coincidence we chose 1234. Aliens?!?
Hit “Enter” and ffmpeg will start streaming. It will show you handy statistics – current frame number, framerate, quality, total size, total time, current bitrate, duplicated capture-frames, dropped capture-frames (i.e. the capturing rate does not align with the streaming rate). Do not worry too much about those for now.
Please note that you need some horsepower for capturing, encoding and streaming in real-time.
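If your machine cannot keep up with FullHD in real time, one untested workaround is to scale the captured frames down before encoding, e.g. to 720p (the 1280:720 value here is an arbitrary assumption), which reduces the encoding load considerably:
.\ffmpeg -f dshow -i video="screen-capture-recorder" -vf scale=1280:720 -vcodec libx264 -vprofile baseline -preset ultrafast -tune zerolatency -pix_fmt yuv420p -b:v 400k -r 30 -threads 4 -fflags nobuffer -f rtp rtp://192.168.1.14:1234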
Set up Raspberry Pi
omxplayer can’t handle RTP streams directly – thus, we resort to GStreamer.
GStreamer 1.0 includes special support for the Raspberry Pi’s Broadcom SoC VideoCore IV hardware video functions (also known as OpenMAX). Unfortunately, the Raspbian maintainers do not want to include it (yet), in order not to diverge too far from the official Debian repositories.
Luckily for you, though, someone has precompiled the binaries and set up a repository. See this thread for more background information, or simply follow my instructions:
sudo nano /etc/apt/sources.list
This will open nano to edit your package repository list. Please add the following line into this file:
deb http://vontaene.de/raspbian-updates/ . main
After saving the file (Ctrl + O, Ctrl + X), run the following commands:
sudo aptitude update
sudo aptitude install libgstreamer1.0-0-dbg gstreamer1.0-tools libgstreamer-plugins-base1.0-0 gstreamer1.0-plugins-good gstreamer1.0-plugins-bad-dbg gstreamer1.0-omx gstreamer1.0-alsa
This will install the necessary GStreamer 1.0 components.
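To verify that the OpenMAX decoder plugin was picked up correctly (this only inspects the installation, it does not play anything), you can run:
gst-inspect-1.0 omxh264dec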
Start the stream receiver & decoder chain:
gst-launch-1.0 -v udpsrc port=1234 caps='application/x-rtp,payload=(int)96,encoding-name=(string)H264' ! queue ! rtph264depay ! h264parse ! omxh264dec ! autovideosink sync=True
This can be done as user pi. Please note that this may not be the perfect command to achieve playback, but it is a good starting point – it works!
GStreamer sets up “pipelines”, in which data is passed from element to element, transformed along the way. While it looks like quite a lot at first glance, it is very logical once you have figured it out.
- We specify a UDP source (udpsrc), the port, and the “caps”
- Without the RTP caps, playback is not possible. RTP itself does not carry the codec parameters; ffmpeg prints them as an SDP description on the sending side, but udpsrc does not read that, so we have to specify the caps manually.
- In the caps we tell the pipeline what kind of stream to expect (RTP, payload type 96, H.264)
- queue adds a small buffer and a thread boundary between receiving and decoding; it can probably be omitted
- rtph264depay – extracts (depayloads) the H.264 data from the RTP stream
- h264parse – parses the H.264 bitstream
- omxh264dec – decodes the data with Broadcom OpenMAX hardware acceleration
- autovideosink – puts the result on the display
- sync=True – I am not sure whether this does anything, or whether it is in the right place and form. It was an attempt to fix the gst_base_sink_is_too_late problems (but it did NOT fix them); a variant worth trying is sketched below.
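An untested variant of the receiver pipeline: a rtpjitterbuffer in front of the depayloader, slightly fuller caps and sync=false on the sink may smooth out network jitter and silence the “too late” warnings; the 200 ms latency value is an arbitrary assumption:
gst-launch-1.0 -v udpsrc port=1234 caps='application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264,payload=(int)96' ! rtpjitterbuffer latency=200 ! rtph264depay ! h264parse ! omxh264dec ! autovideosink sync=false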
Issues
Slow screen updates
These are very likely caused by a slow screen capture refresh rate; a different screen capturer may do better.
On Windows 8, with a pretty powerful Core i7 machine, the filter reports a possible 15.41 fps (while 30 fps were negotiated). This is using Roger’s / betterlogic’s screen-capture-recorder. Roger claims this is due to Aero.
See more about it here and here (also providing a list of other available DirectShow screen capture filters).
Artifacts
GStreamer shows massive H.264 artifacts – Matthias Bock has opened an issue for this, with some further hints.
This seems to be related to the bitrate set in ffmpeg – if I lower it to ~400k, the artifacts become less severe and image quality is quite OK. Also, use a variable bitrate instead of a constant one; a way to keep it bounded is sketched below.
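One option, untested in this setup, is to constrain x264’s rate control with -maxrate and -bufsize so the bitrate stays variable but bounded (the 600k / 1200k values are assumptions, roughly 1.5x and 3x the target):
.\ffmpeg -f dshow -i video="screen-capture-recorder" -vcodec libx264 -vprofile baseline -preset ultrafast -tune zerolatency -pix_fmt yuv420p -b:v 400k -maxrate 600k -bufsize 1200k -r 30 -threads 4 -fflags nobuffer -f rtp rtp://192.168.1.14:1234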
gst_base_sink_is_too_late()
This may be related to the Pi’s fake hardware clock (?). It also appears when running GStreamer with a simple test image setup:
gst-launch-1.0 videotestsrc ! autovideosink
gstbasesink.c(2683): gst_base_sink_is_too_late (): /GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstEglGlesSink:autovideosink0-actual-sink-eglgles:
There may be a timestamping problem, or this computer is too slow.
The command above simply displays a test video image, yet still produces the warning.
Sound
I have not tried sound yet. Sound should be fed into ffmpeg using the following arguments:
-i audio="virtual-audio-capturer":video="screen-capture-recorder"
This is taken directly from Roger’s GitHub documentation.
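A rough, untested sketch of a combined audio/video command would mux both streams into an MPEG-TS carried over RTP (-f rtp_mpegts), since plain -f rtp carries only a single stream per session; the audio codec and bitrate here are assumptions, and older ffmpeg builds may require -strict experimental for the built-in AAC encoder. The GStreamer receiver would then need a tsdemux stage instead of the plain RTP depayloader.
.\ffmpeg -f dshow -i audio="virtual-audio-capturer":video="screen-capture-recorder" -vcodec libx264 -vprofile baseline -preset ultrafast -tune zerolatency -pix_fmt yuv420p -b:v 400k -r 30 -acodec aac -strict experimental -b:a 128k -f rtp_mpegts rtp://192.168.1.14:1234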
Ideas
- Try to use GStreamer on Windows for streaming? (see the sketch after this list)
- Adjust parameters for betterlogic’s / Roger’s DirectShow capturer
- apparently it hits the ceiling at 15 fps with Aero on
- Use a different DirectShow capturer
- Tune the quality of the ffmpeg stream
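For the first idea, here is a rough, untested sketch of a pure GStreamer sender on Windows; gdiscreencapsrc comes from gst-plugins-bad, and its availability in your Windows GStreamer build is an assumption:
gst-launch-1.0 gdiscreencapsrc ! videoconvert ! x264enc tune=zerolatency speed-preset=ultrafast bitrate=400 ! rtph264pay config-interval=1 pt=96 ! udpsink host=192.168.1.14 port=1234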
Background info
- H.264, also known as MPEG-4 Part 10 or MPEG-4 AVC (“Advanced Video Coding”), is the more modern and data-efficient codec format;
- whereas MPEG-4 Part 2 (MPEG-4 Visual) is based on the older image compression techniques used in MPEG-2 and is also implemented in DivX, Xvid, etc.
- You can also use .\ffplay -i udp://:1234 to test the streaming output on the local machine (change the target in the ffmpeg command accordingly; “localhost” instead of the Raspi’s IP will do, I believe). The video quality IS NOT TO BE USED AS A REFERENCE; it just shows that it “works”.
References
- List of DirectShow Screen Capture Filters
- How to use VLC to stream – link kept for historical purposes; the command line gives better control than VLC’s graphical interface.
- Unreal Screen Capture DirectShow source filter – may give a better frame rate than Roger’s filter.
- ffmpeg Streaming Guide – gives an idea of how ffmpeg is set up for streaming, including some hints about latency
- How to grab the desktop with FFmpeg – includes information on how to do this under Linux, too.
- General latency optimisation suggestions for streaming
- Nerdlogger’s article on streaming the Windows desktop using ffmpeg
- H.264 @ Wikipedia – background about the baseline profile
- MPEG-4 @ Wikipedia
- Matthias Bock’s gstreamer GitHub repository
- GStreamer 1.0 for Raspbian
- gst-launch man page (for the 0.10 version, might not be 100% applicable)
- GStreamer Cheat Sheet
- RTP vs TS white paper – RTP is a better choice for MPEG-4 AVC/AAC data compared to MPEG-2 TS