Whole bunch of changes, including memory and queues

Matthew Clark 2018-06-08 21:35:46 +01:00
parent bfdbb547d7
commit 4dc80889ad
10 changed files with 204 additions and 47 deletions

README.md

@@ -1,18 +1,20 @@
# GStreamer command-line cheat sheet
This series of docs provides a cheat sheet for GStreamer on the command-line.
- A few Python examples are also included for when you need GStreamer to be dynamic (i.e. react to user or some other action).
+ A few Python examples are also [included](python_examples/) for when you need GStreamer to be dynamic (i.e. react to user or some other action).
## Contents
- * [Test streams](test_streams.md)
* [Basics](basics.md)
+ * [Test streams](test_streams.md)
* [RTMP](rtmp.md)
* [Mixing video & audio](mixing.md)
* [Images](images.md)
* [Queues](queues.md)
* [Capturing images](capturing_images.md)
* [Sending to multiple destinations (tee)](tee.md)
- * [Sending/receiving video from shared memory](memory_transfer.md)
+ * [Sharing and receiving pipelines, including sending/receiving video from shared memory](sharing_and_splitting_pipelines.md)
* [Network transfer](network_transfer.md) (including how to send so that VLC can preview)
## Sources and references
@@ -38,14 +40,18 @@ If you want to interact with GStreamer after it's started (e.g. respond to an ev
### Python with GStreamer
- Good GStreamer Python resources include:
+ Python is an easy language, so it's no surprise that it's a good way to develop using GStreamer.
+ Some example scripts can be found in the [python_examples/](python_examples/) directory.
+ Other good GStreamer Python resources that I've found:
* [Getting started with GStreamer with Python](https://www.jonobacon.com/2006/08/28/getting-started-with-gstreamer-with-python/)
* [Python GStreamer Tutorial](http://brettviren.github.io/pygst-tutorial-org/pygst-tutorial.html)
* [Function reference](http://lazka.github.io/pgi-docs/#Gst-1.0)
* [Nice example script](https://github.com/rabits/rstream/blob/master/rstream.py)
- ### C++ with GStreamer
+ ### C/C++ with GStreamer
My favourite reference is [Valadoc](https://valadoc.org/gstreamer-1.0/index.htm)


@@ -8,7 +8,7 @@
<h1>Demo TCP video playback</h1>
<h2>I've only managed to get this working on Firefox, not Chrome or Safari.</h2>
<video width=640 height=360 autoplay style="border: 1px solid green" controls>
<source src="http://localhost:9090/">
<source src="tcp://127.0.0.1:9090/">
</video>
</body>
</html>
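As a rough sketch, a sender that this page could connect to might look like the following (an assumption on my part, not from this commit: Ogg/Theora is one format Firefox will play from a progressive stream, and `tcpserversink` serves it on the port used above):
```
gst-launch-1.0 videotestsrc is-live=true ! theoraenc ! oggmux ! \
    tcpserversink host=127.0.0.1 port=9090
```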

memory_transfer.md

@@ -1,37 +0,0 @@
# Capturing images (GStreamer command-line cheat sheet)
The [`shmsink`](https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-bad/html/gst-plugins-bad-plugins-shmsink.html) element allows you to write video into shared memory, from which another gstreamer application can read it with [`shmsrc`](https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-bad/html/gst-plugins-bad-plugins-shmsrc.html).
### Putting a stream into memory
Put a test video source into memory:
```
gst-launch-1.0 -v videotestsrc ! \
'video/x-raw, format=(string)I420, width=(int)320, height=(int)240, framerate=(fraction)30/1' ! \
queue ! identity ! \
shmsink wait-for-connection=1 socket-path=/tmp/tmpsock shm-size=20000000 sync=true
```
Another example, this time from a file rather than test source, and keeping the audio local:
```
gst-launch-1.0 filesrc location=$SRC ! \
qtdemux name=demux demux.audio_0 ! queue ! decodebin ! audioconvert ! audioresample ! \
autoaudiosink \
demux.video_0 ! queue ! \
decodebin ! videoconvert ! videoscale ! videorate ! \
'video/x-raw, format=(string)I420, width=(int)320, height=(int)240, framerate=(fraction)30/1' ! \
queue ! identity ! \
shmsink wait-for-connection=0 socket-path=/tmp/tmpsock shm-size=20000000 sync=true
```
### Reading a stream from memory
This will display the video of a stream locally:
```
gst-launch-1.0 shmsrc socket-path=/tmp/tmpsock ! \
'video/x-raw, format=(string)I420, width=(int)320, height=(int)240, framerate=(fraction)30/1' ! \
autovideosink
```

mixing.md

@@ -29,7 +29,7 @@ gst-launch-1.0 \
mix.
```
- Put a box around the in-picture using videobox e.g.
+ Put a box around the in-picture using `videobox` e.g.
```
gst-launch-1.0 \

network_transfer.md

@@ -83,6 +83,10 @@ gst-launch-1.0 \
rtpmp2tdepay ! decodebin name=decoder ! autoaudiosink decoder. ! autovideosink
```
### How to receive with VLC
To receive a UDP stream, an `sdp` file is required. An example can be found at https://gist.github.com/nebgnahz/26a60cd28f671a8b7f522e80e75a9aa5
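For reference, a minimal `sdp` file might look like the sketch below. This assumes the sender is using `rtpmp2tpay` (MPEG-TS as RTP payload type 33, as in the receiving pipeline above) on port 5000 of the local machine; adjust the address, port and payload type to match your sender:
```
v=0
o=- 0 0 IN IP4 127.0.0.1
s=GStreamer MPEG-TS stream
c=IN IP4 127.0.0.1
t=0 0
m=video 5000 RTP/AVP 33
a=rtpmap:33 MP2T/90000
```
Save this as e.g. `stream.sdp` and open it with VLC.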
## TCP
### Audio via TCP

queues.md Normal file

@@ -0,0 +1,63 @@
# Queues
A `queue` can appear almost anywhere in a GStreamer pipeline. Like most elements, it has an input (sink) and output (src). It has two uses:
* As a thread boundary - i.e. the elements after a queue will run in a different thread to those before it
* As a buffer, for when different parts of the pipeline may move at different speeds.
Queues can generally be added anywhere in a pipeline. For example, a test stream:
```
gst-launch-1.0 videotestsrc ! autovideosink
```
This works just as well with a queue in the middle:
```
gst-launch-1.0 videotestsrc ! queue ! autovideosink
```
(If you [count the number of threads on the process](https://stackoverflow.com/questions/28047653/osx-how-can-i-see-the-tid-of-all-threads-from-my-process), you will see that this second example, with a queue, has one more.)
Queues add latency, so the general advice is not to add them unless you need them.
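How much a `queue` can hold (and therefore how much latency it can add) is controlled by its `max-size-buffers`, `max-size-bytes` and `max-size-time` properties; the defaults are 200 buffers, 10MB and 1 second, and setting a limit to 0 disables it. A sketch with the time limit raised to an illustrative 3 seconds (`max-size-time` is in nanoseconds):
```
gst-launch-1.0 videotestsrc ! \
    queue max-size-buffers=0 max-size-bytes=0 max-size-time=3000000000 ! \
    autovideosink
```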
## Queue2
Confusingly, `queue2` is not a replacement for `queue`. It's not obvious when to use one or the other.
Most of the time, `queue2` appears to replace `queue` without issue. For example:
```
# Same as above, but with queue2 instead of queue:
gst-launch-1.0 videotestsrc ! queue2 ! autovideosink
```
According to the [GStreamer tutorial](https://gstreamer.freedesktop.org/documentation/tutorials/basic/handy-elements.html), _as a rule of thumb, prefer queue2 over queue whenever network buffering is a concern to you._
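For example, when playing a file over HTTP, `queue2` can buffer the download. A sketch (the URL is a placeholder; `souphttpsrc` fetches over HTTP, and `use-buffering` asks `queue2` to post buffering messages):
```
gst-launch-1.0 souphttpsrc location=http://example.com/video.mp4 ! \
    queue2 use-buffering=true ! decodebin name=d \
    d. ! videoconvert ! autovideosink \
    d. ! audioconvert ! autoaudiosink
```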
## Multiqueue
The `multiqueue` can provide multiple queues. If, for example, you have a video and an audio queue, it can handle them both, and do a better job of allowing one to grow if the other is delayed.
As a simple (pointless) example, it can be used to replace `queue` or `queue2`:
```
# Same as above, but with multiqueue instead of queue/queue2:
gst-launch-1.0 videotestsrc ! multiqueue ! autovideosink
```
A more realistic example is where there are two queues, such as here, for video and audio:
```
gst-launch-1.0 \
videotestsrc ! queue ! autovideosink \
audiotestsrc ! queue ! autoaudiosink
```
The two queues could be replaced with one multiqueue. Naming it (in this case, `q`) allows it to be referenced later.
```
gst-launch-1.0 \
videotestsrc ! multiqueue name=q ! autovideosink \
audiotestsrc ! q. q. ! autoaudiosink
```

rtmp.md

@@ -116,13 +116,23 @@ gst-launch-1.0 \
### Sending a test stream to an RTMP server
- This will send a test video source:
+ This will send a video test source:
```
gst-launch-1.0 videotestsrc is-live=true ! \
queue ! x264enc ! flvmux name=muxer ! rtmpsink location="$RTMP_DEST live=1"
```
This will send an audio test source (note that `flvmux` is still required even though there is no muxing of audio & video):
```
gst-launch-1.0 audiotestsrc is-live=true ! \
audioconvert ! audioresample ! audio/x-raw,rate=48000 ! \
voaacenc bitrate=96000 ! audio/mpeg ! aacparse ! audio/mpeg, mpegversion=4 ! \
flvmux name=mux ! \
rtmpsink location=$RTMP_DEST
```
This sends both video and audio as a test source:
```
@@ -163,7 +173,7 @@ gst-launch-1.0 filesrc location=$SRC ! \
---
- Can we work out why a bad RTMP brings down the other mix?
+ TODO - Can we work out why a bad RTMP brings down the other mix?
```
export QUEUE="queue max-size-time=0 max-size-bytes=0 max-size-buffers=0"
@@ -181,3 +191,7 @@ gst-launch-1.0 \
videoscale ! video/x-raw,width=320,height=180! \
mix.
```
## Misc: latency
There's a comment about reducing latency at https://lists.freedesktop.org/archives/gstreamer-devel/2018-June/068076.html

sharing_and_splitting_pipelines.md Normal file

@@ -0,0 +1,100 @@
# Sharing and splitting pipelines (GStreamer command-line cheat sheet)
There are various reasons why you might want your video (or audio) to leave the pipeline, such as:
* To enter a separate application, such as [Snowmix](http://snowmix.sourceforge.net/)
* To use multiple processes (perhaps for security reasons)
* To split into multiple pipelines, so that a failure in one part does not alter another
* To split into multiple pipelines, so that you can 'seek' (jump to a certain point) in one without affecting another
To quote from http://blog.nirbheek.in/2018/02/decoupling-gstreamer-pipelines.html:
> In some applications, you want even greater decoupling of parts of your pipeline.
> For instance, if you're reading data from the network, you don't want a network error
> to bring down our entire pipeline, or if you're working with a hotpluggable device,
> device removal should be recoverable without needing to restart the pipeline.
There are many elements that can achieve this, each with their own pros and cons.
## Summary of methods to share and split pipelines
_As with the rest of this site, this is a rough guide, and is probably not complete or accurate!_
Name | Description | Points to note | Further reading
---- | ----------- | -------------- | ---------------
*shmsink and shmsrc* | Allows video to be read/written from shared memory | Used to send/receive from Snowmix | See below
*appsrc/appsink* | Allows video data to leave/enter the pipeline from your own application | n/a | https://thiblahute.github.io/GStreamer-doc/app-1.0/index.html?gi-language=c
*fdsrc/fdsink* | Allows communication via a file descriptor | n/a | https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer-plugins/html/gstreamer-plugins-fdsrc.html
*interpipe* | Allows simple communication between two or more independent pipelines. Very powerful. | Not part of GStreamer (though it is open-source... I'm not sure why it's not been included) | Well-documented at https://developer.ridgerun.com/wiki/index.php?title=GstInterpipe <br> https://gstreamer.freedesktop.org/data/events/gstreamer-conference/2015/Melissa%20Montero%20-%20GST%20Daemon%20and%20Interpipes:%20A%20simpler%20way%20to%20get%20your%20applications%20done%20.pdf
*inter* (intervideosink, etc) | Send/receive AV between two pipelines in the same process | Only support raw audio or video, and drop events and queries at the boundary (source: [Nirbheek's blog](http://blog.nirbheek.in/2018/02/decoupling-gstreamer-pipelines.html)) | https://thiblahute.github.io/GStreamer-doc/inter-1.0/index.html?gi-language=c <br> Pros and cons discussed here: http://gstreamer-devel.966125.n4.nabble.com/How-to-connect-intervideosink-and-intervideosrc-for-IPC-pipelines-td4684567.html
*ipcpipeline* | Allows communication between pipelines *in different processes*. | Arrived with GStreamer 1.14 (Spring 2018) | https://www.collabora.com/news-and-blog/blog/2017/11/17/ipcpipeline-splitting-a-gstreamer-pipeline-into-multiple-processes/
*gstproxy (proxysink and proxysrc)* | Send/receive AV between two pipelines in the same process. | Arrived with GStreamer 1.14 (Spring 2018) | See below <br> Introduced by the blog mentioned above (http://blog.nirbheek.in/2018/02/decoupling-gstreamer-pipelines.html) <br> Example code on proxysrc here: https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-bad-plugins/html/gst-plugins-bad-plugins-proxysrc.html <br> Equivalent proxysink: https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-bad-plugins/html/gst-plugins-bad-plugins-proxysink.html
## Sharing via memory - shmsink and shmsrc
The [`shmsink`](https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-bad/html/gst-plugins-bad-plugins-shmsink.html) element allows you to write video into shared memory, from which another gstreamer application can read it with [`shmsrc`](https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-bad/html/gst-plugins-bad-plugins-shmsrc.html).
### Putting a stream into memory
Put a test video source into memory:
```
gst-launch-1.0 -v videotestsrc ! \
'video/x-raw, format=(string)I420, width=(int)320, height=(int)240, framerate=(fraction)30/1' ! \
queue ! identity ! \
shmsink wait-for-connection=1 socket-path=/tmp/tmpsock shm-size=20000000 sync=true
```
Another example, this time from a file rather than test source, and keeping the audio local:
```
gst-launch-1.0 filesrc location=$SRC ! \
qtdemux name=demux demux.audio_0 ! queue ! decodebin ! audioconvert ! audioresample ! \
autoaudiosink \
demux.video_0 ! queue ! \
decodebin ! videoconvert ! videoscale ! videorate ! \
'video/x-raw, format=(string)I420, width=(int)320, height=(int)240, framerate=(fraction)30/1' ! \
queue ! identity ! \
shmsink wait-for-connection=0 socket-path=/tmp/tmpsock shm-size=20000000 sync=true
```
### Reading a stream from memory
This will display the video of a stream locally:
```
gst-launch-1.0 shmsrc socket-path=/tmp/tmpsock ! \
'video/x-raw, format=(string)I420, width=(int)320, height=(int)240, framerate=(fraction)30/1' ! \
autovideosink
```
## gstproxy (proxysink and proxysrc)
I've used *proxysink* and *proxysrc* to split large pipelines into smaller ones. That way, if a part fails, the rest can continue.
It's not possible to use them via the command-line, because you connect them by having the receiver (`proxysrc`) reference the sender (`proxysink`).
A very simple example would be:
```
1st pipeline: audiotestsrc is-live=1 ! proxysink
2nd pipeline: proxysrc ! autoaudiosink
```
This would achieve the same as `audiotestsrc ! autoaudiosink`, but in two pipelines.
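A minimal sketch of that wiring in Python (assuming PyGObject and GStreamer 1.14+; variable and element names like `psink` are illustrative):
```
#!/usr/bin/env python3
import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst, GLib

Gst.init(None)

# First pipeline: the sender ends in a proxysink
send = Gst.parse_launch('audiotestsrc is-live=1 ! proxysink name=psink')

# Second pipeline: the receiver starts with a proxysrc
recv = Gst.parse_launch('proxysrc name=psrc ! autoaudiosink')

# Connect them: the receiver's proxysrc references the sender's proxysink
recv.get_by_name('psrc').set_property('proxysink', send.get_by_name('psink'))

# Start the receiver before the sender, so it is ready for data
recv.set_state(Gst.State.PLAYING)
send.set_state(Gst.State.PLAYING)

GLib.MainLoop().run()
```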
A Python example of this can be found at [python_examples/gstproxy_01_audiotestsrc.py](python_examples/gstproxy_01_audiotestsrc.py).
A slightly more interesting example can be found at
[python_examples/gstproxy_02_playbin.py](python_examples/gstproxy_02_playbin.py). This plays a video file (e.g. mp4). It shows:
* that `proxysink` can work with [`playbin`](https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-base-plugins/html/gst-plugins-base-plugins-playbin.html)
* separate proxies for audio and video
* _TO BE CONFIRMED_ that when the video ends, the other pipelines continue.

tee.md

@@ -1,6 +1,6 @@
# Multiple outputs (tee)
- The `tee` command allows audio & video streams to be sent to more than one place.
+ This page describes the `tee` element, which allows audio & video streams to be sent to more than one place.
## Tee to two local video outputs


@@ -0,0 +1,7 @@
# Save a test video stream to an MPEG-TS file, forcing size and framerate with a caps filter
# (video/x-raw is the GStreamer 1.0 caps name; video/x-raw-yuv was 0.10 syntax)
gst-launch-1.0 -e videotestsrc ! video/x-raw, framerate=25/1, width=640, height=360 ! x264enc ! \
    mpegtsmux ! filesink location=test.ts

# The same, letting the elements negotiate the caps themselves
gst-launch-1.0 -e videotestsrc ! \
    x264enc ! \
    mpegtsmux ! \
    filesink location=test.ts