diff --git a/README.md b/README.md index f640f91..62b1a18 100644 --- a/README.md +++ b/README.md @@ -1,18 +1,20 @@ # GStreamer command-line cheat sheet This series of docs provides a cheat sheet for GStreamer on the command-line. -A few Python examples are also included for when you need GStreamer to be dynamic (i.e. react to user or some other action). + +A few Python examples are also [included](python_examples/) for when you need GStreamer to be dynamic (i.e. react to user or some other action). ## Contents -* [Test streams](test_streams.md) * [Basics](basics.md) +* [Test streams](test_streams.md) * [RTMP](rtmp.md) * [Mixing video & audio](mixing.md) * [Images](images.md) +* [Queues](queues.md) * [Capturing images](capturing_images.md) * [Sending to multiple destinations (tee)](tee.md) -* [Sending/receiving video from shared memory](memory_transfer.md) +* [Sharing and splitting pipelines, including sending/receiving video from shared memory](sharing_and_splitting_pipelines.md) * [Network transfer](network_transfer.md) (including how to send so that VLC can preview) ## Sources and references @@ -38,14 +40,18 @@ If you want to interact with GStreamer after it's started (e.g. respond to an ev ### Python with GStreamer -Good GStreamer Python resources include: +Python is an easy language, so it's no surprise that it's a good way to develop using GStreamer. + +Some example scripts can be found in the [python_examples/](python_examples/) directory.
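To give a flavour of what those scripts do, here is a minimal sketch (assuming PyGObject/GStreamer are installed; the helper name `make_pipeline_desc` is purely illustrative). It builds a gst-launch-style description string and hands it to `Gst.parse_launch`, which accepts the same syntax as the command-line examples on this site:

```python
def make_pipeline_desc(width=320, height=240):
    # Build a gst-launch-style pipeline description string.
    caps = f"video/x-raw, width={width}, height={height}"
    return f"videotestsrc is-live=true ! {caps} ! autovideosink"


if __name__ == "__main__":
    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst, GLib

    Gst.init(None)
    # parse_launch turns the description into a runnable pipeline.
    pipeline = Gst.parse_launch(make_pipeline_desc())
    pipeline.set_state(Gst.State.PLAYING)
    try:
        GLib.MainLoop().run()
    finally:
        pipeline.set_state(Gst.State.NULL)
```

Once the pipeline object exists you can react to events (bus messages, user input) at runtime, which is the whole point of dropping down to Python.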
+ +Other good GStreamer Python resources that I've found: * [Getting started with GStreamer with Python](https://www.jonobacon.com/2006/08/28/getting-started-with-gstreamer-with-python/) * [Python GStreamer Tutorial](http://brettviren.github.io/pygst-tutorial-org/pygst-tutorial.html) * [Function reference](http://lazka.github.io/pgi-docs/#Gst-1.0) * [Nice example script](https://github.com/rabits/rstream/blob/master/rstream.py) -### C++ with GStreamer +### C/C++ with GStreamer My favourite reference is [Valadoc](https://valadoc.org/gstreamer-1.0/index.htm) diff --git a/html_examples/tcp-receive.html b/html_examples/tcp-receive.html index c46c64e..d5e7481 100644 --- a/html_examples/tcp-receive.html +++ b/html_examples/tcp-receive.html @@ -8,7 +8,7 @@

Demo TCP video playback

I've only managed to get this working on Firefox, not Chrome or Safari.

diff --git a/memory_transfer.md b/memory_transfer.md deleted file mode 100644 index c79cd70..0000000 --- a/memory_transfer.md +++ /dev/null @@ -1,37 +0,0 @@ -# Capturing images (GStreamer command-line cheat sheet) - -The [`shmsink`](https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-bad/html/gst-plugins-bad-plugins-shmsink.html) element allows you to write video into shared memory, from which another gstreamer application can read it with [`shmsrc`](https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-bad/html/gst-plugins-bad-plugins-shmsrc.html). - -### Putting a stream into memory - -Put a test video source into memory: - -``` -gst-launch-1.0 -v videotestsrc ! \ - 'video/x-raw, format=(string)I420, width=(int)320, height=(int)240, framerate=(fraction)30/1' ! \ - queue ! identity ! \ - shmsink wait-for-connection=1 socket-path=/tmp/tmpsock shm-size=20000000 sync=true -``` - -Another example, this time from a file rather than test source, and keeping the audio local: - -``` -gst-launch-1.0 filesrc location=$SRC ! \ - qtdemux name=demux demux.audio_0 ! queue ! decodebin ! audioconvert ! audioresample ! \ - autoaudiosink \ - demux.video_0 ! queue ! \ - decodebin ! videoconvert ! videoscale ! videorate ! \ - 'video/x-raw, format=(string)I420, width=(int)320, height=(int)240, framerate=(fraction)30/1' ! \ - queue ! identity ! \ - shmsink wait-for-connection=0 socket-path=/tmp/tmpsock shm-size=20000000 sync=true -``` - -### Reading a stream from memory - -This will display the video of a stream locally: - -``` -gst-launch-1.0 shmsrc socket-path=/tmp/tmpsock ! \ - 'video/x-raw, format=(string)I420, width=(int)320, height=(int)240, framerate=(fraction)30/1' ! \ - autovideosink -```` diff --git a/mixing.md b/mixing.md index 53caf5a..3b64944 100644 --- a/mixing.md +++ b/mixing.md @@ -29,7 +29,7 @@ gst-launch-1.0 \ mix. ``` -Put a box around the in-picture using ‘videobox’ e.g. +Put a box around the in-picture using `videobox` e.g. 
``` gst-launch-1.0 \ diff --git a/network_transfer.md b/network_transfer.md index 10a86fa..65ede6b 100644 --- a/network_transfer.md +++ b/network_transfer.md @@ -83,6 +83,10 @@ gst-launch-1.0 \ rtpmp2tdepay ! decodebin name=decoder ! autoaudiosink decoder. ! autovideosink ``` +### How to receive with VLC + +To receive a UDP stream, an `sdp` file is required. An example can be found at https://gist.github.com/nebgnahz/26a60cd28f671a8b7f522e80e75a9aa5 + ## TCP ### Audio via TCP diff --git a/queues.md b/queues.md new file mode 100644 index 0000000..aa39597 --- /dev/null +++ b/queues.md @@ -0,0 +1,63 @@ +# Queues + +A `queue` can appear almost anywhere in a GStreamer pipeline. Like most elements, it has an input (sink) and output (src). It has two uses: + +* As a thread boundary - i.e. the elements after a queue will run in a different thread to those before it. +* As a buffer, for when different parts of the pipeline may move at different speeds. + +Queues can generally be added anywhere in a pipeline. For example, a test stream: + +``` +gst-launch-1.0 videotestsrc ! autovideosink +``` + +This works just as well with a queue in the middle: + +``` +gst-launch-1.0 videotestsrc ! queue ! autovideosink +``` + +(If you [count the number of threads on the process](https://stackoverflow.com/questions/28047653/osx-how-can-i-see-the-tid-of-all-threads-from-my-process), you will see that this second example, with a queue, has one more.) + +Queues add latency, so the general advice is not to add them unless you need them. + +## Queue2 + +Confusingly, `queue2` is not a replacement for `queue`. It's not obvious when to use one or the other. + +Most of the time, `queue2` appears to replace `queue` without issue. For example: + +``` +# Same as above, but with queue2 instead of queue: +gst-launch-1.0 videotestsrc ! queue2 !
autovideosink +``` + +According to the [GStreamer tutorial](https://gstreamer.freedesktop.org/documentation/tutorials/basic/handy-elements.html), _as a rule of thumb, prefer queue2 over queue whenever network buffering is a concern to you._ + + +## Multiqueue + +The `multiqueue` can provide multiple queues. If, for example, you have a video and an audio queue, it can handle them both, and do a better job of allowing one to grow if the other is delayed. + +As a simple (pointless) example, it can be used to replace `queue` or `queue2`: + +``` +# Same as above, but with multiqueue instead of queue/queue2: +gst-launch-1.0 videotestsrc ! multiqueue ! autovideosink +``` + +A more realistic example is where there are two queues, such as here, for video and audio: + +``` +gst-launch-1.0 \ + videotestsrc ! queue ! autovideosink \ + audiotestsrc ! queue ! autoaudiosink +``` + +The two queues could be replaced with one multiqueue. Naming it (in this case, `q`) allows it to be referenced later. + +``` +gst-launch-1.0 \ + videotestsrc ! multiqueue name=q ! autovideosink \ + audiotestsrc ! q. q. ! autoaudiosink +``` diff --git a/rtmp.md b/rtmp.md index eca79ad..c2ae7d5 100644 --- a/rtmp.md +++ b/rtmp.md @@ -116,13 +116,23 @@ gst-launch-1.0 \ ### Sending a test stream to an RTMP server -This will send a test video source: +This will send a video test source: ``` gst-launch-1.0 videotestsrc is-live=true ! \ queue ! x264enc ! flvmux name=muxer ! rtmpsink location="$RTMP_DEST live=1" ``` +This will send an audio test source (note that `flvmux` is still required even though there is no muxing of audio & video): + +``` +gst-launch-1.0 audiotestsrc is-live=true ! \ + audioconvert ! audioresample ! audio/x-raw,rate=48000 ! \ + voaacenc bitrate=96000 ! audio/mpeg ! aacparse ! audio/mpeg, mpegversion=4 ! \ + flvmux name=mux ! \ + rtmpsink location=$RTMP_DEST +``` + This sends both video and audio as a test source: ``` @@ -163,7 +173,7 @@ gst-launch-1.0 filesrc location=$SRC !
\ --- -Can we work out why a bad RTMP brings down the other mix? +TODO - Can we work out why a bad RTMP brings down the other mix? ``` export QUEUE="queue max-size-time=0 max-size-bytes=0 max-size-buffers=0" @@ -181,3 +191,7 @@ gst-launch-1.0 \ videoscale ! video/x-raw,width=320,height=180! \ mix. ``` + +## Misc: latency + +There's a comment about reducing latency at https://lists.freedesktop.org/archives/gstreamer-devel/2018-June/068076.html diff --git a/sharing_and_splitting_pipelines.md b/sharing_and_splitting_pipelines.md new file mode 100644 index 0000000..062e9c2 --- /dev/null +++ b/sharing_and_splitting_pipelines.md @@ -0,0 +1,100 @@ +# Sharing and splitting pipelines (GStreamer command-line cheat sheet) + +There are various reasons why you might want your video (or audio) to leave the pipeline, such as: + +* To enter a separate application, such as [Snowmix](http://snowmix.sourceforge.net/) +* To use multiple processes (perhaps for security reasons) +* To split into multiple pipelines, so that a failure in one part does not alter another +* To split into multiple pipelines, so that you can 'seek' (jump to a certain point) in one without affecting another + +To quote from http://blog.nirbheek.in/2018/02/decoupling-gstreamer-pipelines.html: + +> In some applications, you want even greater decoupling of parts of your pipeline. +> For instance, if you're reading data from the network, you don't want a network error +> to bring down our entire pipeline, or if you're working with a hotpluggable device, +> device removal should be recoverable without needing to restart the pipeline. + +There are many elements that can achieve this, each with their own pros and cons. 
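Because several of these methods can only be wired up in code rather than on the command line, here is a hedged Python sketch of one of them, the gstproxy approach covered below (assumes GStreamer 1.14+ with PyGObject; the helper name `proxy_pair_descs` is illustrative only):

```python
def proxy_pair_descs():
    # Two independent pipeline descriptions. The proxysink/proxysrc pair
    # bridges them once connected in code below.
    sender = "audiotestsrc is-live=true ! proxysink name=psink"
    receiver = "proxysrc name=psrc ! autoaudiosink"
    return sender, receiver


if __name__ == "__main__":
    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst, GLib

    Gst.init(None)
    send_desc, recv_desc = proxy_pair_descs()
    sender = Gst.parse_launch(send_desc)
    receiver = Gst.parse_launch(recv_desc)
    # The wiring: the receiver's proxysrc references the sender's proxysink.
    receiver.get_by_name("psrc").set_property(
        "proxysink", sender.get_by_name("psink"))
    receiver.set_state(Gst.State.PLAYING)
    sender.set_state(Gst.State.PLAYING)
    GLib.MainLoop().run()
```

If either pipeline errors, the other can keep running or be restarted independently, which is the decoupling the blog post above describes.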
+ +## Summary of methods to share and split pipelines + +_As with the rest of this site, this is a rough guide, and is probably not complete or accurate!_ + +Name | Description | Points to note | Further reading +---- | ----------- | -------------- | --------------- +*shmsink and shmsrc* | Allows video to be read/written from shared memory | Used to send/receive from Snowmix | See below +*appsrc/appsink* | Allows video data to leave/enter the pipeline from your own application | n/a | https://thiblahute.github.io/GStreamer-doc/app-1.0/index.html?gi-language=c +*fdsrc/fdsink* | Allows communication via a file descriptor | n/a | https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer-plugins/html/gstreamer-plugins-fdsrc.html +*interpipe* | Allows simple communication between two or more independent pipelines. Very powerful. | Not part of GStreamer (though it is open-source... I'm not sure why it's not been included) | Well-documented at https://developer.ridgerun.com/wiki/index.php?title=GstInterpipe <br> https://gstreamer.freedesktop.org/data/events/gstreamer-conference/2015/Melissa%20Montero%20-%20GST%20Daemon%20and%20Interpipes:%20A%20simpler%20way%20to%20get%20your%20applications%20done%20.pdf +*inter* (intervideosink, etc) | Send/receive AV between two pipelines in the same process | Only supports raw audio or video, and drops events and queries at the boundary (source: [Nirbheek's blog](http://blog.nirbheek.in/2018/02/decoupling-gstreamer-pipelines.html)) | https://thiblahute.github.io/GStreamer-doc/inter-1.0/index.html?gi-language=c <br> Pros and cons discussed here: http://gstreamer-devel.966125.n4.nabble.com/How-to-connect-intervideosink-and-intervideosrc-for-IPC-pipelines-td4684567.html +*ipcpipeline* | Allows communication between pipelines *in different processes* | Arrived with GStreamer 1.14 (Spring 2018) | https://www.collabora.com/news-and-blog/blog/2017/11/17/ipcpipeline-splitting-a-gstreamer-pipeline-into-multiple-processes/ +*gstproxy (proxysink and proxysrc)* | Send/receive AV between two pipelines in the same process | Arrived with GStreamer 1.14 (Spring 2018) | See below <br> Introduced by the blog mentioned above (http://blog.nirbheek.in/2018/02/decoupling-gstreamer-pipelines.html) <br> Example code on proxysrc here: https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-bad-plugins/html/gst-plugins-bad-plugins-proxysrc.html <br> Equivalent proxysink: https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-bad-plugins/html/gst-plugins-bad-plugins-proxysink.html + + +## Sharing via memory - shmsink and shmsrc + +The [`shmsink`](https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-bad/html/gst-plugins-bad-plugins-shmsink.html) element allows you to write video into shared memory, from which another GStreamer application can read it with [`shmsrc`](https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-bad/html/gst-plugins-bad-plugins-shmsrc.html). + +### Putting a stream into memory + +Put a test video source into memory: + +``` +gst-launch-1.0 -v videotestsrc ! \ + 'video/x-raw, format=(string)I420, width=(int)320, height=(int)240, framerate=(fraction)30/1' ! \ + queue ! identity ! \ + shmsink wait-for-connection=1 socket-path=/tmp/tmpsock shm-size=20000000 sync=true +``` + +Another example, this time from a file rather than test source, and keeping the audio local: + +``` +gst-launch-1.0 filesrc location=$SRC ! \ + qtdemux name=demux demux.audio_0 ! queue ! decodebin ! audioconvert ! audioresample ! \ + autoaudiosink \ + demux.video_0 ! queue ! \ + decodebin ! videoconvert ! videoscale ! videorate ! \ + 'video/x-raw, format=(string)I420, width=(int)320, height=(int)240, framerate=(fraction)30/1' ! \ + queue ! identity !
\ + shmsink wait-for-connection=0 socket-path=/tmp/tmpsock shm-size=20000000 sync=true +``` + +### Reading a stream from memory + +This will display the video of a stream locally: + +``` +gst-launch-1.0 shmsrc socket-path=/tmp/tmpsock ! \ + 'video/x-raw, format=(string)I420, width=(int)320, height=(int)240, framerate=(fraction)30/1' ! \ + autovideosink +``` + +## gstproxy (proxysink and proxysrc) + +I've used *proxysink* and *proxysrc* to split large pipelines into smaller ones. That way, if one part fails, the rest can continue. + +It's not possible to use them via the command-line, because you connect them by having the receiver (`proxysrc`) reference the sender (`proxysink`). + +A very simple example would be: + +``` +1st pipeline: audiotestsrc is-live=1 ! proxysink +2nd pipeline: proxysrc ! autoaudiosink +``` + +This would achieve the same as `audiotestsrc ! autoaudiosink`, but in two pipelines. + +A Python example of this can be found at [python_examples/gstproxy_01_audiotestsrc.py](python_examples/gstproxy_01_audiotestsrc.py). + +A slightly more interesting example can be found at [python_examples/gstproxy_02_playbin.py](python_examples/gstproxy_02_playbin.py). This plays a video file (e.g. mp4). It shows: + +1. that `proxysink` can work with [`playbin`](https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-base-plugins/html/gst-plugins-base-plugins-playbin.html) +2. separate proxies for audio and video +3. _TO BE CONFIRMED_ that when the video ends, the other pipelines continue. diff --git a/tee.md b/tee.md index 5285683..bb5aa0b 100644 --- a/tee.md +++ b/tee.md @@ -1,6 +1,6 @@ # Multiple outputs (tee) -The `tee` command allows audio & video streams to be sent to more than one place. +This page describes the `tee` element, which allows audio & video streams to be sent to more than one place.
## Tee to two local video outputs diff --git a/writing_to_files.md b/writing_to_files.md index e69de29..5a9a8bf 100644 --- a/writing_to_files.md +++ b/writing_to_files.md @@ -0,0 +1,7 @@ +gst-launch-1.0 -e videotestsrc ! 'video/x-raw, framerate=25/1, width=640, height=360' ! x264enc ! \ + mpegtsmux ! filesink location=test.ts + +gst-launch-1.0 -e videotestsrc ! \ + x264enc ! \ + mpegtsmux ! \ + filesink location=test.ts