diff --git a/README.md b/README.md
index 94d9a85..360cb17 100644
--- a/README.md
+++ b/README.md
@@ -11,6 +11,7 @@ A few Python examples are also included for when you need GStreamer to be dynami
 * [Mixing video & audio](mixing.md)
 * [Images](images.md)
 * [Capturing images](capturing_images.md)
+* [Sending to multiple destinations (tee)](tee.md)
 * [Sending/receiving video from shared memory](memory_transfer.md)
 * [Network transfer](network_transfer.md) (including how to send so that VLC can preview)
@@ -29,9 +30,10 @@ A few Python examples are also included for when you need GStreamer to be dynami
 ## Interacting with the GStreamer pipeline
 
-If you want to interact with GStreamer after it's started (e.g. respond to an event, or dynamically change a pipeline), the command-line GStreamer doesn't really cut it. Instead you have two options:
+If you want to interact with GStreamer after it's started (e.g. respond to an event, or dynamically change a pipeline), the command-line GStreamer doesn't really cut it. Instead, here are some options:
 
 * *[GStreamer Daemon (gstd)](https://github.com/RidgeRun/gstd-1.x)* - allows setting and updating via a TCP connection
+* *[Snowmix](http://snowmix.sourceforge.net/)* - an open-source live video mixer
 * *Develop using the GStreamer library*, in either [C](https://gstreamer.freedesktop.org/documentation/application-development/basics/helloworld.html), [Python](https://github.com/GStreamer/gst-python), or [C#/.NET](https://github.com/GStreamer/gstreamer-sharp)
 
 ### Python with GStreamer
diff --git a/images/pinwheel_and_ball.png b/images/pinwheel_and_ball.png
new file mode 100644
index 0000000..7328079
Binary files /dev/null and b/images/pinwheel_and_ball.png differ
diff --git a/tee.md b/tee.md
new file mode 100644
index 0000000..2b70515
--- /dev/null
+++ b/tee.md
@@ -0,0 +1,42 @@
+# Multiple outputs (tee)
+
+The `tee` element allows audio & video streams to be sent to more than one place.
+
+Here's a simple example that shows the video test source twice (using `autovideosink`):
+
+```
+# The two windows may overlap each other
+gst-launch-1.0 \
+    videotestsrc ! tee name=t \
+    t. ! queue ! videoconvert ! autovideosink \
+    t. ! queue ! videoconvert ! autovideosink
+```
+
+Here's an example that sends video to both `autovideosink` and a TCP server (`tcpserversink`).
+Note how `async=false` is required on both sinks.
+
+```
+gst-launch-1.0 videotestsrc ! \
+    decodebin ! tee name=t \
+    t. ! queue ! videoconvert ! autovideosink async=false \
+    t. ! queue ! x264enc ! mpegtsmux ! tcpserversink port=7001 host=127.0.0.1 recover-policy=keyframe sync-method=latest-keyframe async=false
+```
+
+However, as discussed [here](http://gstreamer-devel.966125.n4.nabble.com/tee-won-t-go-in-playing-state-td4680128.html), `async=false` can cause issues. Adding `tune=zerolatency` to the `x264enc` instead avoids the need for `async=false`:
+
+```
+gst-launch-1.0 videotestsrc ! \
+    decodebin ! tee name=t \
+    t. ! queue ! videoconvert ! autovideosink \
+    t. ! queue ! x264enc tune=zerolatency ! mpegtsmux ! tcpserversink port=7001 host=127.0.0.1 recover-policy=keyframe sync-method=latest-keyframe
+```
+
+You can also use `tee` to process a single input in multiple ways. This example combines two audio visualisations:
+
+```
+gst-launch-1.0 filesrc location=$SRC ! decodebin ! tee name=t ! \
+    queue ! audioconvert ! wavescope style=color-lines shade-amount=0x00080402 ! alpha alpha=0.5 ! \
+    videomixer name=m background=black ! videoconvert ! vertigotv ! autovideosink \
+    t. ! queue ! audioconvert ! spacescope style=color-lines shade-amount=0x00080402 ! alpha alpha=0.5 ! m. \
+    t. ! queue ! autoaudiosink
+```
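+
+`tee` is also useful for recording while monitoring. Here's a sketch (the output filename `out.mkv` and the choice of `matroskamux` are just illustrative) that displays the test source while also writing it to a file. The `-e` flag makes `gst-launch-1.0` send EOS on Ctrl-C, so the muxer can finalise the file properly:
+
+```
+gst-launch-1.0 -e \
+    videotestsrc ! tee name=t \
+    t. ! queue ! videoconvert ! autovideosink \
+    t. ! queue ! videoconvert ! x264enc tune=zerolatency ! matroskamux ! filesink location=out.mkv
+```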