Updated RTMP and SRT

Matthew Clark 2022-08-26 21:59:12 +01:00
parent 5a3ecebeb6
commit 4998db70e4
4 changed files with 135 additions and 21 deletions


@ -4,6 +4,8 @@ GStreamer can send and receive audio and video via a network socket, using eithe
*UDP* is faster but lossy - there is no attempt to resend lost network packets, so it will fail if the network is not perfect. *TCP* acknowledges every network packet so is slower, but more reliable.
See also [SRT](srt.md) as an alternative format.
## UDP
### Audio via UDP
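As an illustration of the UDP approach - a hedged sketch rather than this page's own example, assuming the Opus and RTP plugins are installed - audio could be sent like this:
```
gst-launch-1.0 -v audiotestsrc ! audioconvert ! audioresample ! opusenc ! rtpopuspay ! udpsink host=127.0.0.1 port=5000
```
and received like this:
```
gst-launch-1.0 udpsrc port=5000 caps="application/x-rtp, media=audio, encoding-name=OPUS, clock-rate=48000" ! rtpopusdepay ! opusdec ! audioconvert ! autoaudiosink
```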

rtmp.md

@ -1,22 +1,23 @@
# RTMP (GStreamer command-line cheat sheet)
GStreamer can receive an RTMP stream from an RTMP server. It can also send an RTMP stream to an RTMP server.
GStreamer can:
* retrieve an RTMP stream from an RTMP server, and
* also send an RTMP stream to an RTMP server (including YouTube).
If you need your own RTMP server, [the Nginx RTMP extension](https://github.com/arut/nginx-rtmp-module) works quite well. [Linode has a good NGINX RTMP installation guide.](https://www.linode.com/docs/guides/set-up-a-streaming-rtmp-server/)
### Play an RTMP stream
To play from RTMP server, playbin can be used (as with files, HLS streams, DASH streams, etc):
RTMP streams can be live or on-demand - playback is the same in both cases.
To play from RTMP server, [playbin](https://gstreamer.freedesktop.org/documentation/playback/playbin.html) can be used. (Playbin is magical - it can also play files, HLS streams, DASH streams, and many other sources!) Example:
```
export RTMP_SRC="rtmp://matthewc.co.uk/vod/scooter.flv"
gst-launch-1.0 playbin uri=$RTMP_SRC
```
A test RTMP stream is available at `rtmp://184.72.239.149/vod/BigBuckBunny_115k.mov` which serves as a useful example:
```
gst-launch-1.0 playbin uri='rtmp://184.72.239.149/vod/BigBuckBunny_115k.mov'
```
A test RTMP VOD stream is available at `rtmp://matthewc.co.uk/vod/scooter.flv` which serves as a useful example:
Instead of using `playbin`, it's possible to get the video only with `uridecodebin`, and then show it with `autovideosink`:
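For example, a minimal sketch reusing the `RTMP_SRC` variable from above (only the video is handled here):
```
gst-launch-1.0 uridecodebin uri=$RTMP_SRC ! videoconvert ! autovideosink
```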
@ -112,7 +113,7 @@ gst-launch-1.0 \
mix.
```
## Sending to an RTMP server
## Sending a live stream to an RTMP server
The examples below use the `RTMP_DEST` environment variable. You can set it to reference your RTMP server, e.g.
@ -136,9 +137,9 @@ rtmp {
}
```
then the application name is `livestream`, and so your URL will be `rtmp://<your-domain>/livestream/<stream-name>`.
then the application name is `livestream`, and so your URL will be `rtmp://<your-domain>/livestream/<stream-name>` (where `<stream-name>` can be anything).
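For example, with a hypothetical domain and stream name:
```
export RTMP_DEST="rtmp://example.com/livestream/mystream"
```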
### Sending a test stream to an RTMP server
### Sending a test live stream to an RTMP server
To send a video test source:
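A minimal sketch of one way to do this (H.264 into FLV, with `RTMP_DEST` set as above):
```
gst-launch-1.0 videotestsrc is-live=true ! \
    videoconvert ! x264enc tune=zerolatency ! h264parse ! \
    flvmux streamable=true ! rtmpsink location="$RTMP_DEST live=1"
```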
@ -168,6 +169,37 @@ gst-launch-1.0 videotestsrc is-live=true ! \
voaacenc bitrate=96000 ! audio/mpeg ! aacparse ! audio/mpeg, mpegversion=4 ! mux.
```
### Live streaming to YouTube via RTMP
YouTube accepts live RTMP streams. They must have both audio and video.
Set up a stream by visiting [YouTube.com](https://www.youtube.com/) on desktop, and selecting 'Create' from the top-right.
YouTube will provide a 'Stream URL' and a 'Stream key'. Combine these to create the full URL.
For example, if the URL is `rtmp://a.rtmp.youtube.com/live2` and the key is `abcd-1234-5678`, then:
```
export RTMP_DEST="rtmp://a.rtmp.youtube.com/live2/abcd-1234-5678"
```
Given the [YouTube stream suggestions](https://support.google.com/youtube/answer/2853702), here's a good test stream:
```
gst-launch-1.0 \
videotestsrc is-live=1 \
! videoconvert \
! "video/x-raw, width=1280, height=720, framerate=30/1" \
! queue \
! x264enc cabac=1 bframes=2 ref=1 \
! "video/x-h264,profile=main" \
! flvmux streamable=true name=mux \
! rtmpsink location="${RTMP_DEST} live=1" \
audiotestsrc is-live=1 wave=ticks \
! voaacenc bitrate=128000 \
! mux.
```
### Send a file over RTMP
Audio & video:
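A sketch of one possible pipeline, assuming the file contains H.264 video and AAC audio (the filename is hypothetical):
```
gst-launch-1.0 filesrc location=my-movie.mp4 ! qtdemux name=demux \
    demux.video_0 ! queue ! h264parse ! flvmux name=mux streamable=true ! \
    rtmpsink location="$RTMP_DEST live=1" \
    demux.audio_0 ! queue ! aacparse ! mux.
```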

srt.md

@ -1,26 +1,57 @@
# SRT (GStreamer command-line cheat sheet)
SRT is a means of sending AV between two servers.
SRT has a client-server relationship. One side must be the server, the other the client.
The server can either be the side that is sending the audio/video (pushing) or the side that is
receiving (pulling). The server must have started before the client connects.
NOTE: THIS IS FOR GSTREAMER 1.14; IT HAS CHANGED IN 1.16.
## Pre-requisites
If on MacOS, ensure the SRT library is installed:
```
brew install srt
```
At the time of writing, the default Homebrew build of `gst-plugins-bad` does not include SRT support, so build it from source:
```
brew install --build-from-source gst-plugins-bad
```
## Server sending the AV
Create a sender server like this:
```
gst-launch-1.0 videotestsrc ! video/x-raw, height=360, width=640 ! videoconvert ! x264enc tune=zerolatency ! video/x-h264, profile=high ! mpegtsmux ! srtserversink uri=srt://127.0.0.1:8888/
```
And create a receiver client like this:
```
gst-launch-1.0 -v srtclientsrc uri="srt://127.0.0.1:8888" ! decodebin ! autovideosink
```
## Server receiving the AV
## Client receiving from server
The most common usage is that the server has the video, and a client reads from it.
To create a sender server, use `srtsink` (or alternatively, `srtserversink`):
```
gst-launch-1.0 -v videotestsrc ! video/x-raw, height=360, width=640 ! videoconvert ! x264enc tune=zerolatency ! video/x-h264, profile=high ! mpegtsmux ! srtsink uri=srt://:8888
```
And then receive with `srtsrc` (or alternatively, `srtclientsrc`):
```
gst-launch-1.0 -v srtsrc uri="srt://127.0.0.1:8888" ! decodebin ! autovideosink
```
Don't forget, the amazing `playbin` can show anything - including SRT streams. This is useful for debugging:
```
gst-launch-1.0 playbin uri=srt://127.0.0.1:8888
```
By default, `srtsink` will wait for a client connection before allowing the stream to start. If you'd prefer this not to happen, set `wait-for-connection=false`:
```
gst-launch-1.0 -v videotestsrc ! video/x-raw, height=360, width=640 ! \
videoconvert ! clockoverlay font-desc="Sans, 48" ! x264enc tune=zerolatency ! \
video/x-h264, profile=high ! mpegtsmux ! srtsink uri=srt://:8888 wait-for-connection=false
```
## Client sending to server
The alternative way round is to have the producer sending the AV to the receiving server.
To have the server receiving, rather than sending, swap 'srtclientsrc' for 'srtserversrc'.
Likewise, to have the client sending rather than receiving, swap 'srtserversink' for 'srtclientsink'.
@ -28,11 +59,14 @@ Likewise, to have the client sending rather than receiving, swap 'srtserversink'
Create a receiver server like this:
```
gst-launch-1.0 -v srtserversrc uri="srt://127.0.0.1:8888" ! decodebin ! autovideosink
gst-launch-1.0 -v srtserversrc uri="srt://127.0.0.1:8889" ! decodebin ! autovideosink
```
And a sender client like this:
```
gst-launch-1.0 videotestsrc ! video/x-raw, height=360, width=640 ! videoconvert ! x264enc tune=zerolatency ! video/x-h264, profile=high ! mpegtsmux ! srtclientsink uri="srt://127.0.0.1:8888/"
```
```
gst-launch-1.0 -v videotestsrc ! video/x-raw, height=360, width=640 ! \
videoconvert ! clockoverlay font-desc="Sans, 48" ! x264enc tune=zerolatency ! \
video/x-h264, profile=high ! mpegtsmux ! srtclientsink uri=srt://127.0.0.1:8889
```

web_page_capture.md (new file)

@ -0,0 +1,46 @@
# Web page capture (WPE)
The [wpesrc](https://gstreamer.freedesktop.org/documentation/wpe/wpesrc.html?gi-language=python) plugin can take a web page, and offer it as a GStreamer source. This allows you to:
* Show web pages on screen, and
* Use web pages as a means of doing graphics.
The `wpesrc` plugin isn't frequently used. To see if you have it installed:
```
gst-inspect-1.0 | grep wpe
```
## Installing
MacOS: The `wpesrc` plugin isn't part of the Homebrew build, unfortunately.
Ubuntu: `wpesrc` is provided by a separate package, `gstreamer1.0-wpe`. To install:
```
sudo apt-get install gstreamer1.0-wpe
```
## Using
TODO
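For example, to render a page straight to a window - a minimal sketch, assuming a GL-capable sink such as `glimagesink` is available:
```
gst-launch-1.0 wpevideosrc location="https://www.bbc.co.uk" ! queue ! glimagesink
```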
### Streaming to SRT
```
LIBGL_ALWAYS_SOFTWARE=true gst-launch-1.0 -v wpevideosrc location="https://www.bbc.co.uk" ! videoconvert ! x264enc tune=zerolatency ! \
video/x-h264, profile=high ! mpegtsmux ! srtsink uri=srt://:8889 wait-for-connection=false
```
### Without GPU
Setting `LIBGL_ALWAYS_SOFTWARE=true` allows the `wpesrc` element to work without a GPU. This can lead to issues if the format is not set to `BGRA`. An example, sending a web page as a live stream to an RTMP server:
```
LIBGL_ALWAYS_SOFTWARE=true gst-launch-1.0 \
wpevideosrc location="https://en.wikipedia.org/wiki/Main_Page" \
! videoconvert ! videoscale ! videorate \
! "video/x-raw, format=BGRA, width=854, height=480, framerate=30/1" \
! videoconvert ! queue ! x264enc speed-preset=1 ! flvmux name=muxer \
! rtmpsink location="$RTMP_DEST live=1"
```