ts-jitterbuffer: set jbuf delay when instantiating it

The internal (C) jitterbuffer needs to know about the configured
latency when calculating a packet's PTS, as it may otherwise consider
the packet too late, trigger a resync and cause the element to
discard the packet altogether.
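
To make the failure mode concrete, here is a minimal, self-contained
sketch of the timing check involved. Only set_delay() mirrors the call
in the diff below; Jbuf, is_on_time() and the concrete numbers are
hypothetical stand-ins, not the element's real API:

    // Minimal illustrative sketch, not the element's actual code.
    use std::time::Duration;

    struct Jbuf {
        delay: Duration, // buffering budget; stays zero unless set_delay() is called
    }

    impl Jbuf {
        fn set_delay(&mut self, delay: Duration) {
            self.delay = delay;
        }

        // A packet counts as on time if its nominal time plus the allowed
        // buffering delay has not already passed on the running clock.
        fn is_on_time(&self, packet_time: Duration, clock_now: Duration) -> bool {
            packet_time + self.delay >= clock_now
        }
    }

    fn main() {
        let now = Duration::from_millis(250);
        let packet_time = Duration::from_millis(200); // arrived 50 ms behind the clock

        let zero_delay = Jbuf { delay: Duration::ZERO };
        let mut configured = Jbuf { delay: Duration::ZERO };
        configured.set_delay(Duration::from_millis(200)); // the configured latency

        // With a zero delay the packet is judged too late (resync + drop);
        // with the configured latency it still fits in the buffering window.
        assert!(!zero_delay.is_on_time(packet_time, now));
        assert!(configured.is_on_time(packet_time, now));
    }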

I could not identify when this was broken, but the net effect was
that in the current state, ts-jitterbuffer was discarding up to
half of all the incoming packets.
Author:    Mathieu Duponchelle
Date:      2022-05-11 01:42:10 +02:00
Committer: Sebastian Dröge
Parent:    05ece5560e
Commit:    943a138d49

@@ -1074,7 +1074,12 @@ impl TaskImpl for JitterBufferTask {
         self.sink_pad_handler.clear();
         let jb = self.element.imp();
-        *jb.state.lock().unwrap() = State::default();
+        let latency = jb.settings.lock().unwrap().latency;
+        let state = State::default();
+        state.jbuf.set_delay(latency);
+        *jb.state.lock().unwrap() = state;
         gst::log!(CAT, obj: &self.element, "Task started");
         Ok(())
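
Note that the delay is applied while the fresh State is being built,
before it is stored behind the state lock, which is what the commit
title refers to: the jbuf knows the configured latency from the moment
it is instantiated, instead of starting each task run with its default
(zero) delay.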