ts-jitterbuffer: set jbuf delay when instantiating it

The internal (C) jitterbuffer needs to know the configured
latency when calculating a PTS; otherwise it may consider
the packet too late, trigger a resync, and cause the element
to discard the packet altogether.

I could not identify when this was broken, but the net effect was
that in the current state, ts-jitterbuffer was discarding up to
half of all the incoming packets.
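To illustrate the failure mode, here is a minimal, self-contained sketch (the type and method names below are illustrative, not the element's real API): a jitterbuffer schedules each packet at its PTS plus the configured delay, and treats a packet whose scheduled time has already passed as too late. If the delay is left at its default of 0, ordinary network jitter is enough to make packets look late and get dropped.

```rust
// Illustrative model of a jitterbuffer's "too late" decision.
// All names here are hypothetical; only the mechanism mirrors the commit.
struct JitterBuffer {
    delay_ns: u64, // configured latency, in nanoseconds
}

impl JitterBuffer {
    fn new() -> Self {
        // Default-constructed state: no delay configured yet.
        JitterBuffer { delay_ns: 0 }
    }

    fn set_delay(&mut self, delay_ns: u64) {
        self.delay_ns = delay_ns;
    }

    // The time at which the packet should leave the buffer.
    fn scheduled_pts(&self, packet_pts_ns: u64) -> u64 {
        packet_pts_ns + self.delay_ns
    }

    // A packet is considered too late if its scheduled PTS has already passed.
    fn is_too_late(&self, packet_pts_ns: u64, now_ns: u64) -> bool {
        self.scheduled_pts(packet_pts_ns) < now_ns
    }
}

fn main() {
    let mut jb = JitterBuffer::new();

    // Packet nominally due at t=100ms, arriving at t=110ms (10ms of jitter).
    let pts = 100_000_000;
    let now = 110_000_000;

    // With the delay left at 0, the packet is judged late and dropped.
    assert!(jb.is_too_late(pts, now));

    // Configuring the delay when the buffer is (re)initialized,
    // as the commit does, absorbs the jitter.
    jb.set_delay(200_000_000); // 200 ms latency
    assert!(!jb.is_too_late(pts, now));

    println!("ok");
}
```

This is why the fix below sets the delay on the freshly constructed `State` before installing it, rather than installing a default-constructed state whose internal delay is still zero.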
Mathieu Duponchelle 2022-05-11 01:42:10 +02:00 committed by Sebastian Dröge
parent 71877934b5
commit 11a1bbbe69


@@ -1080,7 +1080,12 @@ impl TaskImpl for JitterBufferTask {
         self.sink_pad_handler.clear();
         let jb = self.element.imp();
-        *jb.state.lock().unwrap() = State::default();
+        let latency = jb.settings.lock().unwrap().latency;
+        let state = State::default();
+        state.jbuf.borrow().set_delay(latency);
+        *jb.state.lock().unwrap() = state;
         gst_log!(CAT, obj: &self.element, "Task started");
         Ok(())