synchronized streaming question

Hi all,

I am working on a networked scenario in which one sender transmits audio to
multiple receivers, which then render the audio in sync, following the
description in this presentation:

I just need some clarification on how the synchronization actually works.
Will the audio buffers be reasonably synchronized when they leave the
rtpjitterbuffer? We have configured the total pipeline latency on all the
receivers to be larger than the minimum reported pipeline latency.
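For context, here is a rough sketch of the kind of receiver pipeline I mean
(this assumes Opus over RTP; the port, caps, and latency value are just
placeholders, not our exact configuration):

```shell
# Hypothetical receiver sketch (Opus over RTP assumed).
# rtpjitterbuffer's latency property is in milliseconds.
gst-launch-1.0 \
  udpsrc port=5000 \
    caps="application/x-rtp,media=audio,encoding-name=OPUS,clock-rate=48000,payload=96" ! \
  rtpjitterbuffer latency=200 ! \
  rtpopusdepay ! opusdec ! \
  audioconvert ! audioresample ! \
  alsasink
```

(In application code, rather than gst-launch, we force the total latency
with gst_pipeline_set_latency().)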

I know that the alsasink plays a part in the synchronization and will try to
play each buffer at the correct time. If I set sync=false on the alsasink,
buffers are played as soon as they reach the sink. Say, for instance, that I
have a total sink buffer of 40 ms; then the sink could (if sync=true) only
adjust within that amount of data. Is this correct?
Does this mean that the maximum deviation between receivers in the playback
of audio buffers, if sync=false, would equal the size of the alsa sink
buffer? I.e., between two receivers, the playback of a particular buffer
could be at most 40 ms apart in the example above?
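To make the two cases concrete, these are the sink configurations I am
comparing (the element chain before the sink is elided; buffer-time and
latency-time values are illustrative and given in microseconds, as the
alsasink properties expect):

```shell
# sync=true: the sink schedules each buffer against the pipeline clock;
# a 40 ms device buffer (buffer-time=40000) bounds how much data it holds.
gst-launch-1.0 ... ! alsasink sync=true buffer-time=40000 latency-time=10000

# sync=false: the same sink plays buffers on arrival, so playback timing
# depends only on when data reaches each receiver's sink.
gst-launch-1.0 ... ! alsasink sync=false buffer-time=40000
```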

Thankful for any information regarding this!


Sent from the gstreamer-devel mailing list.