Diagnosing Frame Sync Issues?


Diagnosing Frame Sync Issues?

Stirling Westrup
This may, or may not, be a GStreamer issue. I've written a video-wall application that splits a playing video into pieces and shows the pieces on zero-clients connected to the host computer via USB or Ethernet cable.

This basic functionality is working, but when playing large (i.e. 4K resolution) videos, we sometimes see momentary differences between when different monitors get updated with the next frame of video.

Now, I can easily imagine a number of places that a video frame may be delayed so as to show after its neighbor:

1) Inside the GStreamer pipeline.
2) Inside the network stack.
3) Due to USB bandwidth issues.

So far the only code in GStreamer that tries to maintain sync is that in the queue that handles each output monitor. The assumption made was that that would be sufficient. I could (with a lot of rewriting) replace all N output queues with a single N-way multiqueue, but I'm unsure that such a move is actually necessary.

Does anyone have any suggestions of ways or techniques to narrow down the cause of the frame sync issue before I go rewriting great chunks of code? I would hate to do all that just to find out it was a USB driver issue all along...
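One approach I've considered is to timestamp each output branch just before the frame is handed to its sink (e.g. from a pad probe) and compare per-frame skew offline. A quick sketch of the analysis side; all names and numbers here are invented:

```python
# Hypothetical diagnostic: collect (frame_number, monitor_id, wall_clock)
# triples in each output branch just before the frame reaches the sink,
# then find the worst inter-monitor skew per frame.

from collections import defaultdict

def worst_skew(log):
    """log: iterable of (frame, monitor, t_seconds).
    Returns {frame: max display-time spread across monitors}."""
    per_frame = defaultdict(list)
    for frame, monitor, t in log:
        per_frame[frame].append(t)
    return {frame: max(ts) - min(ts) for frame, ts in per_frame.items()}

# Example: frame 1 hits monitor B 40 ms after monitor A.
log = [(0, "A", 10.000), (0, "B", 10.002),
       (1, "A", 10.033), (1, "B", 10.073)]
skew = worst_skew(log)
```

If the skew is already large at the probe point, the problem is upstream in the pipeline; if frames leave the queues in sync but appear on screen staggered, that would point at the USB/network path instead.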


_______________________________________________
gstreamer-devel mailing list
[hidden email]
http://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

Re: Diagnosing Frame Sync Issues?

Chuck Crisler-2
How many USB connections are you using? While you *COULD* use a multi-threaded decoder, that would probably only increase your frame-sync problem without giving any significant benefit (though decoding a 4K-resolution frame would be quicker). If you forced a single-threaded decode and multi-threaded display, that might give you some insight into where the problem is. It is a neat problem though!



Re: Diagnosing Frame Sync Issues?

Nicolas Dufresne
In reply to this post by Stirling Westrup
On Monday, 13 May 2013 at 15:13 -0400, Stirling Westrup wrote:
> So far the only code in GStreamer that tries to maintain sync is that in the queue that handles each output monitor. The assumption made was that that would be sufficient. I could (with a lot of rewriting) replace all N output queues with a single N-way multiqueue, but I'm unsure that such a move is actually necessary.

This is an interesting case. I'm totally new to zero-clients, but what type of video sink element is being used? How do the zero-client displays report their latency? In the optimal situation, you would have a video sink that is aware of the screen latency, with that latency large enough to absorb a certain amount of jitter caused by the host machine running under high load. If you want to keep latency low, the answer might lie in using real-time scheduling for the threads that send frames to the displays. In any case, increasing the buffer sizes after the decoders is worth testing.

regards,
Nicolas

Re: Diagnosing Frame Sync Issues?

Edward Hervey
In reply to this post by Stirling Westrup
Hi,

On Mon, 2013-05-13 at 15:13 -0400, Stirling Westrup wrote:

> This may, or may not, be a GStreamer issue. I've written a video-wall
> application that splits a playing video into pieces and shows the
> pieces on zero-clients connected to the host computer via USB or
> Ethernet cable.
>
>
> This basic functionality is working, but when playing large (ie 4K
> resolution) videos, we sometimes see momentary differences between
> when different monitors get updated with the next frame of video.
>
>
> Now, I can easily imagine a number of places that a video frame may be
> delayed so as to show after its neighbor:
>
>
> 1) Inside the GStreamer pipeline.
>
> 2) Inside the network stack
>
> 3) due to USB bandwidth issues.
>
>
> So far the only code in GStreamer that tries to maintain sync is that
> in the queue that handles each output monitor. The assumption made was
> that that would be sufficient. I could (with a lot of rewriting)
> replace all N output queues with a single N-way multiqueue, but I'm
> unsure that such a move is actually necessary.

  That's not quite the reality.
  1) The queue's job is to dedicate a thread to the rendering process
(i.e. the thread's sole task is to pick the next buffer available from
the queue (if any) and push it to the sink). Switching to multiqueue
won't change anything.
  2) The synchronization (i.e. waiting for the render time) is done in
the sink.

  Based on what you mentioned, you have N different sinks for the
various outputs. I'll go under that assumption.

  The issue is that the delay between the moment GStreamer (and the
GstBaseSink base class) says "yes, this buffer time corresponds to this
clock time, render it at this clock time" and the moment it actually
gets displayed to the user is:
  1) not immediate
  2) not the same for every sink

  In order to get perfect sync between your displays (and also with the
audio, which you don't mention), the sinks need to know the latency
introduced in the render process.

  The current synchronization model in video sinks (and all other sink
element that use the standard GstBaseSink synchronization model) is
"wait-then-render".
  When a buffer comes in, GstBaseSink will figure out a target clock
running time, wait for that moment and then call your sink's "render"
method (which does the actual display/output/...).
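  Roughly, in runnable Python-flavoured pseudo-code (grossly simplified; the names below are illustrative, not the real GstBaseSink API, and real GStreamer also handles segment rates, QoS, latency queries, etc.):

```python
# Sketch of GstBaseSink's "wait-then-render" model, using a fake clock
# so the timing can be checked. Times are in nanoseconds.

class FakeClock:
    def __init__(self):
        self.now = 0
    def wait_until(self, t):
        # block until the requested clock time (here: just jump forward)
        self.now = max(self.now, t)

def running_time(pts, segment_start):
    # running_time for a simple segment with rate == 1.0
    return pts - segment_start

def wait_then_render(clock, pts, segment_start, base_time, render):
    # the buffer's target clock time: pipeline base_time + running_time
    deadline = base_time + running_time(pts, segment_start)
    clock.wait_until(deadline)   # "wait"
    render(pts)                  # "then render": the sink's render method

rendered = []
clk = FakeClock()
wait_then_render(clk, pts=40_000_000, segment_start=0,
                 base_time=1_000_000_000, render=rendered.append)
```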

  And this is where it gets tricky...

  In order to compensate for that render delay in the wait-then-render
model, your sink needs to report that to the base class (render_delay
property), so that it will subtract that value from the target clock
running_time and call your render method that much earlier...
  ... which will only work if:
   1) you can efficiently calculate that delay
   2) it is constant (some jitter might be acceptable)

  If you can't calculate that delay and if it's not roughly constant...
you're out of luck with the current model.
  You could try to do some empirical testing and come up with some
"better" values ("on average it's 100 ms over USB and 50 ms over
Ethernet"), but that will only mitigate the problem, not solve it.
This might be acceptable for your use-case though, so you might want to
try it.
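  The mitigation described above amounts to subtracting a measured, per-transport average from the wait deadline; a minimal sketch, where the 100 ms / 50 ms figures are the illustrative guesses from the text, not measurements:

```python
# Sketch of render-delay compensation: the sink reports an estimated
# output latency, and the wait deadline is moved earlier by that amount
# (this is what GstBaseSink does with its render-delay setting).
# Per-transport averages below are placeholder values, in nanoseconds.

RENDER_DELAY_NS = {
    "usb":      100_000_000,   # ~100 ms, illustrative only
    "ethernet":  50_000_000,   # ~50 ms, illustrative only
}

def compensated_deadline(target_ns, transport):
    # render() is called this much earlier so the frame lands on the
    # glass at approximately target_ns
    return target_ns - RENDER_DELAY_NS[transport]

d = compensated_deadline(2_000_000_000, "usb")
```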

  If you want to get "perfect" synchronization, you need to:
  1) Have systems which can report to you at what time a certain frame
was displayed (so you can internally calculate the remote system latency
and display rate).
  2) Have systems which can ask you for the frame to be displayed at a
certain target remote time.
  3) Switch to a synchronization model which is more like how audio
sinks work in GStreamer (they are pull-based with an internal
ring-buffer and decide which frame to give the audio subsystem at a
given time).
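  A sketch of that pull-style selection: at its own refresh tick, the display side asks for the frame whose timestamp best matches "now plus the remote latency". Everything here is invented for illustration, not an actual GStreamer interface:

```python
# Pull-model sketch: the display's clock callback picks, from a small
# ring of decoded frames, the one that should be on the glass at
# now + remote_latency. Times in nanoseconds.

def pick_frame(ring, now_ns, remote_latency_ns):
    """ring: list of (pts_ns, frame) sorted by pts.
    Returns the frame closest to the target presentation time."""
    target = now_ns + remote_latency_ns
    pts, frame = min(ring, key=lambda entry: abs(entry[0] - target))
    return frame

# three decoded frames at 40 ms spacing; display asks 30 ms in,
# with a 15 ms remote latency -> target 45 ms -> second frame
ring = [(0, "f0"), (40_000_000, "f1"), (80_000_000, "f2")]
f = pick_frame(ring, now_ns=30_000_000, remote_latency_ns=15_000_000)
```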

  Note that Barco does such multi-screen displays (for control centers
and so forth), but they have solved the problem by having "smart"
displays which do the decoding/synchronization/display. The data is fed
to the screens via RTP and all displays have got a synchronized clock.
Doing it that way essentially moves the whole latency/synchronization
issue to the displays.
  More info in this talk:
http://gstconf.ubicast.tv/videos/displaying-sychronized-video-2/

  Hope this helps,

  Edward

P.S. Note that "perfect" is very subjective, especially when it comes to
audio/video synchronization. Even if your audio and video are
synchronized at output, if your speakers are 5 meters away from where
you sit, the sound will reach your ears roughly 15 ms late... Joy :)




Re: Diagnosing Frame Sync Issues?

Ian Davidson

On 14/05/2013 08:37, Edward Hervey wrote:

> The current synchronization model in video sinks (and all other sink
> element that use the standard GstBaseSink synchronization model) is
> "wait-then-render". When a buffer comes in, GstBaseSink will figure
> out a target clock running time, wait for that moment and then call
> your sink's "render" method (which does the actual
> display/output/...). And this is where it gets tricky.
Just a thought.  Does the time taken to render depend on what is being
rendered?  For example, if one 'nth' of the total picture happened to be
'clear blue sky', it would (possibly) be quicker to render than another
'nth' which contained lots of detail.  Since both pieces start to be
rendered at about the same time (remembering that the computer only has
so many CPUs actually available), that could allow the 'easy' pieces to
be displayed before the 'complicated' bits - and sometimes that might be
noticeable.

I could be totally wrong.
--
Ian Davidson


Re: Diagnosing Frame Sync Issues?

Edward Hervey
Hi,

On Tue, 2013-05-14 at 09:35 +0100, Ian Davidson wrote:

> On 14/05/2013 08:37, Edward Hervey wrote:
> > The current synchronization model in video sinks (and all other sink
> > element that use the standard GstBaseSink synchronization model) is
> > "wait-then-render". When a buffer comes in, GstBaseSink will figure
> > out a target clock running time, wait for that moment and then call
> > your sink's "render" method (which does the actual
> > display/output/...). And this is where it gets tricky.
> Just a thought.  Does the time taken to render depend on what is being
> rendered?  For example, if one 'nth' of the total picture happened to be
> 'clear blue sky', it would (possibly) be quicker to render than another
> 'nth' which contained lots of detail.  Since both pieces start to be
> rendered at about the same time (remembering that the computer only has
> so many CPUs actually available), that could allow the 'easy' pieces to
> be displayed before the 'complicated' bits - and sometimes that might be
> noticeable.

  When I talk about render, I only mean the display side of things, not
the decoding part.

  The fact that some frames take longer than others to decode
(before display) is smoothed out by having a small queue between decoder
and sink. So as long as you can decode N frames (on average) in N/fps
seconds, the queues will absorb the variation. And since each decoder
works in a separate thread, the kernel scheduler should also take care
of load-balancing the work being done by multiple decoders.
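  That smoothing effect can be checked with a toy simulation (all numbers invented): a queue holding a few already-decoded frames lets the sink survive a decode-time spike, provided the *average* decode time stays under the frame period.

```python
# Toy model: a sequential decoder feeding a sink through a queue of
# `depth` decoded frames. A frame misses if it isn't decoded by its
# display time, (i + 1) * period, plus the slack the queue provides.

def missed_deadlines(decode_times, period, depth):
    finished = 0.0                    # wall time each decode completes
    misses = 0
    for i, dt in enumerate(decode_times):
        finished += dt                # decoder works on frames in order
        deadline = (i + 1 + depth) * period
        if finished > deadline:
            misses += 1
    return misses

# average ~30 ms decode vs 33.3 ms period, with one 60 ms spike
frame_times = [0.030, 0.060, 0.020, 0.030, 0.030]
with_queue = missed_deadlines(frame_times, period=0.0333, depth=2)
no_queue = missed_deadlines(frame_times, period=0.0333, depth=0)
```

With two frames of queueing the spike is fully absorbed; with no queue, the decoder never catches back up in this short run.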

>
> I could be totally wrong.



Re: Diagnosing Frame Sync Issues?

Stirling Westrup
In reply to this post by Edward Hervey
On Tue, May 14, 2013 at 3:37 AM, Edward Hervey <[hidden email]> wrote:

Thanks hugely for your response. I very much appreciate the time you took to explain the inner model of the sync process.
 

On Mon, 2013-05-13 at 15:13 -0400, Stirling Westrup wrote:
> This may, or may not, be a GStreamer issue. I've written a video-wall
> application that splits a playing video into pieces and shows the
> pieces on zero-clients connected to the host computer via USB or
> Ethernet cable.
>
>
> This basic functionality is working, but when playing large (ie 4K
> resolution) videos, we sometimes see momentary differences between
> when different monitors get updated with the next frame of video.
>
>
> Now, I can easily imagine a number of places that a video frame may be
> delayed so as to show after its neighbor:
>
>
> 1) Inside the GStreamer pipeline.
>
> 2) Inside the network stack
>
> 3) due to USB bandwidth issues.
>
>
> So far the only code in GStreamer that tries to maintain sync is that
> in the queue that handles each output monitor. The assumption made was
> that that would be sufficient. I could (with a lot of rewriting)
> replace all N output queues with a single N-way multiqueue, but I'm
> unsure that such a move is actually necessary.

  That's not quite the reality.
  1) The queue's job is to dedicate a thread to the rendering process
(i.e. the thread's sole task is to pick the next buffer available from
the queue (if any) and push it to the sink). Switching to multiqueue
won't change anything.
  2) The synchronization (i.e. waiting for the render time) is done in
the sink.

Ah. Thanks. I had not really understood that, despite reading both the main manual and the element writing manual.
 

  Based on what you mentioned, you have N different sinks for the
various outputs. I'll go under that assumption.

Correct.
 

  The issue is that the delay between the moment GStreamer (and the
GstBaseSink base class) says "yes, this buffer time corresponds to this
clock time, render it at this clock time" and the moment it actually
gets displayed to the user is:
  1) not immediate
  2) not the same for every sink

  In order to get perfect sync between your displays (and also with the
audio, which you don't mention), the sinks need to know the latency
introduced in the render process.

  The current synchronization model in video sinks (and all other sink
element that use the standard GstBaseSink synchronization model) is
"wait-then-render".
  When a buffer comes in, GstBaseSink will figure out a target clock
running time, wait for that moment and then call your sink's "render"
method (which does the actual display/output/...).

  And this is where it gets tricky...

  In order to compensate for that render delay in the wait-then-render
model, your sink needs to report that to the base class (render_delay
property), so that it will subtract that value from the target clock
running_time and call your render method that much earlier...
  ... which will only work if:
   1) you can efficiently calculate that delay
   2) it is constant (some jitter might be acceptable)

  If you can't calculate that delay and if it's not roughly constant...
you're out of luck with the current model.
  You could try to do some empirical testing and come up with some
"better" values ("on average it's 100 ms over USB and 50 ms over
Ethernet") but it will only mitigate the problem, it won't solve it.
This might be acceptable though for your use-case, you might want to try
that.

This sounds like a likely approach. I've been looking around at our competition, and the current 'default' synchronization that GStreamer achieves is already better than many commercial products, so any improvement we can achieve would be good. We don't need 'perfect'. (And you're right, it probably can't be achieved with this hardware, but we are going for a 'good, cheap video-wall', so...)

We haven't worried much about sound synchronization because many clients won't have sound turned on, and those that do will have systems that introduce delays (like surround sound systems), so we'll probably need a hand-tuning parameter for sound sync in any particular space.

I will try adding some extra time to the 'render-delay' parameter in xvimagesink and see what I can accomplish, but that does make me curious: what is the downside to being over-aggressive in setting that parameter? I.e., if I set it to 3 seconds, what is the downside, other than delaying the start of video playback by 3 seconds?

I ask because we can't detect whether the client has connected their zero-clients via USB or via Ethernet, and so can't automatically apply different compensation values for the delay. Thus we're likely to set a default worst-case delay and let the client tune it with some sort of config parameter.





--
Stirling Westrup
Programmer, Entrepreneur.
https://www.linkedin.com/e/fpf/77228
http://www.linkedin.com/in/swestrup
http://technaut.livejournal.com
http://sourceforge.net/users/stirlingwestrup


Re: Diagnosing Frame Sync Issues?

Stirling Westrup
In reply to this post by Ian Davidson
On Tue, May 14, 2013 at 4:35 AM, Ian Davidson <[hidden email]> wrote:
Just a thought.  Does the time taken to render depend on what is being rendered?  For example, if one 'nth' of the total picture happened to be 'clear blue sky', it would (possibly) be quicker to render than another 'nth' which contained lots of detail.  Since both pieces start to be rendered at about the same time (remembering that the computer only has so many CPUs actually available), that could allow the 'easy' pieces to be displayed before the 'complicated' bits - and sometimes that might be noticeable.

Alas, this *is* a possible scenario. I know that the X drivers for the zero clients use a lossy video encoding to compress output frames for transmission over the wire, and the target devices need to decode that and then display the results. Obviously a simple blue sky is likely to be easier and faster to encode/decode than a complex scene, but I don't see what I could do about this that wouldn't be prohibitively expensive.

That said, evidence so far is that this is NOT the ultimate cause of our issue as we often see the frame sync issue between two scenes of approximately the same complexity. Then again, that might just be because one frame of blue sky looks pretty much like another, so one wouldn't be as likely to notice frame jitter there.
