looking for clarity on pipeline latency


jim nualart
Hi All,

I'm hoping someone can chime in here with a little bit of explanation, as I think I have gotten myself somewhat confused on this topic.

I've read the following (from https://gstreamer.freedesktop.org/documentation/application-development/advanced/clocks.html#latency-compensation):
Before the pipeline goes to the PLAYING state, it will, in addition to selecting a clock and calculating a base-time, calculate the latency in the pipeline. It does this by doing a LATENCY query on all the sinks in the pipeline. The pipeline then selects the maximum latency in the pipeline and configures this with a LATENCY event.

So this is "automatic" and the application itself does not need to be involved at this point ... correct so far?

Now, what if I want to know what the latency value is (of what was configured automatically by the pipeline)? 
Can I use gst_pipeline_get_latency()?
Am I correct that this function would return the currently *configured* pipeline latency?
Or should I use a query with gst_query_new_latency()?
Though, I think this function would get me the current *actual* latency (vs what is configured), right?

Finally, if I want to change the latency value, is that done via gst_pipeline_set_latency() or gst_query_set_latency() ?

Much thanks in advance,
-jim

_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

Re: looking for clarity on pipeline latency

Nicolas Dufresne
Hi Jim,

See answers below.


On Tue, Sep 15, 2020, at 17:30, jim nualart <[hidden email]> wrote:

Now, what if I want to know what the latency value is (of what was configured automatically by the pipeline)? 
Can I use gst_pipeline_get_latency()?
Am I correct that this function would return the currently *configured* pipeline latency?
Or should I use a query with gst_query_new_latency()?
Though, I think this function would get me the current *actual* latency (vs what is configured), right?

The latency query will give you the reported latency. Running that query on different sink elements (the query is executed from sink to source) usually gives different values. By default, GstPipeline will select the maximum.

The applied latency is reported through an event. By default the same event is sent to all sinks, but special applications may use different values. This is why reading the applied latency is not so straightforward.
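To make the difference concrete, here is a minimal C sketch of reading both values. It assumes `pipeline` is an existing GstPipeline that is already PLAYING; `print_latencies` is just an illustrative helper name, not part of any API:

```c
#include <gst/gst.h>

/* Sketch: read the configured latency (what the pipeline distributed
 * with the LATENCY event, or what the app forced) and the latency
 * currently reported by the elements. */
static void
print_latencies (GstPipeline *pipeline)
{
  GstClockTime configured;
  GstQuery *query;
  gboolean live;
  GstClockTime min_latency, max_latency;

  /* The *configured* pipeline latency. */
  configured = gst_pipeline_get_latency (pipeline);
  g_print ("configured latency: %" GST_TIME_FORMAT "\n",
      GST_TIME_ARGS (configured));

  /* The *reported* latency, via a LATENCY query on the pipeline. */
  query = gst_query_new_latency ();
  if (gst_element_query (GST_ELEMENT (pipeline), query)) {
    gst_query_parse_latency (query, &live, &min_latency, &max_latency);
    g_print ("reported: live=%d min=%" GST_TIME_FORMAT
        " max=%" GST_TIME_FORMAT "\n",
        live, GST_TIME_ARGS (min_latency), GST_TIME_ARGS (max_latency));
  }
  gst_query_unref (query);
}
```

Note that querying the pipeline element gives you the aggregated answer; querying an individual sink gives that sink's branch only.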


Finally, if I want to change the latency value, is that done via gst_pipeline_set_latency() or gst_query_set_latency() ?

The first one forces the global latency to the specified value. It is indeed controlled by applications.

The second is used by elements when contributing to the latency query. This is a "plugin" API.
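A sketch of both sides of that split follows. The 200 ms and 10 ms values are made up for illustration, and `handle_latency_query` is a hypothetical pad query function, not a real GStreamer symbol:

```c
#include <gst/gst.h>

/* Application side: force the distributed global latency to a fixed
 * value (200 ms is only an example number). */
static void
force_latency (GstPipeline *pipeline)
{
  gst_pipeline_set_latency (pipeline, 200 * GST_MSECOND);
}

/* Element ("plugin") side: inside a src pad's query function, an
 * element that introduces processing delay contributes to the
 * LATENCY query roughly like this. */
static gboolean
handle_latency_query (GstPad *srcpad, GstObject *parent, GstQuery *query)
{
  gboolean live;
  GstClockTime min, max;

  /* Let the default handler forward the query upstream first. */
  if (!gst_pad_query_default (srcpad, parent, query))
    return FALSE;

  gst_query_parse_latency (query, &live, &min, &max);
  min += 10 * GST_MSECOND;            /* this element's delay */
  if (GST_CLOCK_TIME_IS_VALID (max))
    max += 10 * GST_MSECOND;
  gst_query_set_latency (query, live, min, max);
  return TRUE;
}
```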

You can also use the latency tracer to get a log of the per-element reported latency:


GST_TRACERS="latency(flags=reported)" GST_DEBUG=GST_TRACER:7 ...


Much thanks in advance,
-jim

Re: looking for clarity on pipeline latency

jim nualart
Hi Nicolas,
Thanks for the response. That does clear up some confusion. I think I am still having trouble separating when something is a "plugin" API vs an application level API.

Say I have a pipeline A ! B ! tee name=mytee ! C ! D1 mytee. ! D2

1. As the pipeline goes to the PLAYING state, the pipeline itself will calculate and apply the appropriate latency.

2. If, say, I dynamically remove C, or, if it makes more sense, say I change a property on C (and let's assume I do this properly), that *might* result in a GST_MESSAGE_LATENCY message on the pipeline bus, correct?

3. At which point, the application could/should check to see if it (the application) needs to distribute a new latency value. The quote is "The application should recalculate and distribute a new latency." So my question is on how the application should recalculate the new latency. Should the application use gst_query_new_latency() on the *pipeline* itself (as an "element")? Or should it be a query explicitly on each sink (D1 & D2)? Or does the application need to query each element in the pipeline directly (like what is automatically done by the pipeline as it first goes to playing state)? Or something else entirely?

4. Assuming we do step 3 correctly, the application would then use gst_pipeline_set_latency() to distribute the new value, correct?

Apologies if I'm just missing something obvious and making this harder than it really is.
Thanks again,
-jim

 



Re: looking for clarity on pipeline latency

Nicolas Dufresne


On Wed, Sep 16, 2020, at 16:30, jim nualart <[hidden email]> wrote:
3. At which point, the application could/should check to see if it (the application) needs to distribute a new latency value. The quote is "The application should recalculate and distribute a new latency." So my question is on how the application should recalculate the new latency. Should the application use gst_query_new_latency() on the *pipeline* itself (as an "element")? Or should it be a query explicitly on each sink (D1 & D2)? Or does the application need to query each element in the pipeline directly (like what is automatically done by the pipeline as it first goes to playing state)? Or something else entirely?

That just means the app needs to call gst_bin_recalculate_latency() on the pipeline. Increasing or decreasing the latency will cause small glitches, so at runtime we let the app decide if and when it wants to update it. One could also set the pipeline latency larger than needed and always ignore the message.
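As a sketch, handling this from a bus watch could look like the following (`bus_cb` is an illustrative name; the pipeline is assumed to be passed as the user data when the watch is installed with gst_bus_add_watch()):

```c
#include <gst/gst.h>

/* Bus callback sketch: when the pipeline posts a LATENCY message,
 * let GstBin re-run the latency query on the sinks and distribute
 * the new maximum with a LATENCY event. */
static gboolean
bus_cb (GstBus *bus, GstMessage *msg, gpointer user_data)
{
  GstPipeline *pipeline = GST_PIPELINE (user_data);

  if (GST_MESSAGE_TYPE (msg) == GST_MESSAGE_LATENCY)
    gst_bin_recalculate_latency (GST_BIN (pipeline));

  return TRUE;  /* keep the watch installed */
}
```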


4. Assuming we do step 3 correctly, the application would then use gst_pipeline_set_latency() to distribute the new value, correct?

