How to correctly timestamp buffers in an appsrc element


How to correctly timestamp buffers in an appsrc element

J. Krieg
Hello,

I’m currently working on an application to display live TV using
GStreamer on a Raspberry Pi 2B.

To that end I use two appsrc elements (one for video and one for audio)
which read PES packets in two separate threads directly from the
V4L DVB demux device ‘/dev/dvb/adapter0/demux0’.
My current test pipelines are:

Video
  V4L DVB demux (DMX_OUT_TAP) -> appsrc ! h264parse ! v4l2h264dec !
queue ! kmssink
Audio
  V4L DVB demux (DMX_OUT_TAP) -> appsrc ! mpegaudioparse !
mpg123audiodec ! queue ! alsasink

I managed to get this working without timestamping the buffers at all
in either appsrc element, but then video and audio aren't synchronized.

I tried to timestamp the buffers according to
https://gstreamer.freedesktop.org/documentation/application-development/advanced/pipeline-manipulation.html?gi-language=c#inserting-data-with-appsrc
but when I do this I get slightly stuttering video and extremely
stuttering audio, or no audio at all.

I'm also struggling with the following statement from the link above:
"In live mode, you should timestamp the buffers with the pipeline
running-time when the first byte of the buffer was captured before
feeding them to appsrc."

But according to my tests the pipeline only changes its state from
PAUSED to PLAYING (which is the point at which the pipeline clock
becomes available) after some captured buffers have already been fed
into the pipeline. So how can the very first buffers be timestamped
with the running time before they are pushed into a pipeline that is
still in the PAUSED state, so that video and audio stay synchronized?

What am I doing wrong?
Any help or pointing in the right direction would be really appreciated.

Thanks,
Joerg

Code:
// Create a new empty buffer
gbuffer = gst_buffer_new_allocate (NULL, rc, NULL);

// Timestamp the buffer with the current pipeline running time
if (((CustomData *) data)->pipelineclock) {
    pipeline_clock_time = gst_clock_get_time (((CustomData *) data)->pipelineclock);
    pipeline_running_time = pipeline_clock_time - g_pipeline_base_time;
    GST_BUFFER_PTS (gbuffer) = pipeline_running_time;
    // Use the gap since the previous audio buffer as this buffer's duration
    GST_BUFFER_DURATION (gbuffer) = pipeline_running_time - g_last_pipeline_running_time_a;
    g_last_pipeline_running_time_a = pipeline_running_time;
    // GstClockTime is a guint64, so use G_GUINT64_FORMAT rather than %lld
    printf ("*** DEBUG *** dmx_read_a | pipeline running timestamp for audio in ns: %" G_GUINT64_FORMAT "\n",
            pipeline_running_time);
} else {
    printf ("*** DEBUG *** dmx_read_a | Sorry, pipelineclock NOT available...\n");
    GST_BUFFER_PTS (gbuffer) = GST_CLOCK_TIME_NONE;
}

// Copy the captured PES data into the buffer
bc = gst_buffer_fill (gbuffer, 0, buf, rc);

// Push the buffer into the appsrc
g_signal_emit_by_name (((CustomData *) data)->aappsrc, "push-buffer", gbuffer, &rb);

// push-buffer takes its own reference, so drop ours
gst_buffer_unref (gbuffer);
_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

Re: How to correctly timestamp buffers in an appsrc element

J. Krieg
Hello,

No ideas?
Could anyone help, please?
Unfortunately I can't figure this out by myself.

Thank you very much.

Best Regards,
Joerg

On Fri, Nov 20, 2020 at 15:47, J. Krieg <[hidden email]> wrote:

> [snip]

Re: How to correctly timestamp buffers in an appsrc element

Nicolas Dufresne-5
On Thursday, November 26, 2020 at 21:55 +0100, J. Krieg wrote:
> Hello,
>
> No ideas?
> Could anyone help please?
> Unfortunately I can’t figure out this by myself.

By default, appsrc uses an open segment (start=0, end=infinity). That
means your timestamps must match the pipeline's running time, which can
be obtained like this:

  clock = gst_pipeline_get_clock (pipeline);
  if (clock) {
    time_now = gst_clock_get_time (clock);
    rt_time = time_now - gst_element_get_base_time (GST_ELEMENT (pipeline));
    gst_object_unref (clock);  /* gst_pipeline_get_clock returns a new reference */
  } else {
    rt_time = GST_CLOCK_TIME_NONE; /* or 0, depending on your use case */
  }

If you have raw audio data, it might be easier to calculate the
timestamps based on the data length, starting from zero. Or, if your
data isn't live, you might also calculate timestamps from the video
framerate (again starting from 0).

> [snip]

