Sending metadata across UDP

priyanka kataria
Hello,

I have an interesting problem:
I need to transfer some metadata (say, a frame number) with each frame over UDP. The receiver, on the other hand, extracts the frame number from each frame and keeps it for some other work.

Sample sender and receiver pipelines:
Sender: gst-launch-1.0 -v filesrc location=file.h264  ! h264parse ! rtph264pay ! udpsink port=5001
Receiver: gst-launch-1.0 -v udpsrc port=5001 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264" ! rtph264depay ! decodebin ! autovideosink

Things I have already tried (I am still a beginner, so some of the below things may look stupid):
1. In the sender pipeline, attaching a probe on the "h264parse" element and assigning incremental values to "GST_BUFFER_OFFSET".
But the offset value I set is not reflected even in the next element of the same pipeline.

2. In the sender pipeline, attaching a probe on the "h264parse" element and assigning incremental values to "GST_BUFFER_PTS".
The PTS value I set is reflected in the next elements of the same pipeline, but gets lost across UDP.
I verified this by attaching a probe on the src pad of the "rtph264depay" element.

3. Using "gst_rtp_buffer_add_extension_twobytes_header()".
This method works for H264 files but fails with MJPEG files, and my solution needs to be generic.
I can provide more details, with code, if required.

4. The last thing I am trying is to mux KLV metadata into the stream and send it across UDP.
I referred to the following link: https://www.aeronetworks.ca/2018/05/mpeg-2-transport-streams.html.
The pipeline doesn't work as written in the article, but it gave me an overview of how to use it.
Now I want to create a custom KLV metadata file which contains only frame numbers and try to mux it.

Please help me create such a file.
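As a starting point for such a file: a KLV item is just Key (a 16-byte SMPTE Universal Label), Length (BER-encoded), Value. A minimal sketch of packing a frame number that way, where the key bytes and the `klv_pack` helper are purely my own illustration (the key is a placeholder, not a registered UL):

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

/* Placeholder 16-byte SMPTE-style Universal Label key: NOT a
 * registered UL -- substitute whatever key your receiver expects. */
static const uint8_t KLV_KEY[16] = {
    0x06, 0x0e, 0x2b, 0x34, 0x01, 0x01, 0x01, 0x01,
    0x0f, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00
};

/* Pack one KLV triplet: Key (16 bytes), Length (BER), Value.
 * Short-form BER (one byte) covers values under 128 bytes, which is
 * plenty for a 4-byte frame number. Returns total bytes written. */
static size_t klv_pack(uint8_t *out, const uint8_t *value, size_t vlen)
{
    size_t off = 16;
    memcpy(out, KLV_KEY, 16);
    if (vlen < 128) {
        out[off++] = (uint8_t)vlen;          /* short form */
    } else {
        out[off++] = 0x82;                   /* long form, 2 length bytes */
        out[off++] = (uint8_t)(vlen >> 8);
        out[off++] = (uint8_t)(vlen & 0xff);
    }
    memcpy(out + off, value, vlen);
    return off + vlen;
}
```

The resulting bytes could then be fed to the muxer as a KLV stream, as in the linked article; whether the receiver can interpret them depends on both sides agreeing on the key.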

Also, please share any other working approaches I should try for appending metadata to each frame buffer.

Thanks,
Priyanka

_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
Re: Sending metadata across UDP

Nicolas Dufresne-5


On Thu, Sep 26, 2019 at 05:25, priyanka kataria <[hidden email]> wrote:
> Hello,
>
> I have an interesting problem:
> I need to transfer some metadata (say, a frame number) with each frame over UDP. The receiver, on the other hand, extracts the frame number from each frame and keeps it for some other work.
>
> Sample sender and receiver pipelines:
> Sender: gst-launch-1.0 -v filesrc location=file.h264  ! h264parse ! rtph264pay ! udpsink port=5001
> Receiver: gst-launch-1.0 -v udpsrc port=5001 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264" ! rtph264depay ! decodebin ! autovideosink

Consider adding an rtpjitterbuffer on the receiver. In a future GStreamer version you'll be able to use rtpsrc/rtpsink (or the RIST variants) to get a full-featured RTP stream without complex pipeline construction.

For JPEG, consider configuring the max-bitrate property on udpsink. Since each frame is spread over many more packets, the traffic tends to become bursty and may saturate the link or exhaust the udpsrc socket buffer-size.
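Pipelines with these suggestions applied might look like the following (an untested sketch; the numeric values are arbitrary examples, and the properties are the ones named above):

```shell
# Hypothetical MJPEG sender: throttle udpsink via its max-bitrate
# property (bits per second) so large JPEG frames are not sent in
# one burst.
gst-launch-1.0 -v filesrc location=file.mjpeg ! jpegparse ! rtpjpegpay \
    ! udpsink port=5001 max-bitrate=30000000

# Matching receiver: enlarge the udpsrc socket buffer and add an
# rtpjitterbuffer, as suggested above.
gst-launch-1.0 -v udpsrc port=5001 buffer-size=2097152 \
    caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG" \
    ! rtpjitterbuffer ! rtpjpegdepay ! jpegdec ! autovideosink
```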



> Things I have already tried (I am still a beginner, so some of the below may look stupid):
> 1. In the sender pipeline, attaching a probe on the "h264parse" element and assigning incremental values to "GST_BUFFER_OFFSET".
> But the offset value I set is not reflected even in the next element of the same pipeline.
>
> 2. In the sender pipeline, attaching a probe on the "h264parse" element and assigning incremental values to "GST_BUFFER_PTS".
> The PTS value I set is reflected in the next elements of the same pipeline, but gets lost across UDP.
> I verified this by attaching a probe on the src pad of the "rtph264depay" element.
>
> 3. Using "gst_rtp_buffer_add_extension_twobytes_header()".
> This method works for H264 files but fails with MJPEG files, and my solution needs to be generic.
> I can provide more details, with code, if required.

That is the method I would have used. It should work with any RTP packet, so you have likely hit a bug.
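For reference, what gst_rtp_buffer_add_extension_twobytes_header() ends up writing is the RFC 8285 two-byte-form header extension: a sequence of (ID, length, data) elements inside the RTP header. The helpers below (ext_append/ext_find) are hypothetical names of my own, not GStreamer API; they only illustrate that wire layout:

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Append one element in RFC 8285 two-byte form: one ID byte, one
 * length byte, then `len` bytes of payload. Returns the new offset.
 * Illustrative helper only, not GStreamer API. */
static size_t ext_append(uint8_t *buf, size_t off,
                         uint8_t id, const uint8_t *data, uint8_t len)
{
    buf[off++] = id;
    buf[off++] = len;
    memcpy(buf + off, data, len);
    return off + len;
}

/* Find the element with the given ID; points *data at its payload
 * and returns its length, or returns -1 if absent. */
static int ext_find(const uint8_t *buf, size_t total,
                    uint8_t id, const uint8_t **data)
{
    size_t off = 0;
    while (off < total) {
        uint8_t cur = buf[off];
        if (cur == 0) { off++; continue; }   /* padding byte */
        if (off + 2 > total) break;
        if (cur == id) { *data = buf + off + 2; return buf[off + 1]; }
        off += 2 + buf[off + 1];
    }
    return -1;
}
```

On the receive side the element can be read back with gst_rtp_buffer_get_extension_twobytes_header() on the depayloader's sink pad, before the depayloader strips the RTP header.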


> 4. The last thing I am trying is to mux KLV metadata into the stream and send it across UDP.
> I referred to the following link: https://www.aeronetworks.ca/2018/05/mpeg-2-transport-streams.html.
> The pipeline doesn't work as written in the article, but it gave me an overview of how to use it.
> Now I want to create a custom KLV metadata file which contains only frame numbers and try to mux it.
>
> Please help me create such a file.
>
> Also, please share any other working approaches I should try for appending metadata to each frame buffer.
>
> Thanks,
> Priyanka
Re: Sending metadata across UDP

priyanka kataria
Hi Nicolas,

Thank you for the quick reply.

> Consider adding an rtpjitterbuffer on the receiver. In a future GStreamer version you'll be able to use rtpsrc/rtpsink (or the RIST variants) to get a full-featured RTP stream without complex pipeline construction.

The actual pipeline I use in the program has an "rtpjitterbuffer"; I shared a simplified test pipeline.

> For JPEG, consider configuring the max-bitrate property on udpsink. Since each frame is spread over many more packets, the traffic tends to become bursty and may saturate the link or exhaust the udpsrc socket buffer-size.

Point noted, will make the changes.

> That is the method I would have used. It should work with any RTP packet, so you have likely hit a bug.
Here, I am attaching the source code for option 3 I tried.

"send_h264.c" and "recv_h264.c", work successfully and the frame number is appended to RTP buffer. Prints in both the probes output the correct value.

However, the probe function (pay_src_probe) in "send_jpeg.c" never gets called.
When I change the probe type from "GST_PAD_PROBE_TYPE_BUFFER" to "GST_PAD_PROBE_TYPE_BUFFER_LIST", it does get called, but the appended frame numbers are wrong: the function is called around 100 times per frame, and the program slows to a crawl.

I checked the source code of the "rtph264pay" and "rtpjpegpay" elements; both create buffer lists, which I guess is to push multiple RTP packets to the next element in one go.
But strangely, H264 works fine and JPEG fails.

Please check whether my code has a bug.

And do you have any suggestions on KLV metadata approach?

Thanks,
Priyanka


recv_jpeg.c (4K) Download Attachment
recv_h264.c (4K) Download Attachment
send_jpeg.c (4K) Download Attachment
send_h264.c (4K) Download Attachment
cmd (622 bytes) Download Attachment

Re: Sending metadata across UDP

Nicolas Dufresne-5
On Friday, September 27, 2019 at 10:49 +0530, priyanka kataria wrote:

> However, the probe function (pay_src_probe) in "send_jpeg.c" never gets called.
> When I change the probe type from "GST_PAD_PROBE_TYPE_BUFFER" to "GST_PAD_PROBE_TYPE_BUFFER_LIST", it does get called, but the appended frame numbers are wrong: the function is called around 100 times per frame, and the program slows to a crawl.

There is no code for buffer lists there; with a buffer list you need to use a different API. Also, before you modify a buffer you need to ensure it's writable, and making it writable may change the GstBuffer, so you have to update that pointer.


Re: Sending metadata across UDP

priyanka kataria
Hi Nicolas,

> There is no code for buffer lists there; with a buffer list you need to use a different API. Also, before you modify a buffer you need to ensure it's writable, and making it writable may change the GstBuffer, so you have to update that pointer.

Thank you for the pointers.

I have attached the modified JPEG code which handles BUFFER_LIST.
However, for each frame I get around 100 lists, so the same frame_count is printed around 100 times, in both send_jpeg.c and recv_jpeg.c.

The program is also very slow, which I believe is due to the probe firing for every buffer list.

But my pain point is that I wanted to propagate a unique frame number across UDP to identify each frame, and what I get instead is around 100 lists per frame, which makes the frame number useless to me. How do I make use of the frame number I receive?

Is my explanation clear enough to convey the purpose of sending the frame number across UDP?

Thanks,
Priyanka Kataria


recv_jpeg.c (4K) Download Attachment
send_jpeg.c (5K) Download Attachment

Re: Sending metadata across UDP

Nicolas Dufresne-5
On Monday, September 30, 2019 at 17:13 +0530, priyanka kataria wrote:

> I have attached the modified JPEG code which handles BUFFER_LIST.
> However, for each frame I get around 100 lists, so the same frame_count is printed around 100 times, in both send_jpeg.c and recv_jpeg.c.
>
> The program is also very slow, which I believe is due to the probe firing for every buffer list.
>
> But my pain point is that I wanted to propagate a unique frame number across UDP to identify each frame, and what I get instead is around 100 lists per frame, which makes the frame number useless to me. How do I make use of the frame number I receive?
>
> Is my explanation clear enough to convey the purpose of sending the frame number across UDP?
I suspect you aren't checking the probe type correctly: if you add both the LIST and BUFFER flags on your probe, the probe data can be either one. If a list were effectively created, you'd likely get one list per frame, I think.

That being said, for frame-based metadata you may put that info on every packet, for redundancy, but then you need a pair of probes on the receiver: one remembers the last seen value, and the other applies it somehow to your reconstructed frames.
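The "remember the last seen value" half of that receiver boils down to a tiny bit of state. A sketch, where FrameTracker is a hypothetical name of mine and, in the real probe, the ID would come from the RTP header extension of each packet:

```c
#include <assert.h>
#include <stdint.h>

/* Receiver-side state for a redundant per-packet frame number:
 * every RTP packet carries the ID, but we only want to act once
 * per frame, so remember the last seen value and report changes. */
typedef struct {
    uint32_t last_id;
    int      have_id;
} FrameTracker;

/* Feed the ID extracted from one packet. Returns 1 if this packet
 * starts a new frame (the value changed), 0 for a repeat. */
static int frame_tracker_push(FrameTracker *t, uint32_t id)
{
    if (t->have_id && t->last_id == id)
        return 0;
    t->last_id = id;
    t->have_id = 1;
    return 1;
}
```

With roughly 100 packets per JPEG frame, this collapses the repeated IDs into a single event per frame on the receiver.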


Re: Sending metadata across UDP

priyanka kataria
Hi Nicolas,

Thank you again.

> I suspect you aren't checking the probe type correctly: if you add both the LIST and BUFFER flags on your probe, the probe data can be either one. If a list were effectively created, you'd likely get one list per frame, I think.

I don't think I understand this completely.
Are you referring to the flag "GST_PAD_PROBE_TYPE_BUFFER_LIST" used in send_jpeg.c, and to the "pay_src_probe" function handling both "GST_PAD_PROBE_TYPE_BUFFER_LIST" and "GST_PAD_PROBE_TYPE_BUFFER"?
If that is the case, I have attached modified source in which I commented out the code for GST_PAD_PROBE_TYPE_BUFFER, but the behavior is still the same: around 100 lists per frame.

> That being said, for frame-based metadata you may put that info on every packet, for redundancy, but then you need a pair of probes on the receiver: one remembers the last seen value, and the other applies it somehow to your reconstructed frames.

In modified "recv_joeg.c", I have added frame ID as metadata in "decode_buffer_probe" but the output I get is:
2222222 Frame id is : 1
2222222 Frame id is : 2
2222222 Frame id is : 62
2222222 Frame id is : 63
2222222 Frame id is : 64
2222222 Frame id is : 65
2222222 Frame id is : 66
2222222 Frame id is : 67
2222222 Frame id is : 68

Many of the frame IDs are either not attached to the buffer or just not being printed, I am not sure which.
What is certain is that this method makes the program very slow and cannot be used for a live stream.
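The pair-of-probes idea Nicolas suggested boils down to a tiny piece of state shared between two callbacks. The sketch below models only that bookkeeping in plain C; in the real program, record_packet_id() would run inside a pad probe on the RTP side (once per packet, reading the header extension) and apply_to_frame() inside a probe after the decoder (once per reconstructed frame). FrameIdState and both function names are hypothetical, not GStreamer API:

```c
/* State shared between the packet-level and frame-level probes
 * (hypothetical names; not GStreamer API). */
typedef struct {
    unsigned last_id; /* last frame number seen in an RTP extension */
    int      valid;   /* 0 until the first id arrives */
} FrameIdState;

/* Packet probe: remember the most recent frame id. Called for every
 * RTP packet, so redundant ids across a frame's packets are harmless. */
static void record_packet_id(FrameIdState *s, unsigned id)
{
    s->last_id = id;
    s->valid = 1;
}

/* Frame probe: apply the remembered id to the reconstructed frame.
 * Returns 0 and leaves *out untouched if no id has been seen yet. */
static int apply_to_frame(const FrameIdState *s, unsigned *out)
{
    if (!s->valid)
        return 0;
    *out = s->last_id;
    return 1;
}
```

Since both callbacks only touch a couple of integers, this per-packet work is cheap; it should not by itself slow the pipeline down.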

Please correct me if I am wrong.

Thanks,
Priyanka

On Mon, Sep 30, 2019 at 9:55 PM Nicolas Dufresne <[hidden email]> wrote:
Le lundi 30 septembre 2019 à 17:13 +0530, priyanka kataria a écrit :
> Hi Nicolas,
>
> > There is no code for buffer lists there; with buffer lists you need to
> > use a different API. Also, before you modify a buffer, you need to ensure
> > it's writable, and making it writable may change the GstBuffer, so you
> > have to update that pointer.
>
> Thank you for the pointers.
>
> I have attached the modified JPEG code which handles BUFFER_LIST.
> However, for each frame I get around 100 lists, hence, same frame_count is printed around 100 times, both in case of send_jpeg.c and recv_jpeg.c.
>
> And the program is also very slow, which I believe is due to attaching probe on each bufferlist.
>
> But my pain point is that I wanted to propagate a unique frame number across UDP to identify each frame, and what I have got is around 100 lists per frame, which makes this frame number useless for me. How do I make use of the frame number I have got?
>
> Is my explanation clear enough to understand the purpose of sending frame number across UDP?

I'm suspicious you don't check the probe type correctly, since if you
add both the LIST and BUFFER flags on your probe, it means the probe data
can be either. If a LIST was effectively created, you'd likely get 1 list
per frame, I think.

That being said, for "frame based" metadata, you may put that info on
all packets, for redundancy, but then you need a pair of probes on the
receiver: one remembers the last seen value, and the other applies it
somehow to your reconstructed frames.

>
> Thanks,
> Priyanka Kataria
>
> On Fri, Sep 27, 2019 at 7:03 PM Nicolas Dufresne <[hidden email]> wrote:
> > Le vendredi 27 septembre 2019 à 10:49 +0530, priyanka kataria a écrit :
> > > Hi Nicolas,
> > >
> > > Thank you for quick reply.
> > >
> > > > Consider adding an rtpjitterbuffer on the receiver, in future GST version, you'll be able to use rtpsrc/rtpsink (or the rist variant) in order to get a full feature RTP stream without complex pipeline construction.
> > >
> > > The actual pipeline I use in the program has an "rtpjitterbuffer"; I had shared a simplified test pipeline.
> > >
> > > >For jpeg, consider configuring max-bitrate property on udpsink. As frames are spread out on a lot more packet, it tend to become bursty and may saturate the link or exhaust the udpsrc socket buffer-size.
> > >
> > > Point noted, will make the changes.
> > >
> > > >That is the method I would have used. It should work with any RTP packet, so you likely have or hit a bug.
> > > Here, I am attaching the source code for option 3 I tried.
> > >
> > > "send_h264.c" and "recv_h264.c" work successfully, and the frame number is appended to the RTP buffer. The prints in both probes output the correct value.
> > >
> > > However, the probe function (pay_src_probe) in "send_jpeg.c" never gets called.
> > > When I change the probe type from "GST_PAD_PROBE_TYPE_BUFFER" to "GST_PAD_PROBE_TYPE_BUFFER_LIST", it does get called, but the frame numbers appended are wrong: the function is called around 100 times per frame, and the program slows down like a sloth.
> >
> > There is no code for buffer lists there; with buffer lists you need to
> > use a different API. Also, before you modify a buffer, you need to ensure
> > it's writable, and making it writable may change the GstBuffer, so you
> > have to update that pointer.
> >
> > >
> > > I checked the source code for the "rtph264pay" and "rtpjpegpay" elements; both of them create buffer lists, which I guess is to push multiple RTP packets at one go to the next element in the pipeline.
> > > But strangely H264 works fine and JPEG fails.
> > >
> > > Please check if my code has some bug.
> > >
> > > And do you have any suggestions on KLV metadata approach?
> > >
> > > Thanks,
> > > Priyanka
> > >
> > > On Thu, Sep 26, 2019 at 7:11 PM Nicolas Dufresne <[hidden email]> wrote:
> > > >
> > > > Le jeu. 26 sept. 2019 05 h 25, priyanka kataria <[hidden email]> a écrit :
> > > > > Hello,
> > > > >
> > > > > I have an interesting problem:
> > > > > Need to transfer some kind of metadata (say a frame number) with each frame over UDP. The receiver, on the other hand, extracts the frame number from each frame and keeps it for some other work.
> > > > >
> > > > > Sample sender and receiver pipelines:
> > > > > Sender: gst-launch-1.0 -v filesrc location=file.h264  ! h264parse ! rtph264pay ! udpsink port=5001
> > > > > Receiver: gst-launch-1.0 -v udpsrc port=5001 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264" ! rtph264depay ! decodebin ! autovideosink
> > > >
> > > > Consider adding an rtpjitterbuffer on the receiver; in a future GST version, you'll be able to use rtpsrc/rtpsink (or the RIST variant) in order to get a full-featured RTP stream without complex pipeline construction.
> > > >
> > > > For JPEG, consider configuring the max-bitrate property on udpsink. As frames are spread out over many more packets, the stream tends to become bursty and may saturate the link or exhaust the udpsrc socket buffer-size.
> > > >
> > > >
> > > > > Things I have already tried (I am still a beginner, so some of the below things may look stupid):
> > > > > 1. In the sender pipeline, attaching a probe on the "h264parse" element and assigning incremental values to "GST_BUFFER_OFFSET".
> > > > > But the set offset value is not reflected even in the next element of the same pipeline.
> > > > >
> > > > > 2. In Sender pipeline, attaching a probe on "h264parse" element and assigning incremental values to "GST_BUFFER_PTS".
> > > > > The set PTS value is reflected in the next elements in the same pipeline, but gets lost across UDP.
> > > > > I checked this by attaching a probe on "rtph264depay" element (src pad).
> > > > >
> > > > > 3. Using "gst_rtp_buffer_add_extension_twobytes_header()".
> > > > > This method works for H264 files, but fails with MJPEG files, and my solution needs to be generic.
> > > > > Here, I can provide more details with code if required.
> > > >
> > > > That is the method I would have used. It should work with any RTP packet, so you likely have or hit a bug.
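Since gst_rtp_buffer_add_extension_twobytes_header() takes an opaque byte payload, the frame number has to be serialized explicitly. A minimal big-endian pack/unpack pair is sketched below; the surrounding GStreamer calls are omitted, and the 4-byte layout is just one possible choice for this thread, not anything mandated by the API:

```c
#include <stdint.h>

/* Serialize a frame number into the 4-byte payload that would be passed
 * to gst_rtp_buffer_add_extension_twobytes_header() on the sender side. */
static void pack_frame_id(uint32_t id, uint8_t out[4])
{
    out[0] = (uint8_t)(id >> 24); /* most significant byte first */
    out[1] = (uint8_t)(id >> 16);
    out[2] = (uint8_t)(id >> 8);
    out[3] = (uint8_t)id;
}

/* Inverse, for the receiver after
 * gst_rtp_buffer_get_extension_twobytes_header() returns the payload. */
static uint32_t unpack_frame_id(const uint8_t in[4])
{
    return ((uint32_t)in[0] << 24) | ((uint32_t)in[1] << 16) |
           ((uint32_t)in[2] << 8)  |  (uint32_t)in[3];
}
```

Keeping the payload in a fixed network byte order avoids surprises when sender and receiver run on different machines.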
> > > >
> > > > > 4. Last thing I am trying is to mux KLV metadata into stream and send it across UDP.
> > > > > I refer the following link: https://www.aeronetworks.ca/2018/05/mpeg-2-transport-streams.html.
> > > > > This doesn't work as written in the article, but it gave me an overview of how to use the pipeline.
> > > > > Now I want to create my custom KLV metadata file which contains only frame numbers and try to mux it.
> > > > >
> > > > > Please help me in creating such file.
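For the KLV route, each record is just a 16-byte universal key, a BER-encoded length, and the value bytes. The sketch below emits one record carrying a 4-byte big-endian frame number (the BER short form is valid here because the length is under 128). The key bytes are placeholders, not a registered SMPTE universal label, so substitute whatever key your muxer and demuxer agree on:

```c
#include <stdint.h>
#include <stddef.h>

/* Placeholder 16-byte universal key -- NOT a registered SMPTE label. */
static const uint8_t kKey[16] = {
    0x06, 0x0e, 0x2b, 0x34, 0x01, 0x01, 0x01, 0x01,
    0x0f, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00
};

/* Write one KLV record carrying a 4-byte big-endian frame number.
 * The caller must provide at least 21 bytes of output space.
 * Returns the number of bytes written. */
static size_t klv_frame_record(uint32_t frame_no, uint8_t *out)
{
    size_t n = 0;
    for (size_t i = 0; i < 16; i++)       /* K: universal key */
        out[n++] = kKey[i];
    out[n++] = 0x04;                      /* L: BER short form, length 4 */
    out[n++] = (uint8_t)(frame_no >> 24); /* V: frame number, big-endian */
    out[n++] = (uint8_t)(frame_no >> 16);
    out[n++] = (uint8_t)(frame_no >> 8);
    out[n++] = (uint8_t)frame_no;
    return n;                             /* 21 bytes per record */
}
```

Writing successive records for frames 1..N to a file should give a stream a KLV-aware muxer can consume (mpegtsmux accepts meta/x-klv caps), though the exact caps and per-record timestamping still have to be set up in the application.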
> > > > >
> > > > > Also please share if there are any other working approaches I should try to append metadata in each frame buffer.
> > > > >
> > > > > Thanks,
> > > > > Priyanka
meta.c (4K) Download Attachment
recv_jpeg.c (5K) Download Attachment
cmd (1018 bytes) Download Attachment
meta.h (3K) Download Attachment
send_jpeg.c (5K) Download Attachment

Re: Sending metadata across UDP

Nicolas Dufresne-5


Le mar. 1 oct. 2019 08 h 10, priyanka kataria <[hidden email]> a écrit :
Hi Nicolas,

Thank you again.

> I'm suspicious you don't check the probe type correctly, since if you add both the LIST and BUFFER flags on your probe,
> it means the probe data can be either. If a LIST was effectively created, you'd likely get 1 list per frame, I think.

I don't think I understand this completely.
Are you pointing towards the flag "GST_PAD_PROBE_TYPE_BUFFER_LIST" used in send_jpeg.c and "pay_src_probe" function handling both "GST_PAD_PROBE_TYPE_BUFFER_LIST" and "GST_PAD_PROBE_TYPE_BUFFER"?
If that is the case, I have attached modified source wherein I have commented code for GST_PAD_PROBE_TYPE_BUFFER, but the behavior is still the same, around 100 lists per frame.

> That being said, for "frame based" metadata, you may put that info on all packets, for redundancy, but then you need a pair of probes on the
> receiver: one remembers the last seen value, and the other applies it somehow to your reconstructed frames.

In the modified "recv_jpeg.c", I have added the frame ID as metadata in "decode_buffer_probe", but the output I get is:
2222222 Frame id is : 1
2222222 Frame id is : 2
2222222 Frame id is : 62
2222222 Frame id is : 63
2222222 Frame id is : 64
2222222 Frame id is : 65
2222222 Frame id is : 66
2222222 Frame id is : 67
2222222 Frame id is : 68

Many of the frame IDs are either not attached to the buffer or just not being printed, I am not sure which.
What is certain is that this method makes the program very slow and cannot be used for a live stream.

Please correct me if I am wrong.


Looks like packet loss to me. If you need all frames, use a reliable protocol.


Thanks,
Priyanka


Re: Sending metadata across UDP

priyanka kataria
Hi Nicolas,

>Looks like packet lost to me. If you need all frames, use a reliable protocol.

I am creating a program to handle live sources, so I will have to use the UDP protocol.

Also, I think this behavior occurs because attaching a probe to each frame and modifying it slows down the pipeline a lot.
Is that correct?

Thanks,
Priyanka

On Wed, Oct 2, 2019 at 12:03 AM Nicolas Dufresne <[hidden email]> wrote:


Looks like packet loss to me. If you need all frames, use a reliable protocol.



Re: Sending metadata across UDP

Nicolas Dufresne-5


Le ven. 4 oct. 2019 06 h 55, priyanka kataria <[hidden email]> a écrit :
Hi Nicolas,

>Looks like packet lost to me. If you need all frames, use a reliable protocol.

I am creating a program to handle live sources, so I will have to use the UDP protocol.

That was an indirect way of asking whether your code tolerates loss, which will happen over UDP/RTP whatever you do.


Also, I think this behavior occurs because attaching a probe to each frame and modifying it slows down the pipeline a lot.
Is that correct?

Hard to say. By themselves, pad probes written in C don't slow down the flow much, but memory operations could. We've done what you are doing on an RPi 2 over Wi-Fi, and it worked, so your case is a bit suspicious.


Thanks,
Priyanka

On Wed, Oct 2, 2019 at 12:03 AM Nicolas Dufresne <[hidden email]> wrote:


Le mar. 1 oct. 2019 08 h 10, priyanka kataria <[hidden email]> a écrit :
Hi Nicolas,

Thank you again.

I'm suspicious you don't check the probe type correctly, since if you add both LIST and BUFFER flag on your probe, 
> it means the probe data can be both. If LIST was effectively create, you'd likely get 1 list per frame I think.

I don't think I understand this completely.
Are you pointing towards the flag "GST_PAD_PROBE_TYPE_BUFFER_LIST" used in send_jpeg.c and "pay_src_probe" function handling both "GST_PAD_PROBE_TYPE_BUFFER_LIST" and "GST_PAD_PROBE_TYPE_BUFFER"?
If that is the case, I have attached modified source wherein I have commented code for GST_PAD_PROBE_TYPE_BUFFER, but the behavior is still the same, around 100 lists per frame.

> That being said, for "frame base" metadata, you may put that info on all packet, for redundancy, but then you need a pair of probes on the
> receiver, one remember the last seen value, and the other will appy it somehow to your reconstructed frames.

In modified "recv_joeg.c", I have added frame ID as metadata in "decode_buffer_probe" but the output I get is:
2222222 Frame id is : 1
2222222 Frame id is : 2
2222222 Frame id is : 62
2222222 Frame id is : 63
2222222 Frame id is : 64
2222222 Frame id is : 65
2222222 Frame id is : 66
2222222 Frame id is : 67
2222222 Frame id is : 68

Many of the frame IDs are not attached on the buffer or they are just not being printed, I am not sure.
But this is for sure that this method makes the program very slow and cannot be used for live stream.

Please correct me if I am wrong.


Looks like packet lost to me. If you need all frames, use a reliable protocol.


Thanks,
Priyanka

On Mon, Sep 30, 2019 at 9:55 PM Nicolas Dufresne <[hidden email]> wrote:
Le lundi 30 septembre 2019 à 17:13 +0530, priyanka kataria a écrit :
> Hi Nicolas,
>
> > There is no code for buffer lists there; with buffer lists you need to
> > use a different API. Also, before you modify a buffer, you need to ensure
> > it's writable; making it writable may change the GstBuffer, so you
> > have to update that pointer.
>
> Thank you for the pointers.
>
> I have attached the modified JPEG code which handles BUFFER_LIST.
> However, for each frame I get around 100 lists; hence, the same frame_count is printed around 100 times, in both send_jpeg.c and recv_jpeg.c.
>
> And the program is also very slow, which I believe is due to attaching a probe to each buffer list.
>
> But my pain point is that I wanted to propagate a unique frame number across UDP to identify each frame, and what I get instead is around 100 lists per frame, which makes the frame number useless to me. How do I make use of the frame number I have got?
>
> Is my explanation clear enough to understand the purpose of sending frame number across UDP?

I'm suspicious you don't check the probe type correctly, since if you
add both the LIST and BUFFER flags on your probe, it means the probe
data can be either. If a LIST was effectively created, you'd likely get
one list per frame, I think.

That being said, for frame-based metadata, you may put that info on
all packets, for redundancy, but then you need a pair of probes on the
receiver: one remembers the last seen value, and the other applies it
somehow to your reconstructed frames.
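The pairing described above boils down to a small piece of shared state between the two receiver probes. A plain-C sketch of just that bookkeeping (the two functions are simplified stand-ins for real GstPadProbeCallback bodies; names and types are illustrative):

```c
#include <stdint.h>

/* State shared between the two receiver-side probes. */
typedef struct {
  uint64_t last_seen_id;  /* updated by the packet-level probe */
  int      have_id;       /* set once any ID has been seen */
} FrameIdState;

/* Depayloader sink-pad probe: every RTP packet of a frame carries the
 * same ID (redundancy), so simply remember the latest one seen. */
static void on_packet (FrameIdState *st, uint64_t id_from_packet)
{
  st->last_seen_id = id_from_packet;
  st->have_id = 1;
}

/* Decoder src-pad probe: stamp the reconstructed frame with whatever
 * ID the packet probe remembered. Returns 0 on success, -1 if no ID
 * has been seen yet. */
static int on_frame (FrameIdState *st, uint64_t *frame_id_out)
{
  if (!st->have_id)
    return -1;
  *frame_id_out = st->last_seen_id;
  return 0;
}
```

Because all packets of a frame carry the same ID, losing some packets of a frame does not lose the ID, which is the point of the redundancy.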

>
> Thanks,
> Priyanka Kataria
>
> On Fri, Sep 27, 2019 at 7:03 PM Nicolas Dufresne <[hidden email]> wrote:
> > On Friday, September 27, 2019 at 10:49 +0530, priyanka kataria wrote:
> > > Hi Nicolas,
> > >
> > > Thank you for quick reply.
> > >
> > > > Consider adding an rtpjitterbuffer on the receiver; in a future GStreamer version, you'll be able to use rtpsrc/rtpsink (or the RIST variant) to get a full-featured RTP stream without complex pipeline construction.
> > >
> > > The actual pipeline which I use in the program has "rtpjitterbuffer", I shared a simple test pipeline.
> > >
> > > > For JPEG, consider configuring the max-bitrate property on udpsink. As frames are spread out over a lot more packets, the stream tends to become bursty and may saturate the link or exhaust the udpsrc socket buffer-size.
> > >
> > > Point noted, will make the changes.
> > >
> > > > That is the method I would have used. It should work with any RTP packet, so you likely hit a bug.
> > > Here, I am attaching the source code for option 3 I tried.
> > >
> > > "send_h264.c" and "recv_h264.c", work successfully and the frame number is appended to RTP buffer. Prints in both the probes output the correct value.
> > >
> > > However, the probe function (pay_src_probe) in "send_jpeg.c" never gets called.
> > > When I change the probe type from "GST_PAD_PROBE_TYPE_BUFFER" to "GST_PAD_PROBE_TYPE_BUFFER_LIST", it does get called, but the frame numbers appended are wrong: the function gets called around 100 times per frame and the program slows to a crawl.
> >
> > There is no code for buffer lists there; with buffer lists you need to
> > use a different API. Also, before you modify a buffer, you need to ensure
> > it's writable; making it writable may change the GstBuffer, so you
> > have to update that pointer.
> >
> > >
> > > I checked the source code for the "rtph264pay" and "rtpjpegpay" elements; both of them create buffer lists, which I guess is to push multiple RTP packets to the next element of the pipeline in one go.
> > > But strangely H264 works fine and JPEG fails.
> > >
> > > Please check if my code has some bug.
> > >
> > > And do you have any suggestions on KLV metadata approach?
> > >
> > > Thanks,
> > > Priyanka
> > >
> > > On Thu, Sep 26, 2019 at 7:11 PM Nicolas Dufresne <[hidden email]> wrote:
> > > >
> > > > On Thu, Sep 26, 2019 at 5:25 AM, priyanka kataria <[hidden email]> wrote:
> > > > > Hello,
> > > > >
> > > > > I have an interesting problem:
> > > > > I need to transfer some kind of metadata (say, a frame number) with each frame over UDP. The receiver, on the other hand, extracts the frame number from each frame and keeps it for some other work.
> > > > >
> > > > > Sample sender and receiver pipelines:
> > > > > Sender: gst-launch-1.0 -v filesrc location=file.h264  ! h264parse ! rtph264pay ! udpsink port=5001
> > > > > Receiver: gst-launch-1.0 -v udpsrc port=5001 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264" ! rtph264depay ! decodebin ! autovideosink
> > > >
> > > > Consider adding an rtpjitterbuffer on the receiver; in a future GStreamer version, you'll be able to use rtpsrc/rtpsink (or the RIST variant) to get a full-featured RTP stream without complex pipeline construction.
> > > >
> > > > For JPEG, consider configuring the max-bitrate property on udpsink. As frames are spread out over a lot more packets, the stream tends to become bursty and may saturate the link or exhaust the udpsrc socket buffer-size.
> > > >
> > > >
> > > > > Things I have already tried (I am still a beginner, so some of the below things may look stupid):
> > > > > 1. In the sender pipeline, attaching a probe to the "h264parse" element and assigning incremental values to "GST_BUFFER_OFFSET".
> > > > > But the set offset value is not reflected even in the next element of the same pipeline.
> > > > >
> > > > > 2. In the sender pipeline, attaching a probe to the "h264parse" element and assigning incremental values to "GST_BUFFER_PTS".
> > > > > The set PTS value is reflected in the next elements of the same pipeline, but gets lost across UDP.
> > > > > I checked this by attaching a probe to the "rtph264depay" element (src pad).
> > > > >
> > > > > 3. Using "gst_rtp_buffer_add_extension_twobytes_header()".
> > > > > This method works for H264 files, but fails with MJPEG files, and my solution needs to be generic.
> > > > > Here, I can provide more details with code if required.
> > > >
> > > > That is the method I would have used. It should work with any RTP packet, so you likely hit a bug.
> > > >
> > > > > 4. The last thing I am trying is to mux KLV metadata into the stream and send it across UDP.
> > > > > I referred to the following link: https://www.aeronetworks.ca/2018/05/mpeg-2-transport-streams.html.
> > > > > It doesn't work as written in the article, but it gave me an overview of how to use the pipeline.
> > > > > Now I want to create my own custom KLV metadata file which contains only frame numbers and try to mux it.
> > > > >
> > > > > Please help me in creating such file.
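One way to build such a file's records: each KLV item is a 16-byte Universal Label key, a BER-encoded length, and the value. The 16-byte key below is a made-up placeholder, not a registered SMPTE Universal Label, and the function name is hypothetical; a real stream should use a properly registered key:

```c
#include <stdint.h>
#include <stddef.h>
#include <string.h>

/* Placeholder 16-byte key (NOT a registered SMPTE Universal Label). */
static const uint8_t kFrameIdKey[16] = {
  0x06, 0x0e, 0x2b, 0x34, 0x01, 0x01, 0x01, 0x01,
  0x0f, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x01
};

/* Encode one KLV packet: 16-byte key + BER short-form length + 4-byte
 * big-endian frame number. Returns the number of bytes written, or 0
 * if the output buffer is too small (needs at least 21 bytes). */
static size_t
klv_encode_frame_id (uint8_t *out, size_t cap, uint32_t frame_id)
{
  if (cap < 16 + 1 + 4)
    return 0;
  memcpy (out, kFrameIdKey, 16);
  out[16] = 4;                        /* BER short form: length = 4 */
  out[17] = (uint8_t) (frame_id >> 24);
  out[18] = (uint8_t) (frame_id >> 16);
  out[19] = (uint8_t) (frame_id >> 8);
  out[20] = (uint8_t) (frame_id);
  return 21;
}
```

Appending successive packets like this to a file should yield something a meta/x-klv source can feed to mpegtsmux, in the spirit of the pipeline from the linked article.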
> > > > >
> > > > > Also please share if there are any other working approaches I should try to append metadata in each frame buffer.
> > > > >
> > > > > Thanks,
> > > > > Priyanka
> > > > > _______________________________________________
> > > > > gstreamer-devel mailing list
> > > > > [hidden email]
> > > > > https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel