Encode YUV420 buffer with appsrc

Encode YUV420 buffer with appsrc

pchaurasia
Hi Folks

I am looking to feed my OpenCV output to an encoder, to create a
compressed bitstream. This is how I am trying to do it:


    VideoWriter m_videoWriter;

    m_videoWriter.open("appsrc ! autovideoconvert ! omxh265enc ! matroskamux ! filesink location=test.mkv",
                       0, (double)30, cv::Size(1920, 1080), true);


This looks syntactically correct and generates a bitstream for me. However,
my data is in YUV420 format, and it resides in three individual cv::Mat
objects (one each for Y, U and V; the U and V cv::Mat are half the size of
the Y cv::Mat). Could someone please help me adapt the VideoWriter above to
encode data from my matrices, cv::Mat imgY, imgU and imgV?


Furthermore, I looked at previous posts
(http://gstreamer-devel.966125.n4.nabble.com/appsrc-to-filesink-td970865.html),
but I cannot work out how to tell the element after appsrc that the data
appsrc produces will be YUV420. Is there a way to set the 'format' of
appsrc's caps to YUV420?


Thanks






_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

Re: Encode YUV420 buffer with appsrc

Baby Octopus
You'll have to call gst_app_src_set_caps() from the application
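
For illustration only, the equivalent caps can also be given to appsrc directly in a launch string; this is a hedged sketch using the thread's own pipeline (omxh265enc assumes the TX2's OMX plugin is present):

```shell
# Hypothetical sketch: declare the appsrc output format in the caps property
# instead of calling gst_app_src_set_caps() from code.
gst-launch-1.0 appsrc name=myappsrc \
    caps="video/x-raw,format=I420,width=1920,height=1080,framerate=30/1" \
  ! autovideoconvert ! omxh265enc ! matroskamux ! filesink location=test.mkv
```

In application code, gst_app_src_set_caps() with caps built from a GstVideoInfo achieves the same thing.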




Re: Encode YUV420 buffer with appsrc

pchaurasia
Thanks for helping out, BabyOctopus. After following your suggestion I am
still not able to get the format right. Below is my code; it generates a
legal mkv bitstream, but the content is all junk. It seems there is some
issue in what I am feeding to the encoder.




    GstVideoInfo info;
    GstElement *appsrc;
    GstElement *pipeline;
    GstCaps *caps;

    string aaDebugPipeline = "appsrc name=myappsrc ! autovideoconvert ! omxh265enc ! matroskamux ! filesink location=test.mkv";
    pipeline = gst_parse_launch(aaDebugPipeline.c_str(), NULL);
    g_assert(pipeline);

    appsrc = gst_bin_get_by_name(GST_BIN(pipeline), "myappsrc");
    g_assert(appsrc);

    gst_video_info_set_format(&info, GST_VIDEO_FORMAT_I420, 1920, 1080);
    caps = gst_video_info_to_caps(&info);
    gst_app_src_set_caps(GST_APP_SRC(appsrc), caps);

    m_videoWriter.open(aaDebugPipeline, 0, (double)30, cv::Size(1920, 1080), false);
    if (!m_videoWriter.isOpened()) {
        REPORT_ERROR("can't create writer\n");
        exit(1);
    }


// This is how I am calling m_videoWriter:

       cv::Mat imgY = cv::Mat(1080, 1920, CV_8UC1, framedata.dataY, 2048);
       cv::Mat imgU = cv::Mat(540, 960, CV_8UC1, framedata.dataU, 1024);
       cv::Mat imgV = cv::Mat(540, 960, CV_8UC1, framedata.dataV, 1024);

       m_videoWriter.write(imgY);

// I am able to successfully imshow(imgY)

Thanks






Re: Encode YUV420 buffer with appsrc

pchaurasia
Is there a way to specify 'pitch' in caps? In my cv::Mat images, width and
pitch are different.

For example

cv::Mat imgY ; // width = 1920, pitch = 2048.
cv::Mat imgU ; // width = 960, pitch = 1024

These cv::Mats are created from the output of a GPU, and I am able to
imshow() them.


Thanks,




Re: Encode YUV420 buffer with appsrc

Nicolas Dufresne-5
On Thursday, 5 October 2017 at 02:22 -0700, pchaurasia wrote:
> Is there a way to specify 'pitch' in caps ? in my cv::Mat image width
> and
> pitch are different.

GStreamer calls the pitch a stride, and it's in bytes. It is not specified
through caps, but by adding a GstVideoMeta to each buffer; see the function
gst_buffer_add_video_meta_full(). Be aware that the offsets are from the
GstBuffer's perspective: if you have multiple GstMemory objects, you need
to include the sizes of the previous GstMemory objects as needed.
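
The buffer-relative offsets Nicolas describes come down to simple arithmetic. A minimal sketch, assuming the poster's layout (contiguous I420 planes packed back to back, 2048-byte luma stride, 1024-byte chroma strides — the helper name is hypothetical):

```c
#include <assert.h>
#include <stddef.h>

/* Compute buffer-relative plane offsets for contiguous I420 data,
 * given per-plane strides in bytes. Returns the total buffer size.
 * Assumes the three planes are packed back to back in one GstBuffer. */
static size_t i420_plane_offsets(int height, const size_t stride[3],
                                 size_t offset[3])
{
    offset[0] = 0;                                    /* Y starts the buffer   */
    offset[1] = offset[0] + stride[0] * height;       /* U follows the Y plane */
    offset[2] = offset[1] + stride[1] * (height / 2); /* V follows the U plane */
    return offset[2] + stride[2] * (height / 2);
}
```

With strides {2048, 1024, 1024} and height 1080 this gives offsets 0, 2211840 and 2764800, and a total of 3317760 bytes — the numbers gst_buffer_add_video_meta_full() would need for one such frame.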

>  
>
> For example
>
> cv::Mat imgY ; // width = 1920, pitch = 2048.
> cv::Mat imgU ; // width = 960, pitch = 1024
>
> These cv::Mats are created from output of GPU and I am able to
> imshow()
> them.
>
>
> Thanks,
>
>
>

Re: Encode YUV420 buffer with appsrc

pchaurasia
Thanks Nicolas..

I searched around; however, I am not able to figure out which call adds a
GstVideoMeta for appsrc. Could you please tell me what the call to add a
GstVideoMeta is?

Will it be gst_app_src_push_buffer()? However, this function does not seem
to take a GstVideoMeta as input:

GstFlowReturn
gst_app_src_push_buffer (GstAppSrc *appsrc,
                         GstBuffer *buffer);

Thanks




Re: Encode YUV420 buffer with appsrc

Nicolas Dufresne-5


On 5 Oct 2017 at 4:38 PM, "pchaurasia" <[hidden email]> wrote:
Thanks Nicolas..

I surfed around. However, I'm not able to figure out call to add
GstVideoMeta to appsrc - could you please specify what is call to add
GstVideoMeta to appsrc ?

GstVideoMeta is a meta you attach to a buffer, not to the element. See gst_buffer_add_video_meta_full documentation for details.

Will it be gst_app_src_push_buffer() ? However this function does not seem
to take GstVideoMeta as input -

GstFlowReturn
gst_app_src_push_buffer (GstAppSrc *appsrc,
                         GstBuffer *buffer);

Thanks




Re: Encode YUV420 buffer with appsrc

pchaurasia
I would like to encode the output of my OpenCV algorithm. The buffer to
encode is not available exactly every 33 ms, as the encoder would need to
encode at 30 fps. It seems there are two ways I can send the
buffer-to-be-encoded to appsrc:

a) VideoWriter::write method.

    GstCaps *caps;

    string aaDebugPipeline = "appsrc name=myappsrc ! autovideoconvert ! omxh265enc ! matroskamux ! filesink location=test.mkv";
    m_ppipeline = gst_parse_launch(aaDebugPipeline.c_str(), NULL);
    g_assert(m_ppipeline);

    m_pappsrc = gst_bin_get_by_name(GST_BIN(m_ppipeline), "myappsrc");
    g_assert(m_pappsrc);

    caps = gst_caps_new_simple("video/x-raw",
                               "format", G_TYPE_STRING, "I420",
                               "bpp", G_TYPE_INT, 12,
                               "depth", G_TYPE_INT, 8,
                               "width", G_TYPE_INT, 1920,
                               "height", G_TYPE_INT, 1080,
                               "pitch", G_TYPE_INT, 2048,
                               "framerate", GST_TYPE_FRACTION, 30, 1,
                               NULL);

    gst_app_src_set_caps(GST_APP_SRC(m_pappsrc), caps);
    GstStateChangeReturn state_ret = gst_element_set_state((GstElement*)m_ppipeline, GST_STATE_PLAYING);
    g_warning("set state returned %d\n", state_ret);

    m_videoWriter.open(aaDebugPipeline, 0, (double)30, cv::Size(1920, 1080), false);
    if (!m_videoWriter.isOpened()) {
        REPORT_ERROR("can't create writer\n");
        exit(1);
    }


// call VideoWriter::write

m_videoWriter.write(imgY);

However, I am not able to specify the format of the buffer when calling
m_videoWriter.write(). Is there a way I can specify the video frame format
to m_videoWriter.write() with this method?

b) Attach a GstVideoMeta to a GstBuffer and pass that buffer to
gst_app_src_push_buffer().

void initAppSrc()
{
    GstStateChangeReturn state_ret;

    m_offset[0] = m_offset[1] = m_offset[2] = 0;
    m_stride[0] = 2048; // TBD: FIXME: no magic numbers
    m_stride[1] = 1024;
    m_stride[2] = 1024;

    m_ppipeline     = (GstPipeline*)gst_pipeline_new("mypipeline");
    m_pappsrc       = (GstAppSrc*)gst_element_factory_make("appsrc", "aa-appsrc");
    m_pvideoConvert = gst_element_factory_make("autovideoconvert", "aa-videoconvert");
    m_pencoder      = gst_element_factory_make("omxh265enc", "aa-videoencoder");
    m_pmux          = gst_element_factory_make("matroskamux", "aa-mux");
    m_pfsink        = gst_element_factory_make("filesink", "aa-filesink");

    g_assert(m_ppipeline);
    g_assert(m_pappsrc);
    g_assert(m_pvideoConvert);
    g_assert(m_pencoder);
    g_assert(m_pmux);
    g_assert(m_pfsink);

    g_signal_connect(m_pappsrc, "need-data", G_CALLBACK(start_feed), this);
    g_signal_connect(m_pappsrc, "enough-data", G_CALLBACK(stop_feed), this);

    g_object_set(G_OBJECT(m_pfsink), "location", "test1.mkv", NULL);

    gst_bin_add_many(GST_BIN(m_ppipeline), (GstElement*)m_pappsrc,
                     m_pvideoConvert, m_pencoder, m_pmux, m_pfsink, NULL);

    if (!gst_element_link_many((GstElement*)m_pappsrc, m_pvideoConvert,
                               m_pencoder, m_pmux, m_pfsink, NULL)) {
        g_warning("failed to link appsrc, autovideoconvert, encoder, muxer, and filesink");
    }

    state_ret = gst_element_set_state((GstElement*)m_ppipeline, GST_STATE_PLAYING);
    g_warning("set state returned %d\n", state_ret);
}

static gboolean read_data(gst_app_t *app)
{

// I would need to wait for opencv output here ;
// Can this thread block ??

}

static void start_feed (GstElement * pipeline, guint size, gst_app_t *app)
{
 if (app->sourceid == 0) {
  GST_DEBUG ("start feeding");
  app->sourceid = g_idle_add ((GSourceFunc) read_data, app);
 }
}

static void stop_feed (GstElement * pipeline, gst_app_t *app)
{
 if (app->sourceid != 0) {
  GST_DEBUG ("stop feeding");
  g_source_remove (app->sourceid);
  app->sourceid = 0;
 }
}

void pushFrame()
{
    int size       = 1920*1080*3/2;
    m_pgstBuffer   = gst_buffer_new_wrapped_full((GstMemoryFlags)0,
                                                 (gpointer)(img.data), size, 0, size, NULL, NULL);
    m_pgstVideoMeta = gst_buffer_add_video_meta_full(m_pgstBuffer, GST_VIDEO_FRAME_FLAG_NONE,
                                                     GST_VIDEO_FORMAT_I420, 1920, 1080, 3,
                                                     m_offset, m_stride);

    // ref buffer to give a copy to appsrc
    gst_buffer_ref(m_pgstBuffer);

    GstFlowReturn ret;
    ret = gst_app_src_push_buffer((GstAppSrc*)m_pappsrc, m_pgstBuffer);
    if (ret != GST_FLOW_OK)
    {
        g_printerr("could not push buffer\n");
        g_printerr("ret enum: %i\n", ret);
    }

    // decrease the ref count so that we can edit the data on the next run
    gst_buffer_unref(m_pgstBuffer);
}


Here I am able to specify the input GstBuffer format, but I do not quite
know how to supply the output of my OpenCV code to a function like
pushFrame(). Alternatively, if I make a queue between the output of my
OpenCV code and the appsrc read function, read_data(), can that function
block and wait for OpenCV output to arrive?

Thanks








Re: Encode YUV420 buffer with appsrc

Nicolas Dufresne-5
On Friday, 6 October 2017 at 11:41 -0700, pchaurasia wrote:

> I would like to encode output of my opencv algorithm. THe buffer to
> encode is
> not available exactly at 33 ms for encoder to encode at 30 fps.  It
> seems
> there are two way I can send buffer-to-be-encoded to appsrc
>
> a) VideoWriter::write method.
>
>     GstCaps *caps;
>
>     string aaDebugPipeline          = "appsrc name=myappsrc !
> autovideoconvert ! omxh265enc ! matroskamux ! filesink
> location=test.mkv ";    
>     m_ppipeline                        =
> gst_parse_launch(aaDebugPipeline.c_str(),NULL);
>     g_assert (m_ppipeline);
>
>     m_pappsrc                          = gst_bin_get_by_name
> (GST_BIN(m_ppipeline), "myappsrc");
>     g_assert (m_pappsrc);
>
> caps = gst_caps_new_simple ("video/x-raw",
> "format",G_TYPE_STRING,"I420",
> "bpp",G_TYPE_INT,12,
> "depth",G_TYPE_INT,8,
> "width", G_TYPE_INT, 1920,
> "height", G_TYPE_INT, 1080,
> "pitch", G_TYPE_INT, 2048,
This field is invalid and will be ignored. You need to use both caps
and VideoMeta.

> "framerate", GST_TYPE_FRACTION, 30, 1,
> NULL);
>
>     gst_app_src_set_caps(GST_APP_SRC(appsrc), caps);
>      GstStateChangeReturn state_ret =
> gst_element_set_state((GstElement*)m_ppipeline, GST_STATE_PLAYING);
>      g_warning("set state returned %d\n", state_ret);  
>
>    m_videoWriter.open(aaDebugPipeline,0, (double)30, cv::Size(1920,
> 1080),
> false);
>    if (!m_videoWriter.isOpened()) {
>       REPORT_ERROR("can't create writer\n");
>       exit(1);
>    }
>
>
> // call VideoWriter::write
>
> m_videoWriter.write (cv::Mat imgY)  
>
> However I am not able to specify format of buffer while calling
> m_videoWriter::write() . Is there a way I can specify video frame
> format to
> m_videoWriter.write() , in this methods ?
>
> b) Can attach GstVideoMeta to a GstBuffer and pass that buffer to
> gst_app_src_push_buffer().
>
> void initAppSrc()
> {
>     GstStateChangeReturn state_ret;
>
>     m_offset[0] = m_offset[1] = m_offset[2] = 0;
>     m_stride[0] = 2048; // TBD : FIXME : no magic no
>     m_stride[1] = 1024;
>     m_stride[2] = 1024;
>
>     m_ppipeline            =
> (GstPipeline*)gst_pipeline_new("mypipeline");
>     m_pappsrc              =
> (GstAppSrc*)gst_element_factory_make("appsrc",
> "aa-appsrc");
>     m_pvideoConvert        =
> gst_element_factory_make("autovideoconvert",
> "aa-videoconvert");
>     m_pencoder             = gst_element_factory_make("omxh265enc",
> "aa-videoencoder");
>     m_pmux                 = gst_element_factory_make("matroskamux",
> "aa-mux");
>     m_pfsink               = gst_element_factory_make("filesink",
> "aa-filesink");
>
>
>     g_assert(m_ppipeline);
>     g_assert(m_pappsrc);
>     g_assert(m_pvideoConvert);
>     g_assert(m_pencoder);
>     g_assert(m_pmux);
>     g_assert(m_pfsink);
>
>
>     g_signal_connect(m_pappsrc, "need-data", G_CALLBACK(start_feed),
> this);
>     g_signal_connect(m_pappsrc, "enough-data", G_CALLBACK(stop_feed),
> this);
>
>     g_object_set( G_OBJECT( m_pfsink ), "location", "test1.mkv", NULL
> );
>
>     gst_bin_add_many(GST_BIN(m_ppipeline), (GstElement*)m_pappsrc,
> m_pvideoConvert, m_pencoder, m_pmux, m_pfsink, NULL);
>
>     if(!gst_element_link_many((GstElement*)m_pappsrc,
> m_pvideoConvert,
> m_pencoder, m_pmux, m_pfsink)){
>        g_warning("failed to link appsrc, autovideoconvert, encoder,
> muxer,
> and filesink");
>     }
>
>     state_ret = gst_element_set_state((GstElement*)m_ppipeline,
> GST_STATE_PLAYING);
>     g_warning("set state returned %d\n", state_ret);
> }
>
> static gboolean read_data(gst_app_t *app)
> {
>
> // I would need to wait for opencv output here ;
> // Can this thread block ??
It's usually a bad idea to block in an idle callback; if you had a UI
running on the GMainLoop thread, you'd get a big disaster.

Instead, block directly in the need-data callback by looping and pushing,
and stop once the enough-data callback has fired (just set a flag when it
is called; it will likely be called while you are pushing, in a re-entrant
fashion).
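
That control flow can be sketched without GStreamer at all. The names below are hypothetical stand-ins for the real callbacks (the real need-data handler receives a GstElement pointer and pushes real GstBuffers); the point is only the loop-until-flag shape Nicolas describes:

```c
#include <assert.h>
#include <stdbool.h>

#define QUEUE_CAPACITY 4

static int  queued = 0;      /* frames currently in the simulated appsrc queue */
static bool enough = false;  /* flag set by the enough-data callback           */

/* Stand-in for gst_app_src_push_buffer(): enqueue one frame, and fire the
 * enough-data flag re-entrantly once the queue reaches capacity. */
static void push_one_frame(void)
{
    queued++;
    if (queued >= QUEUE_CAPACITY)
        enough = true;       /* what the enough-data callback would do */
}

/* Stand-in for the need-data callback: keep pushing until enough-data has
 * fired, instead of scheduling an idle callback that might block the
 * GMainLoop. Returns the number of frames pushed. */
static int on_need_data(void)
{
    int pushed = 0;
    enough = false;
    while (!enough) {
        push_one_frame();    /* may set 'enough' re-entrantly */
        pushed++;
    }
    return pushed;
}
```

Starting from an empty queue, one need-data cycle pushes exactly QUEUE_CAPACITY frames and then stops.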

>
> }
>
> static void start_feed (GstElement * pipeline, guint size, gst_app_t
> *app)
> {
>  if (app->sourceid == 0) {
>   GST_DEBUG ("start feeding");
>   app->sourceid = g_idle_add ((GSourceFunc) read_data, app);
>  }
> }
>
> static void stop_feed (GstElement * pipeline, gst_app_t *app)
> {
>  if (app->sourceid != 0) {
>   GST_DEBUG ("stop feeding");
>   g_source_remove (app->sourceid);
>   app->sourceid = 0;
>  }
> }
>
> void pushFrame()
> {
>     int size              = 1920*1080*1.5;
>     m_pgstBuffer          = gst_buffer_new_wrapped_full(
> (GstMemoryFlags)0,
> (gpointer)(img.data), size, 0, size, NULL, NULL );
>     m_pgstVideoMeta       =
> gst_buffer_add_video_meta_full(m_pgstBuffer,GST_VIDEO_FRAME_FLAG_NONE
> ,
> GST_VIDEO_FORMAT_I420, 1920,1080, 3, m_offset, m_stride );
>
>     //ref buffer to give copy to appsrc
>     gst_buffer_ref(m_pgstBuffer);
>
>
>     GstFlowReturn ret;
>     ret                  =
> gst_app_src_push_buffer((GstAppSrc*)m_pappsrc,
> m_pgstBuffer);
>     if(ret != GST_FLOW_OK)
>     {
>         g_printerr("could not push buffer\n");
>         g_printerr("ret enum: %i\n", ret);
>     }
>
>     //dec. ref count so that we can edit data on next run
>     gst_buffer_unref(m_pgstBuffer);
>
> }
>
>
> Here I am able to specify input GstBuffer format. But do not quite know how
> to supply output of my opencv code to a function like pushFrame() ? Or
> alternatively, if I make a queue between output of my opencv and appsrc read
> function , read_data(), can this function block and wait for opencv output
> to arrive ?
appsrc already provides a queue; just configure its size and enough-data
will tell you when it's full.

>
> Thanks
>
>
>
>
>
>
>


Re: Encode YUV420 buffer with appsrc

pchaurasia
Thanks Nicolas, for all the help.

I wrote a small stand-alone program that builds a simple encoder pipeline
driven by appsrc. I think I followed all of your aforementioned suggestions.

I am able to run it, and I do get a compressed video file (test1.mp4);
however, that file does not contain much other than headers. I tried to
play it with mplayer and it would not play. I think I am missing something.
It would be great if you could take a look at my code to spot the problem.

I am compiling this on a TX2 box, which runs Ubuntu. The compilation
command is at the top of the file.

Thanks
main.cpp <http://gstreamer-devel.966125.n4.nabble.com/file/t377786/main.cpp>  




Re: Encode YUV420 buffer with appsrc

Antonio Ospite-2
On Tue, 10 Oct 2017 11:25:58 -0700 (MST)
pchaurasia <[hidden email]> wrote:

> Thanks Nicolas, for all the help.
>
> I wrote a small stand alone code to make simple encoder pipeline driven by
> appsrc. I think I followed all aforementioned suggestions from you.
>

With a test program it is easier to discuss.

If I may add a side note, it is possible to make the test program more
concise by using gst_parse_launch() to build the pipeline and retrieving
the app source with gst_bin_get_by_name(); something like:

data.pipeline = gst_parse_launch
    ("appsrc name=audio_source ! ... ! filesink location=test1.mp4",
     NULL);
data.app_source =
  gst_bin_get_by_name (GST_BIN (data.pipeline), "audio_source");

When asking for help try to provide the *minimum* program you can come
up with which still reproduces your problem.

> I am able to run it. I do get a compressed video file (test1.mp4), however,
> that file does not have much other than headers. I tried to play it using
> mplayer and it would not play. I think, I am missing something. Would be
> great if you could give a look to my code to spot the problem.

The problem seems to be about buffer timestamps. The buffers are created
successfully (you can verify that with this pipeline:

  "appsrc name=audio_source ! fakesink dump=1"

) but then they are discarded, presumably because they have invalid
timestamps.

The following patch is just a proof-of-concept which produces a valid
encoded video (note that I had to change some headers and use x264enc on
my system):

$ diff -pruN main.cpp.orig main.cpp
--- main.cpp.orig 2017-10-10 23:15:30.185947947 +0200
+++ main.cpp 2017-10-10 23:33:50.467881200 +0200
@@ -9,11 +9,9 @@
 #include <string.h>
 
 
-#include "opencv2/imgcodecs.hpp"
-#include "opencv2/imgproc.hpp"
-#include "opencv2/videoio.hpp"
-#include <opencv2/highgui.hpp>
-#include <opencv2/video.hpp>
+#include "opencv2/imgproc/imgproc.hpp"
+#include <opencv2/highgui/highgui.hpp>
+#include <opencv2/video/video.hpp>
 
 
 #include <iostream>
@@ -158,6 +156,8 @@ static void start_feed (GstElement *sour
     //ref buffer to give copy to appsrc
     gst_buffer_ref(m_pgstBuffer);
 
+    GST_BUFFER_DTS (m_pgstBuffer) = 0;
+    GST_BUFFER_PTS (m_pgstBuffer) = 0;
 
     g_print ("Signalling push-buffer with new buffer\n");
 
@@ -253,7 +253,7 @@ main (int argc, char * argv[])
   data.app_source             = gst_element_factory_make ("appsrc", "audio_source");
   //data.app_sink               = gst_element_factory_make ("appsink", "app_sink");
   data.m_pvideoConvert        = gst_element_factory_make("autovideoconvert", "aa-videoconvert");
-  data.m_pencoder             = gst_element_factory_make("omxh265enc", "aa-videoencoder");
+  data.m_pencoder             = gst_element_factory_make("x264enc", "aa-videoencoder");
   data.m_pmux                 = gst_element_factory_make("matroskamux", "aa-mux");
   data.m_pfsink               = gst_element_factory_make("filesink", "aa-filesink");
   data.sourceid               = 0;
@@ -285,6 +285,8 @@ main (int argc, char * argv[])
   g_signal_connect (data.app_source, "need-data", G_CALLBACK (start_feed), &data);
   g_signal_connect (data.app_source, "enough-data", G_CALLBACK (stop_feed), &data);
 
+  g_object_set (data.app_source, "format", GST_FORMAT_TIME, NULL);
+
 
   /* Configure appsink */
   //g_object_set (data.app_sink, "emit-signals", TRUE, "caps", caps, NULL);


The final result may still not be "correct", tho.
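
One remaining gap: the patch hardcodes both timestamps to 0, which only gets the first frame through cleanly. For a real stream each buffer needs an increasing PTS in nanoseconds. A sketch of just that arithmetic, assuming a fixed frame rate (the helper name is hypothetical; in GStreamer you would typically use gst_util_uint64_scale for the same computation):

```c
#include <assert.h>
#include <stdint.h>

/* Nanosecond timestamp for frame n of a fixed-rate stream, suitable for
 * assigning to GST_BUFFER_PTS. fps is given as a fraction num/den. */
static uint64_t frame_pts_ns(uint64_t n, uint64_t fps_num, uint64_t fps_den)
{
    return n * 1000000000ULL * fps_den / fps_num;
}
```

For 30/1 fps, frame 0 gets PTS 0 and frame 30 gets PTS 1000000000 (one second in); the per-buffer duration is frame_pts_ns(1, 30, 1).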

Ciao,
   Antonio

--
Antonio Ospite
https://ao2.it
https://twitter.com/ao2it

A: Because it messes up the order in which people normally read text.
   See http://en.wikipedia.org/wiki/Posting_style
Q: Why is top-posting such a bad thing?

Re: Encode YUV420 buffer with appsrc

pchaurasia
Hi Antonio,

Thank you very much for your precise help. My apologies for leaving some
fluff behind in my code. I will make it concise.

With your changes I am able to 'move' and I do get a valid bitstream (which
plays with decoder ) and valid picture (which appears visually correct).
Thanks for the help.

When I move away from the experimental code (i.e. the code where you
suggested changes) and use my actual production code, I have a problem. We
made your changes in our production code. Previously (i.e. without your
recent changes) we were getting an mp4 bitstream which just had a header
and would not play. Now, with your changes, we are getting a legal
bitstream which mplayer plays, but the picture is visually garbage.

I suspect that this could be due to our buffer format. In our production
code we have three matrices, one each for Y, U and V. The data for these
matrices is not contiguous in memory. My question is how to specify the
start of each of these buffers in the gst_buffer_add_video_meta_full()
call. The offset array that gst_buffer_add_video_meta_full() takes as
input would be the way to specify the start of each of the Y, U and V
buffers. Can it take negative offsets? What is the reference point of this
offset? I.e., I am thinking:

offset[0] = 0; // this specifies start of Y buffer
offset[1] = YStart - UStart; // this specifies start of U from Y
offset[2] = YStart - VStart; // this specifies start of V from Y

Also, the strides are different for the Y and UV buffers in our case. I
specify the strides like:

stride[0] = 2048;           // Y buffer stride
stride[1] = 1024;           // U buffer stride
stride[2] = 1024;           // V buffer stride

Our YUV buffer is 1920x1080 resolution; however, due to the memory
organization, the strides are different. Please let me know your thoughts.
The aforesaid settings are still not working in our production code, but
with your changes we do get a legal bitstream, which did not happen before.

Thanks





Re: Encode YUV420 buffer with appsrc

Antonio Ospite-2
On Sun, 15 Oct 2017 00:59:48 -0700 (MST)
pchaurasia <[hidden email]> wrote:

[...]

> When I move away from the experimental code (i.e. code where you suggested
> changes) and use my actual production code, I have problem. We made with
> your changes in out production code. Previously (i.e. without your recent
> changes)we were  getting an mp4 bitstream,  which just had an header and it
> wouldnot play. Now with your changes we are getting a legal bitstream which
> is played by mplayer bit the picture is visually garbage.
>
> I am suspecting that this could be due to our buffer format. In our
> production code we have three matrices - one each for Y, U and V. The data
> for each of these matrix, is 'not contiguous' in memory.

So does your data look something like this?

YYYYYYYYPP
YYYYYYYYPP
...
UUUUP
UUUUP
...
VVVVP
VVVVP
...

> My question is how
> to specify start of each of these buffers to
> gst_buffer_add_video_meta_full() call. The offset array that
> gst_buffer_add_video_meta_full(), takes in as input, would be the way to
> specify start of each of Y, U and V buffer. Can it take negative offsets ?
> What is reference of this offset - i.e. I am thinking
>
> offset[0] = 0; // this specifies start of Y buffer
> offset[1] = YStart - UStart; // this specifies start of U from Y
> offset[2] = YStart - VStart; // this specifies start of V from Y
>
> Also strides are different Y and UV buffers, in our case. I specify strides
> like -
>
> stride[0] = 2048;           // Y Buffer stride
> stride[1] = 1024;           // U buffer stride
> stride[2] = 1024;           // V buffer stride
>
> Our YUV buffer is 1920x1080 resolution. However, due to memory organization
> - the strides are different. Please let me know your thoughts. Aforesaid
> settings are still not working, in our production code, but with you changes
> we do get legal bitstream which did not happen before your changes.
>

The first quick experiment I'd do is to produce a 2048x1080 video assuming
contiguous data, and see how it looks.

Could you also dump the original data to a file you can share?
Just one frame is enough.

Then you could make your self-contained test program read back the data
from a file for the appsrc to push downstream the pipeline, even if
it's always the same image this can still be used to validate the
final visual result you are after.

With this setup it would be easier for us to replicate your issue.

I never used gst_buffer_add_video_meta_full() so I might be interested
to play with it a little bit.

Ciao,
   Antonio

--
Antonio Ospite
https://ao2.it
https://twitter.com/ao2it


Re: Encode YUV420 buffer with appsrc

pchaurasia
Hi Antonio,

Sorry about the delay in my response. I have not yet been able to try your
idea and dump the image. I will do so and get back soon.

Yes, our image is in exactly the format you described: the Y plane followed
by padding on each line, then the U plane (with padding on each line) and
the V plane.

Thanks




Re: Encode YUV420 buffer with appsrc

Antonio Ospite-2
On Mon, 23 Oct 2017 11:23:41 -0700 (MST)
pchaurasia <[hidden email]> wrote:

> Hi Antonio,
>
> Sorry about delay in my response. I have not been able to try your idea and
> dump the image.  I will do so and get back soon.
>

Ah, take your time, and thanks for the heads up.

Ciao,
   Antonio

--
Antonio Ospite
https://ao2.it
https://twitter.com/ao2it


Re: Encode YUV420 buffer with appsrc

pchaurasia
Hi Antonio,

1. When I set the image width to 2048, I do get better-looking video.
However, it seems the colors are not quite right in the video. I think I
am 50% there.
2. I dumped the image as a raw pixel image of size 2048*1080*1.5, in the
YUV planar format you described above. I am attaching the image.
3. I am attaching the code which loads this image and tries to encode it.
The code also displays the image before encoding. When I run it, I see
that the pre-encoded image displayed by imshow() has the right colors;
however, the encoded image does not.

Thanks

frame0065.jpg
<http://gstreamer-devel.966125.n4.nabble.com/file/t377786/frame0065.jpg>  
main3.cpp
<http://gstreamer-devel.966125.n4.nabble.com/file/t377786/main3.cpp>  




Re: Encode YUV420 buffer with appsrc

Antonio Ospite-2
On Tue, 24 Oct 2017 02:39:41 -0700 (MST)
pchaurasia <[hidden email]> wrote:

> Hi Antonio,
>
> 1. When I set image width to 2048, I do get better looking video. However,
> It seems colors are not perfectly right, in the video. I think I am 50%
> there.

This is because the data format is not actually I420 but YV12, the U and
V plane are switched.

Setting the width equal to the stride helped to understand this.
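
In memory, the only difference between I420 and YV12 is the order of the two chroma planes, so the fix is just swapping two offsets. A small sketch of the buffer-relative chroma offsets for both layouts, using the 2048/1024 strides from this thread (the helper name is hypothetical):

```c
#include <assert.h>
#include <stddef.h>

/* Chroma plane offsets for a planar 4:2:0 frame with luma stride 'ys' and
 * chroma stride 'cs', both in bytes. In I420 the U plane comes first after
 * Y; in YV12 the two chroma planes are swapped, so V comes first. */
static void chroma_offsets(int height, size_t ys, size_t cs, int is_yv12,
                           size_t *u_off, size_t *v_off)
{
    size_t first  = ys * height;               /* end of the Y plane       */
    size_t second = first + cs * (height / 2); /* end of the first chroma  */
    if (is_yv12) { *v_off = first; *u_off = second; }
    else         { *u_off = first; *v_off = second; }
}
```

Feeding YV12 data while declaring I420 therefore reads each chroma plane from the other one's offset, which is exactly the swapped-color symptom described above.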

> 2. I dumped the frame as a raw pixel image of size 2048*1080*1.5, in the
> YUV planar format you described above. I am attaching the image.

A very handy way to figure out the raw data format is to use the
rawvideoparse element. For example, I can correctly interpret the frame
you sent with the following script:

-----------------------------------------------------------------------
#!/bin/sh

set -e
set -x

WIDTH=1920
HEIGHT=1080

STRIDE=2048

STRIDE1=$STRIDE
STRIDE2=$(($STRIDE / 2))
STRIDE3=$(($STRIDE / 2))

OFFSET1=0
OFFSET2=$(($STRIDE * $HEIGHT))
OFFSET3=$(($STRIDE * $HEIGHT + ($STRIDE / 2) * ($HEIGHT / 2)))

gst-launch-1.0 filesrc location=frame0065.jpg ! \
  rawvideoparse \
    width=$WIDTH \
    height=$HEIGHT \
    format=yv12 \
    plane-strides="<$STRIDE1,$STRIDE2,$STRIDE3>" \
    plane-offsets="<$OFFSET1,$OFFSET2,$OFFSET3>" ! \
  videoconvert ! pngenc ! filesink location=out.png
-----------------------------------------------------------------------

> 3. I am attaching code which loads this image and tries to encode it.
> The code also displays the image before encoding. When I run it, the
> pre-encoded image displayed by imshow() has the right colors; however,
> the encoded image does not.

To set the offsets you have to calculate the plane sizes in bytes as
above; then set the right pixel format and you will have an almost correct
image. I say _almost_ because there is still a border on the right.

You can fix that by differentiating again between the width of the final
image and the stride of the raw data.

int stride = 2048;
int width = 1920;

and use these variables appropriately, as you were doing before.

I didn't do that in the patch below to keep the changes at a minimum.

$ diff -pruN main3.cpp.orig main3.cpp
--- main3.cpp.orig 2017-10-24 16:57:14.209146385 +0200
+++ main3.cpp 2017-10-24 17:38:22.660606798 +0200
@@ -47,7 +47,7 @@ typedef struct _CustomData {
 
 int width             = 2048;
 int height            = 1080;
-static int fno        = 60;
+static int fno        = 65;
 unsigned char         *imgdata;
 int dx                = 0;
 int dy                = 0;
@@ -64,8 +64,13 @@ static void start_feed (GstElement *sour
   gsize            m_offset[3];
   gint             m_stride[3];
 
+  // at the beginning of the data
+  m_offset[0] = 0;
+  // after the first plane
+  m_offset[1] = width * height;
+  // after the first and second plane
+  m_offset[2] = width * height + (width / 2) * (height / 2); // or: width * height * 1.25
 
-  m_offset[0] = m_offset[1] = m_offset[2] = 0;
   m_stride[0] = width;
   m_stride[1] = width/2;
   m_stride[2] = width/2;
@@ -109,7 +114,7 @@ static void start_feed (GstElement *sour
 #endif
 
     m_pgstBuffer           = gst_buffer_new_wrapped_full( (GstMemoryFlags)0, (gpointer)(inputImgGray.data), size, 0, size, NULL, NULL );
-    m_pgstVideoMeta        = gst_buffer_add_video_meta_full(m_pgstBuffer,GST_VIDEO_FRAME_FLAG_NONE, GST_VIDEO_FORMAT_I420, width,height, 3, m_offset, m_stride );
+    m_pgstVideoMeta        = gst_buffer_add_video_meta_full(m_pgstBuffer,GST_VIDEO_FRAME_FLAG_NONE, GST_VIDEO_FORMAT_YV12, width,height, 3, m_offset, m_stride );
 
     //ref buffer to give copy to appsrc
     gst_buffer_ref(m_pgstBuffer);
@@ -176,7 +181,7 @@ main (int argc, char * argv[])
   data.app_source             = gst_element_factory_make ("appsrc", "audio_source");
   //data.app_sink               = gst_element_factory_make ("appsink", "app_sink");
   data.m_pvideoConvert        = gst_element_factory_make("autovideoconvert", "aa-videoconvert");
-  data.m_pencoder             = gst_element_factory_make("omxh265enc", "aa-videoencoder");
+  data.m_pencoder             = gst_element_factory_make("x264enc", "aa-videoencoder");
   data.m_pmux                 = gst_element_factory_make("matroskamux", "aa-mux");
   data.m_pfsink               = gst_element_factory_make("filesink", "aa-filesink");
   data.sourceid               = 0;
@@ -192,11 +197,11 @@ main (int argc, char * argv[])
   }
 
 
-  gst_video_info_set_format(&info, GST_VIDEO_FORMAT_I420, width, height);
+  gst_video_info_set_format(&info, GST_VIDEO_FORMAT_YV12, width, height);
   caps                 = gst_video_info_to_caps(&info);
 
   caps = gst_caps_new_simple ("video/x-raw",
- "format",G_TYPE_STRING,"I420",
+ "format",G_TYPE_STRING,"YV12",
  "width", G_TYPE_INT, width,
  "height", G_TYPE_INT, height,
  "framerate", GST_TYPE_FRACTION, 30, 1,


Ciao ciao,
   Antonio


Re: Encode YUV420 buffer with appsrc

pchaurasia
Hi Antonio

Apologies again for being late on this. I tried your suggestion of using
YV12, but that does not help. I tried a few different combinations of
sizes and strides, and none of those guesses helped.

After some more debugging, I suspect the problem lies in my understanding
of the hardware, so I'm seeking help from the NVIDIA folks. My test code,
for reproducing the issue on NVIDIA hardware, can be found at:

 https://github.com/pcgamelore/SingleCameraPlaceholder

It would be great if you could glance through it, since without the actual
hardware it will be hard for you to reproduce the issue. I am able to
generate a legal H.265 (MP4) bitstream from the appsrc buffers, but it has
lots of artifacts.

Thanks,




Re: Encode YUV420 buffer with appsrc

pchaurasia
Hi Antonio,

I seem to be running into another issue. This one occurs when the encoder
is shutting down. The following line, at aaDebug.cpp line 234, seems to
block execution indefinitely:

state_ret = gst_element_set_state((GstElement*)m_ppipeline, GST_STATE_NULL);

The code can be found at :

 https://github.com/pcgamelore/SingleCameraPlaceholder

Please let me know if there is something which I am missing.

Thanks




Re: Encode YUV420 buffer with appsrc

Michael MacIntosh
Hey,

I was running into a similar issue; usually it happens because the state
change deadlocked. In my case, I was changing the state to NULL too soon
after setting it to PLAYING, and adding a delay fixed it.

Looking at your code, you call stop recording and then shutdown, which
calls stop recording again. So you may be setting the state to NULL
twice, the second time while the pipeline is still asynchronously
changing to NULL, which might be the cause of your deadlock. You could
try polling the bus to wait for the state change to finish.

Hope that helps!


On 12/6/2017 10:58 PM, pchaurasia wrote:

> Hi Antonio,
>
> I seem to be running into another issue. This one comes when the encoder is
> looking to terminate. The following lines in aaDebug.cpp line no.234 seem to
> be blocking the code execution indefinitely :
>
> state_ret = gst_element_set_state((GstElement*)m_ppipeline, GST_STATE_NULL);
>
> The code can be found at :
>
>   https://github.com/pcgamelore/SingleCameraPlaceholder
>
> Please let me know if there is something which I am missing.
>
> Thanks
>
>
>
