Gstreamer output to a buffer then file On Jetson tegra Tx2

11 messages
Gstreamer output to a buffer then file On Jetson tegra Tx2

new baby
Hi Experts,

I am new to the GStreamer world of multimedia and am stuck on a GStreamer issue;
kindly help me solve it.
I am doing capturing + video encoding + audio encoding + muxing on a Tegra
TX2.

Current status:
Completed tasks:
1) Capturing video + encoding video (*I used the Tegra Multimedia APIs*) and
storing the result in a buffer.

2) Capturing audio + encoding audio + muxing (*I am using the GStreamer API*).
GStreamer takes the encoded video data from my buffers, captures audio,
encodes the audio data, muxes everything, and writes it to an .mp4 file.

*Problem:*

*I want the muxed data to be stored in a local buffer* (char* pst =
(char*)malloc(10000);)

1) How can I store GStreamer output to a local buffer?
or 2) How can I pass a local buffer to GStreamer so it stores its output there?

The code & compilation procedure are at this link:
https://devtalk.nvidia.com/default/topic/1029025/how-to-capture-audio-on-tx2-/#5236282

I have also attached the latest file.

main.cpp <http://gstreamer-devel.966125.n4.nabble.com/file/t378175/main.cpp>
 



--
Sent from: http://gstreamer-devel.966125.n4.nabble.com/
_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
Re: Gstreamer output to a buffer then file On Jetson tegra Tx2

Thornton, Keith
Hi, you could do this by ending one pipeline with an appsink and starting a separate pipeline with an appsrc. There are probably other ways to do this, depending on what you are really trying to achieve.

Re: Gstreamer output to a buffer then file On Jetson tegra Tx2

new baby
Hi Thornton,

Thanks for the quick reply. *Can you suggest an example?* I am not very
familiar with GStreamer.

*I have pasted the core code where the actual pipeline string is built:
{ gst_pipeline = (GstPipeline*)gst_parse_launch()..; }*

I want to capture audio & video, encode audio & video, mux the data, and
store it in a buffer.
The tricky part is that the video is handled by the Tegra APIs, while the
audio and muxing are handled by GStreamer.

Below is the core code in two parts:
1) audio part + mux part, and
2) passing the encoded video data to GStreamer.


*1) audio part + mux part*
static bool execute()
{
    GMainLoop *main_loop;
    GstPipeline *gst_pipeline = NULL;
    GError *err = NULL;
    GstElement *appsrc_;

    gst_init (0, NULL);
    main_loop = g_main_loop_new (NULL, FALSE);
    char launch_string_[1024];

    sprintf(launch_string_,
            "appsrc name=mysource ! "
            "video/x-h264,width=%d,height=%d,stream-format=byte-stream !",
            STREAM_SIZE.width(), STREAM_SIZE.height());
    sprintf(launch_string_ + strlen(launch_string_),
            " h264parse ! flvmux name=mux alsasrc device=plughw:2 ! "
            "audioresample ! audio/x-raw,rate=48000,channels=1 ! queue ! "
            "voaacenc bitrate=32000 ! queue ! mux. mux. ! queue ! "
            "filesink location=a.mp4 ");
    printf("\n cmd of gstreamer = %s \n", launch_string_);
    gst_pipeline = (GstPipeline*)gst_parse_launch(launch_string_, &err);
    appsrc_ = gst_bin_get_by_name(GST_BIN(gst_pipeline), "mysource");
    gst_app_src_set_stream_type(GST_APP_SRC(appsrc_),
                                GST_APP_STREAM_TYPE_STREAM);
    gst_element_set_state((GstElement*)gst_pipeline, GST_STATE_PLAYING);

    // Create the CameraProvider object and get the core interface.
    UniqueObj<CameraProvider> cameraProvider =
        UniqueObj<CameraProvider>(CameraProvider::create());
    ICameraProvider *iCameraProvider =
        interface_cast<ICameraProvider>(cameraProvider);
    if (!iCameraProvider)
        ORIGINATE_ERROR("Failed to create CameraProvider");

    // Get the camera devices.
    std::vector<CameraDevice*> cameraDevices;
    iCameraProvider->getCameraDevices(&cameraDevices);
    if (cameraDevices.size() == 0)
        ORIGINATE_ERROR("No cameras available");
    // ... other code ...
}





*2) Passing the encoded video data to GStreamer*

bool ConsumerThread::encoderCapturePlaneDqCallback(struct v4l2_buffer *v4l2_buf,
                                                   NvBuffer *buffer,
                                                   NvBuffer *shared_buffer,
                                                   void *arg)
{
    ConsumerThread *thiz = (ConsumerThread*)arg;

    if (!v4l2_buf)
    {
        thiz->abort();
        ORIGINATE_ERROR("Failed to dequeue buffer from encoder capture plane");
    }

#if 1
    // printf("\n encoderCapturePlaneDqCallback \n");
    if (buffer->planes[0].bytesused > 0)
    {
        GstBuffer *gstbuf;
        GstMapInfo map = {0};
        GstFlowReturn ret;
        gstbuf = gst_buffer_new_allocate (NULL, buffer->planes[0].bytesused,
                                          NULL);
        gstbuf->pts = thiz->timestamp;
        thiz->timestamp += 33333333; // ns

        gst_buffer_map (gstbuf, &map, GST_MAP_WRITE);
        memcpy(map.data, buffer->planes[0].data, buffer->planes[0].bytesused);
        gst_buffer_unmap(gstbuf, &map);

        g_signal_emit_by_name (thiz->m_appsrc_, "push-buffer", gstbuf, &ret);
        gst_buffer_unref(gstbuf);
    }
    else
    {
        gst_app_src_end_of_stream((GstAppSrc *)thiz->m_appsrc_);
        sleep(1);
    }
#else
    thiz->m_outputFile->write((char *) buffer->planes[0].data,
                              buffer->planes[0].bytesused);
#endif
    // ... other code ...
}




Re: Gstreamer output to a buffer then file On Jetson tegra Tx2

Thornton, Keith
Hi,
Start by reading the section on inserting and removing data from the pipeline in
https://gstreamer.freedesktop.org/documentation/application-development/advanced/pipeline-manipulation.html
If you still have questions after that, get in touch again.
Regards


Re: Gstreamer output to a buffer then file On Jetson tegra Tx2

new baby
Hello Thornton,

Sure, I will go through the link you provided. The *Grabbing data with
appsink* section in that link is what I needed.
I will get back to you. Thanks!

Before this, I was going through this link:
https://gstreamer.freedesktop.org/documentation/tutorials/basic/short-cutting-the-pipeline.html







Re: Gstreamer output to a buffer then file On Jetson tegra Tx2

new baby
Hi Thornton,

I went through the link and made changes to the code based on it, but
something still seems to be missing.

I am feeding the encoded video data into appsrc and getting the muxed data
from appsink.

1) The appsrc part works without problems, but appsink is giving trouble.

a) At first it hung at *g_signal_emit_by_name (sink, "pull-preroll",
&sample, NULL);* because no data was available in the sink, so the whole
program hung.

b) I added a counter to wait for data in the sink; now the program runs,
but the data is always 13 bytes. I have written the muxed output buffer to
a file (just to test).

Kindly help me solve this.

I have attached the error screenshots (for points a & b) and the code for a
better understanding of the problem.

<http://gstreamer-devel.966125.n4.nabble.com/file/t378175/sink1.png>
<http://gstreamer-devel.966125.n4.nabble.com/file/t378175/sink.png>

code main.cpp
<http://gstreamer-devel.966125.n4.nabble.com/file/t378175/main.cpp>




Re: Gstreamer output to a buffer then file On Jetson tegra Tx2

new baby
Hi Thornton,

I solved it. Great, thanks for pointing me in the right direction!
*I have added the code below:*

        /* get sink */
        sink = gst_bin_get_by_name (GST_BIN (gst_pipeline), "sink");
        g_object_set(G_OBJECT(sink), "emit-signals", TRUE, "sync", FALSE, NULL);
        g_signal_connect(sink, "new-sample", G_CALLBACK(on_new_sample_from_sink),
                         NULL);

*But the video plays fast-forwarded; I still need to work on the pipeline string:*

        sprintf(launch_string_,
                "appsrc name=mysource ! "
                "video/x-h264,width=%d,height=%d,stream-format=byte-stream !",
                STREAM_SIZE.width(), STREAM_SIZE.height());

        sprintf(launch_string_ + strlen(launch_string_),
                " h264parse ! flvmux name=mux alsasrc device=plughw:2 ! "
                "audioresample ! audio/x-raw,rate=48000,channels=1 ! queue ! "
                "voaacenc bitrate=32000 ! queue ! mux. mux. ! queue ! "
                "appsink name=sink ");






 




Re: Gstreamer output to a buffer then file On Jetson tegra Tx2

new baby
Hi Thornton,
I am having difficulty syncing audio and video. I want MP4 output, but I am
not able to use the MP4 muxers: with them, the pipeline never enters the
on_new_sample_from_sink function, which is where I get the muxed data and
write it to the file.

Kindly help. I have attached the file for reference: main.cpp
<http://gstreamer-devel.966125.n4.nabble.com/file/t378175/main.cpp>
Thanks in advance.


In the function encoderCapturePlaneDqCallback I am setting the PTS; this is
also the only place where I fill data into appsrc:


                gstbuf->pts = thiz->timestamp;
                thiz->timestamp += 33333333; // ns

                gst_buffer_map (gstbuf, &map, GST_MAP_WRITE);
                memcpy(map.data, buffer->planes[0].data, buffer->planes[0].bytesused);
                gst_buffer_unmap(gstbuf, &map);

                g_signal_emit_by_name (thiz->m_appsrc_, "push-buffer", gstbuf, &ret);


Regards,
Aasim




Re: Gstreamer output to a buffer then file On Jetson tegra Tx2

Thornton, Keith
Hi, I'm afraid I can't see how you are attempting to sync your audio with your video. Your video PTS looks OK, incrementing every ~33 ms. Does your audio start at the same time, what is the sampling rate, and who is supplying the PTS for your audio?

Re: Gstreamer output to a buffer then file On Jetson tegra Tx2

new baby
a) *For audio I am not adding any PTS values; only for the video do I
increment the PTS* (in the appsrc that is filled with the encoded data).

b) Audio starts at the same time, but the video is captured + encoded by
different APIs and then fed into appsrc, while the audio is captured and
muxed by GStreamer.

c) The pipeline string:

        sprintf(launch_string_,
                "appsrc name=mysource ! "
                "video/x-h264,width=%d,height=%d,stream-format=byte-stream !",
                STREAM_SIZE.width(), STREAM_SIZE.height());
        sprintf(launch_string_ + strlen(launch_string_),
                " h264parse ! flvmux name=mux alsasrc device=plughw:2 ! "
                "audioresample ! audio/x-raw,rate=48000,channels=1 ! queue ! "
                "voaacenc bitrate=32000 ! queue ! mux. mux. ! queue ! "

d) The audio sampling rate is 48000.

d1) The video frame rate shows up as 1000 fps in VLC & MPlayer.

e) I am unable to use the MP4 muxer, as it gives problems (the audio
function is never entered, so no output file is written).

f) After getting the muxed output in a local buffer I write it to a file as
.mp4, but the muxer I am using is FLV, since the MP4 muxer is not working.


For the audio I am just capturing from GStreamer and muxing it with the
appsrc data. The audio plays properly, but the video is fast-forwarded.
If I remove the incremented PTS for the video, the audio also plays
fast-forwarded.

Question:
How do I sync video and audio when the video is captured separately?

This is the audio code:


/* called when the appsink notifies us that there is a new buffer ready for
 * processing */
static void on_new_sample_from_sink (GstElement * elt)
{
        gpointer raw_buffer;
        GstBuffer *buffer;
        GstMapInfo map = {0};
        GstSample *sample;

        /* get the buffer from appsink */
        g_signal_emit_by_name (sink, "pull-sample", &sample, NULL);
        if (sample)
        {
                buffer = gst_sample_get_buffer (sample);
                gst_buffer_map (buffer, &map, GST_MAP_READ);
                raw_buffer = g_malloc0 (map.size);
                memcpy (raw_buffer, map.data, map.size);
                printf("\n output sample = %ld \n", map.size);
                m_outputFile1->write((char *) raw_buffer, map.size);
                g_free (raw_buffer);

                gst_buffer_unmap (buffer, &map);
                gst_sample_unref(sample);
        }
}


The main function:


static bool execute()
{
        GMainLoop *main_loop;
        GstPipeline *gst_pipeline = NULL;
        GError *err = NULL;
        GstElement *appsrc_;

        gst_init (0, NULL);
        main_loop = g_main_loop_new (NULL, FALSE);
        char launch_string_[1024];

        sprintf(launch_string_,
                "appsrc name=mysource ! "
                "video/x-h264,width=%d,height=%d,stream-format=byte-stream !",
                STREAM_SIZE.width(), STREAM_SIZE.height());
        sprintf(launch_string_ + strlen(launch_string_),
                " h264parse ! flvmux name=mux alsasrc device=plughw:2 ! "
                "audioresample ! audio/x-raw,rate=48000,channels=1 ! queue ! "
                "voaacenc bitrate=32000 ! queue ! mux. mux. ! queue ! "
                "appsink name=sink ");
        // Alternative (not working for me): mux with MP4 instead of FLV:
        // sprintf(launch_string_ + strlen(launch_string_),
        //         " h264parse ! avmux_mp4 name=mux alsasrc device=plughw:2 ! "
        //         "audioresample ! audio/x-raw,rate=48000,channels=1 ! queue ! "
        //         "voaacenc bitrate=32000 ! queue ! mux. mux. ! queue ! "
        //         "appsink name=sink ");
        //         " h264parse ! qtmux ! filesink location=a.mp4 ");

        printf("\n cmd of gstreamer = %s \n", launch_string_);

        gst_pipeline = (GstPipeline*)gst_parse_launch(launch_string_, &err);
        appsrc_ = gst_bin_get_by_name(GST_BIN(gst_pipeline), "mysource");

        /* get sink */
        sink = gst_bin_get_by_name (GST_BIN (gst_pipeline), "sink");

        g_object_set(G_OBJECT(sink), "emit-signals", TRUE, "sync", FALSE, NULL);
        g_signal_connect(sink, "new-sample", G_CALLBACK(on_new_sample_from_sink),
                         NULL);

        gst_app_src_set_stream_type(GST_APP_SRC(appsrc_),
                                    GST_APP_STREAM_TYPE_STREAM);
        gst_element_set_state((GstElement*)gst_pipeline, GST_STATE_PLAYING);




Re: Gstreamer output to a buffer then file On Jetson tegra Tx2

new baby
In reply to this post by Thornton, Keith
Hi Thornton,

Cheers, it's done. Thanks for your guidance, it helped me a lot.
The culprit was the video frame rate: it was going to 1000 fps, which
caused the audio/video sync problem.

One remaining concern: I am using the FLV muxer, but the written file is
named .mp4 and it still plays fine.
If you have any idea why, please let me know.


Thanks again!
Regards,
Md Aasim Raza


