feeding images to imagefreeze


feeding images to imagefreeze

Brian McKeon-2

Hi all,

I'm hoping someone with more experience can point me in the right direction.

I'm trying to push JPEG images into a GStreamer pipeline. When an image gets put into the pipeline, I want it to stream that image until I signal the pipeline to pull the next image, and so on.

My Python code copies each image into a GstBuffer and pushes that into an appsrc element. The appsrc element then feeds into a decoder and then into an imagefreeze element.
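For context, that push step might look roughly like this with the GStreamer 1.0 Python bindings (a hedged sketch; `push_jpeg`, `appsrc_element`, and `jpeg_bytes` are illustrative names, not from the actual code):

```python
def push_jpeg(appsrc_element, jpeg_bytes):
    """Wrap raw JPEG bytes in a GstBuffer and push it into appsrc.

    Illustrative sketch only; assumes the GStreamer 1.0 Python
    bindings (PyGObject) are installed.
    """
    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst

    # new_wrapped() creates a GstBuffer that takes ownership of the data.
    buf = Gst.Buffer.new_wrapped(bytes(jpeg_bytes))
    # appsrc's "push-buffer" action signal returns a Gst.FlowReturn.
    return appsrc_element.emit("push-buffer", buf)
```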

It works fine, except that I can't make the imagefreeze element pull additional buffers from appsrc.

As soon as it has one buffer, it doesn't pull any others. The buffers just queue up in the appsrc element with nowhere to go.

Ideally I'd like to be able to send a signal to imagefreeze to tell it when to pull the next image buffer from appsrc, but I haven't been able to find any way of accomplishing this.

It occurred to me that maybe imagefreeze wasn't designed for this purpose and that I may need to go into the source code and add this feature myself.

But I first wanted to check with everyone to see if there's an easier way of accomplishing this.

All ideas are welcome! :)

Thanks so much!

Brian

_______________________________________________
gstreamer-devel mailing list
[hidden email]
http://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

Re: feeding images to imagefreeze

Tim-Philipp Müller-2
On Wed, 2013-02-06 at 07:25 -0500, Brian McKeon wrote:

Hi Brian,


What's your whole pipeline like? (Since you say "stream" the image)

Cheers
 -Tim


Re: feeding images to imagefreeze

Brian McKeon-2

Hi Tim,

My pipeline is very basic at the moment because I wanted to start with a
proof of concept.

It looks like:

appsrc ! decodebin ! imagefreeze ! autovideosink
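With the 1.0-series Python bindings, that pipeline would typically be built with `Gst.parse_launch` along these lines (a sketch; the `src` name and `build_pipeline` helper are illustrative):

```python
# Sketch: building the pipeline above programmatically.
# Naming the appsrc lets the application retrieve it to push buffers.
PIPELINE_DESC = "appsrc name=src ! decodebin ! imagefreeze ! autovideosink"

def build_pipeline():
    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst

    Gst.init(None)
    pipeline = Gst.parse_launch(PIPELINE_DESC)
    appsrc = pipeline.get_by_name("src")  # the element to push JPEGs into
    return pipeline, appsrc
```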


And let me restate my goal to try to add clarity:

My code is sent a JPEG image every so often (i.e. the input isn't
continuous; new images can arrive at any time).

I need to take this single image and create a continuous output stream
with it. Very much like what imagefreeze does.

This approach is currently working, but only for the very first image I
receive.

Every image after that gets enqueued into the appsrc element, but those
buffers never get pulled into the output stream. So the output just
keeps showing the first image.

I was really hoping there was a simple way to trigger the imagefreeze
element to tell it to pull the next buffer from appsrc.


Hopefully that makes more sense. But if not, please let me know and I'll
provide more info.


Thanks!
Brian





Re: feeding images to imagefreeze

Tim-Philipp Müller-2
On Wed, 2013-02-06 at 09:15 -0500, Brian McKeon wrote:

Hi,


Right, so imagefreeze takes a single static image as input and will
basically repeat it ad infinitum. I don't think it fits your use case.

If your output is a video sink, you should not need imagefreeze at all.
You just push the next image whenever you want. Until then, the video
sink should keep the old image around.
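A sketch of that imagefreeze-free variant (jpegdec and the appsrc properties here are illustrative assumptions, using 1.0-style element names):

```python
# Sketch: no imagefreeze. The video sink keeps showing the last frame
# it rendered, so a new JPEG only needs to be pushed when one arrives.
#
# is-live / do-timestamp make appsrc stamp each buffer with the running
# time at which it was pushed, so the sink displays it right away.
PIPELINE_DESC = (
    "appsrc name=src is-live=true do-timestamp=true format=time "
    "! jpegdec ! videoconvert ! autovideosink"
)
```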

However, if you want to make sure you actually render/output N images
per second or so, regardless of how often images get pushed in, then
that won't work of course.

Something like this would almost do the trick then:

gst-launch-0.10 -v \
  videotestsrc ! video/x-raw-yuv,framerate=1/3 ! intervideosink \
  intervideosrc ! identity ! ffmpegcolorspace ! ximagesink

or

gst-launch-1.0 -v \
  videotestsrc ! video/x-raw,framerate=1/3 ! intervideosink  \
  intervideosrc ! identity silent=false ! videoconvert ! ximagesink

but you'll notice that it reverts to a black frame if it hasn't received
a buffer for a while (~1 second). I think this value is currently
hard-coded inside the intervideo* elements, so it can't easily be
changed (though there's no reason not to expose a property for it).

Cheers
 -Tim






Re: feeding images to imagefreeze

Brian McKeon-2

Thanks so much Tim!

I didn't realize that was the default behavior for a video sink. Cool.

But there's a complication that I didn't mention earlier... ;)

Eventually, I'd like to be able to send this stream over RTP instead of
to a video sink. So from what you said, it sounds like this wouldn't
work for that case.
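For instance, the intervideo idea might extend toward RTP roughly like this (an untested sketch; the H.264 encoder, payloader, host, and port are purely illustrative choices):

```python
# Sketch: the display sink replaced by an RTP sender. intervideosrc
# keeps producing frames at a steady rate, so the RTP stream stays
# continuous between incoming images.
SENDER_DESC = (
    "intervideosrc ! videoconvert ! x264enc tune=zerolatency "
    "! rtph264pay ! udpsink host=127.0.0.1 port=5000"
)
```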

I'll spend some time looking at the example you provided though in case
that'll work for me.

But assuming that it doesn't, I think I may try my hand at modifying
imagefreeze. Do you have any thoughts about the viability of that?

(At this point, I'm just trying to figure out whether I'm barking up the
wrong tree or not.)

Cheers,
Brian



