Hello, I am quite new to GStreamer and it seems like an amazing tool to me. I would like to create a pipeline that does the following:
- get a video stream from a live source (e.g., a webcam)
- at the same time, show the live stream in a user interface, while allowing the user to seek to arbitrary positions in the entire stream
- encode the live stream as-is into a file
I know that I need a tee element for the two simultaneous tasks, but currently I am having trouble realizing the time-shift buffering part between the source and the videosink. I thought I could add a queue element before the sink to act as a (ring-)buffer to enable seeking.
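For reference, the tee-based fan-out described above might look roughly like this (a sketch only; the encoder and muxer choices are assumptions, and the seekable-display part is exactly the piece that is still unsolved):

```python
# Hypothetical tee fan-out: one branch displays the live stream, the
# other encodes it to a file. Each branch needs its own queue so the
# branches don't block each other. Could be run with Gst.parse_launch().

tee_pipeline = (
    "v4l2src ! videoconvert ! tee name=t "
    "t. ! queue ! autovideosink "            # display branch
    "t. ! queue ! x264enc ! matroskamux "    # encode branch; assumption
    "! filesink location=recording.mkv"
)
print(tee_pipeline)
```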
This is my experimental pipeline so far (a live videotestsrc in place of a webcam):

gst-launch-1.0 videotestsrc do-timestamp=true is-live=true name=src ! video/x-raw,format=RGB,width=640,height=480,framerate=20/1 ! videoconvert ! timeoverlay ! queue max-size-buffers=0 max-size-bytes=0 max-size-time=0 name=q ! autovideosink

While this pipeline is running, I can send seek_simple events to the queue, which results in the frame at the requested timestamp being shown (verified by the timeoverlay). But then the output pauses and does not resume playing (the pipeline state remains PLAYING). Strangely, when I query the current-level-* properties of the queue, it appears to be empty. In that case I don't understand where the seeked frame comes from. Does the pipeline itself also buffer data?
At one point during my experiments, I saw the queue actually filling up quickly (I don't recall the parameters I was using then), consuming some GB of memory after only a few seconds.
I found the time-shift buffering scenario in the documentation, which I think best describes what I am trying to do, but it's only a short paragraph and there is no dedicated "file-ringbuffer" element, so I tried my luck with the queue.
Is it possible to achieve my goal using GStreamer? If so, which elements should I look at? And is there a way to get the total duration of a live pipeline? All query_duration calls return -1. Lastly, is there a way to buffer minutes' worth of live video (e.g., 720p) in memory?
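As a rough sense of scale for the in-memory question: raw RGB video is large, so buffering minutes of uncompressed 720p quickly reaches several gigabytes (a back-of-the-envelope sketch; the 20 fps frame rate and 3-bytes-per-pixel RGB format are assumptions matching the caps used above):

```python
# Back-of-the-envelope memory cost of buffering raw RGB video in a queue.
# Assumed parameters: 1280x720, 3 bytes/pixel (RGB), 20 fps.
width, height, bytes_per_pixel, fps = 1280, 720, 3, 20

bytes_per_frame = width * height * bytes_per_pixel  # 2,764,800 bytes
bytes_per_second = bytes_per_frame * fps            # ~55 MB/s
bytes_per_minute = bytes_per_second * 60            # ~3.3 GB/min

print(f"per frame : {bytes_per_frame / 1e6:.1f} MB")
print(f"per second: {bytes_per_second / 1e6:.1f} MB")
print(f"per minute: {bytes_per_minute / 1e9:.2f} GB")
```

Even at the 640x480 caps in the experimental pipeline this is still about 18 MB/s of raw data, which would explain why an unbounded queue grows into the gigabytes.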
I'm curious to hear your thoughts! Many thanks in advance,
This is my experimental code in Python:

import time

import humanize
import gi
gi.require_version("Gst", "1.0")
from gi.repository import GObject, Gst

GObject.threads_init()
Gst.init(None)

pipeline_string = "videotestsrc do-timestamp=true is-live=true horizontal-speed=1 name=src " \
                  "! video/x-raw, format=RGB, width=640, height=480, framerate=20/1 " \
                  "! videoconvert " \
                  "! timeoverlay shaded-background=true " \
                  "! queue max-size-buffers=0 max-size-bytes=0 max-size-time=0 name=q " \
                  "! autovideosink"
pipeline = Gst.parse_launch(pipeline_string)
source = pipeline.get_by_name("src")
queue = pipeline.get_by_name("q")

print(pipeline.set_state(Gst.State.PLAYING))
time.sleep(5)

print("Current source position:", source.query_position(Gst.Format.TIME))
print("Current queue position:", queue.query_position(Gst.Format.TIME))
print("Current pipeline position:", pipeline.query_position(Gst.Format.TIME))

# query_position() returns a (success, position) tuple in PyGObject
seek = pipeline.query_position(Gst.Format.TIME)[1] - 3 * Gst.SECOND
print("seek to -3 sec:", seek)
if queue.seek_simple(Gst.Format.TIME, Gst.SeekFlags.FLUSH | Gst.SeekFlags.KEY_UNIT, seek):
    print("Seek OK")

print("Current source position:", source.query_position(Gst.Format.TIME))
print("Current queue position:", queue.query_position(Gst.Format.TIME))
print("Current pipeline position:", pipeline.query_position(Gst.Format.TIME))
print("Queue buffer level:", queue.get_property("current-level-buffers"),
      "- bytes:", humanize.naturalsize(queue.get_property("current-level-bytes"), binary=True),
      "- time:", queue.get_property("current-level-time") / Gst.SECOND)

time.sleep(2)
seek = source.query_position(Gst.Format.TIME)[1]
print("Current source position:", source.query_position(Gst.Format.TIME))
print("jump to front", seek)
queue.seek_simple(Gst.Format.TIME, Gst.SeekFlags.FLUSH | Gst.SeekFlags.KEY_UNIT, seek)
These are the print outputs:
Current source position: (True, cur=5026269432)
Current queue position: (True, cur=5026269432)
Current pipeline position: (True, cur=4950176384)
seek to -3 sec: 1950209076
Current source position: (True, cur=1950209076)
Current queue position: (True, cur=1950209076)
Current pipeline position: (False, cur=0)
Queue buffer level: 0 - bytes: 0 Bytes - time: 0.0
Current source position: (True, cur=2126269432)
jump to front 2126269432
gstreamer-devel mailing list
Theoretically the queue2 element has what you are looking for, and it has a ring-buffer property. However, at least from my own experimentation, you can't seek on the data in the ring buffer, even if it is muxed video. uridecodebin also has a ring-buffer property, and it uses queue2 internally, but it disables this functionality for streaming sources, so I am not exactly sure what the ring buffer is actually for. There is also the timeshift element, but it may have been deprecated in favor of queue2.
You may be better served recording to a streamable MKV file and using another pipeline that reads from it. To jump to live playback you could use the current time of the recording pipeline, because you won't know the duration otherwise. In my experience this works fairly well and may achieve the effect you are trying to accomplish.
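A minimal sketch of what the two pipeline descriptions in this approach might look like (the specific element choices, such as x264enc, matroskamux streamable=true, and decodebin, are assumptions, not a tested setup):

```python
# Hypothetical record/playback pair for a time-shift setup.
# Pipeline 1 encodes the live source into a streamable Matroska file;
# pipeline 2 reads the same file and supports seeking within the
# already-recorded range. Launch with e.g. Gst.parse_launch(record).

record = (
    "v4l2src "                     # live source (webcam); assumption
    "! videoconvert "
    "! x264enc tune=zerolatency "  # low-latency encoder; assumption
    "! matroskamux streamable=true "
    "! filesink location=timeshift.mkv"
)

playback = (
    "filesrc location=timeshift.mkv "
    "! decodebin "
    "! videoconvert "
    "! autovideosink"
)
print(record)
print(playback)
```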
Hope that helps!
On 11/6/2017 2:36 PM, Fritjof Büttner wrote: