Hi, I have the following pipeline in the attachment: 1.html
<http://gstreamer-devel.966125.n4.nabble.com/file/t377232/1.html> . A short
description of this pipeline: I have many chunks which were written earlier via
multifilesink; the chunk duration is 120 seconds. I attach a new pipeline to this
folder via multifilesrc, but the output should be hlssink with a chunk duration
of 15 seconds. When I start this pipeline I see many 'custom-downstream' events
on the sink (the 120-second chunks are converted to 15-second chunks very fast,
and the playlist changes very fast as well; as a result the stream does not play
in players). But I want the stream to be a normal HLS stream with chunk-by-chunk
conversion. If I insert rtmpsink instead of hlssink, the pipeline works as
expected and users can see a normal RTMP stream.
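For reference, a minimal sketch of the kind of pipeline I mean (the element
names, caps and file naming here are assumptions on my part, since the real
pipeline is in the attached 1.html; I assume the recorded chunks are MPEG-TS
files):

```shell
# Reads the previously recorded 120-second MPEG-TS chunks in order
# and re-segments them into 15-second HLS chunks.
# File names and property values are illustrative assumptions,
# not the exact attached pipeline.
gst-launch-1.0 \
  multifilesrc location=chunk_%05d.ts ! tsparse ! \
  hlssink location=segment%05d.ts playlist-location=playlist.m3u8 \
          target-duration=15 playlist-length=5
```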
Thank you for the reply, but I think you misunderstood me, so I prepared a more
detailed description. The pipeline above saves chunks of 120 seconds from an
HLS linear stream; then, a couple of hours later, we try to restream the same
120-second chunks to HLS with hlssink. When I start the pipeline we get
'custom-downstream' events and the status of the pipeline keeps changing; it is
not running steadily. I believe this keeps happening because the 120-second
chunks are converted to 15-second HLS chunks very fast, and then there is not
enough buffered data left to keep the hlssink running. If we use rtmpsink
instead, it works well and the pipeline runs as expected. Please see the dump of
my pipeline; maybe there is something missing that can solve my issue. I
appreciate your help!
Your pipeline will probably consume buffers as fast as possible, remuxing the
120-second segments into 15-second segments as quickly as it can. Since
playlist-length=5, your playlist will start dropping older segments, leaving
the client losing track of what it is supposed to be playing.
When using rtmpsink, you are using tsdemux and flvmux to remux the stream.
This gives the pipeline a clue about the actual rate of the video stream, and
since rtmpsink's sync property is true by default, your stream will play in
real time.
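If the goal is to have the HLS restream proceed at real time rather than as
fast as the files can be read, one option (a sketch on my part, not something
taken from the original pipeline) is to let a clock-synchronised element pace
the stream, e.g. identity with sync=true:

```shell
# Throttle the pipeline to the clock so segments are produced in
# real time instead of as fast as multifilesrc can read the files.
# Element choice, placement and file names are assumptions.
gst-launch-1.0 \
  multifilesrc location=chunk_%05d.ts ! tsparse set-timestamps=true ! \
  identity sync=true ! \
  hlssink location=segment%05d.ts playlist-location=playlist.m3u8 \
          target-duration=15
```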
Note that there is little need to "keep the hlssink running". It is done when
it is done, and it is up to your web server to keep serving the files. You do,
however, need to make sure that the playlist (.m3u8) contains all the segments
that you want to play (e.g. by setting playlist-length=0 and max-files=0 on
hlssink). If you want to remux the 120-second segments into 15-second segments,
you should probably demux and mux the transport stream (e.g. ... - tsdemux -
queue - h264parse - queue - mpegtsmux - ... and fiddle the audio branch in
there as well).
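Putting those two suggestions together, a hedged sketch of what such a
remuxing pipeline could look like (the exact file names and the aacparse audio
branch are assumptions; adjust them to the actual stream):

```shell
# Demux the recorded TS chunks, re-parse the elementary streams,
# remux to MPEG-TS and let hlssink cut 15-second segments.
# playlist-length=0 and max-files=0 keep every segment in the .m3u8.
gst-launch-1.0 \
  multifilesrc location=chunk_%05d.ts ! tsdemux name=demux \
  demux. ! queue ! h264parse ! queue ! mux. \
  demux. ! queue ! aacparse ! queue ! mux. \
  mpegtsmux name=mux ! \
  hlssink location=segment%05d.ts playlist-location=playlist.m3u8 \
          target-duration=15 playlist-length=0 max-files=0
```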
Hi Arjen Veenhuizen, thank you for the reply, but as you can see in my
pipeline, a decodebin element is inserted after the src. When the stream
starts, this decodebin resolves to tsdemux->multiqueue->h264parse. After that
it dynamically connects to udb_conn_video_0, and as a result I have a pipeline like