I would like to write an open source playout solution with GStreamer. The goal is to stream a playlist over RTMP, with one playlist per day. These playlists must stay editable at any time, except of course for the currently playing clip (and maybe the next or previous one).
I saw that GStreamer has some bins that might be useful for this case: playbin3, uridecodebin3, and decodebin3. These emit the "about-to-finish" signal, which makes gapless playback possible.
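To make the idea concrete, here is a minimal pure-Python sketch of how the "about-to-finish" handler could pull the next clip from an editable playlist that locks only the current and the queued clip. The class and function names are my own invention, not GStreamer API; in a real program the handler would be connected with `playbin.connect("about-to-finish", ...)` and `playbin` would be a playbin3 instance.

```python
# Hypothetical playlist holder: entries are editable except the clip now
# playing and the one already queued as next. Nothing here touches GStreamer.
class PlaylistScheduler:
    def __init__(self, uris):
        self._uris = list(uris)
        self._current = -1  # index of the clip currently playing

    def next_uri(self):
        # Advance to the next clip; returns None when the playlist is done.
        self._current += 1
        if self._current >= len(self._uris):
            return None
        return self._uris[self._current]

    def replace(self, index, uri):
        # Refuse edits to the playing clip and the pre-rolled next clip.
        if self._current <= index <= self._current + 1:
            raise ValueError("clip is locked (playing or queued next)")
        self._uris[index] = uri


def on_about_to_finish(playbin, scheduler):
    # In real code this is the "about-to-finish" callback: setting the
    # "uri" property while the signal fires is what gives gapless playback.
    uri = scheduler.next_uri()
    if uri is not None:
        playbin.set_property("uri", uri)
```

The point of the lock window is that an editor process can rewrite any other entry of the day's playlist while playback runs, without ever racing the element that is already pre-rolling the next URI.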
The biggest problem I see is A/V sync. I have read that this is not an easy case, and I believe it. Most of the time the audio and video tracks of a clip do not have exactly the same length, and this causes a growing shift over time.
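A quick back-of-the-envelope illustration of why the shift grows (the durations are made-up numbers, e.g. audio padded out to a full codec frame per clip):

```python
# Hypothetical clip durations in seconds: each audio track ends up 21 ms
# longer than its video track. If the tracks are simply concatenated
# per-stream, the offset accumulates with every clip played.
video_durations = [60.000, 60.000, 60.000]
audio_durations = [60.021, 60.021, 60.021]

drift = sum(audio_durations) - sum(video_durations)
print(f"A/V offset after 3 clips: {drift:.3f} s")  # keeps growing all day
```

Over a full broadcast day this per-clip difference is why the sync has to be corrected at clip boundaries rather than left to add up.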
I also found the example playout.c in the gst-plugins-bad sources. It looks very good to me, and I tested it too; on my Mac it stops after a few hours... Maybe it will work when the output is an rtmpsink instead of an autovideosink.
This example is from 2015, so I ask myself whether, with the new bins, it would be better to change some things now.
I'm also not very good at C programming, which makes it even harder for me: I have to learn the GStreamer API and logic, and I have to learn more C. In Python I feel more comfortable.
Could you please give me some thoughts and suggestions on how you would proceed here?
I was far from having code where the only remaining task was synchronizing audio and video.
But now I may have found a solution. It is a bit unconventional, but it could work:
I use two pipelines: in the first one I put playbin3 and give it inter sinks; in the second pipeline I use inter sources. I have only tested it with a handful of clips, and not yet with RTMP output, but that is the next step.
It would be nice if uridecodebin3 had the same behavior as playbin3; then I could do everything in one pipeline.
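For reference, the two-pipeline idea could be sketched as Gst.parse_launch() descriptions, shown here as plain Python strings so the topology is visible without running GStreamer. The channel names, the RTMP URL, and the encoder choices (x264enc, voaacenc) are my assumptions; the inter elements (intervideosink/src, interaudiosink/src) come from the "inter" plugin in gst-plugins-bad.

```python
# Sketch only: in a real program each string would be passed to
# Gst.parse_launch() and the two pipelines set to PLAYING separately.

def player_pipeline(channel="playout"):
    # Pipeline 1: playbin3 decodes the playlist clips and hands raw A/V
    # to the second pipeline through the inter elements. Giving playbin3
    # a sink description like this works in gst-launch syntax.
    return (
        f'playbin3 '
        f'video-sink="intervideosink channel={channel}-v" '
        f'audio-sink="interaudiosink channel={channel}-a"'
    )

def output_pipeline(rtmp_url, channel="playout"):
    # Pipeline 2: runs continuously, so the RTMP connection survives the
    # clip changes (and playlist edits) happening in pipeline 1.
    return (
        f'intervideosrc channel={channel}-v ! videoconvert ! x264enc ! '
        f'flvmux streamable=true name=mux ! rtmpsink location={rtmp_url} '
        f'interaudiosrc channel={channel}-a ! audioconvert ! voaacenc ! mux.'
    )
```

The design point is that the inter elements decouple the two clocks: pipeline 1 can be torn down or seeked per clip while pipeline 2 keeps a steady stream towards the RTMP server.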