I'm working on a video and data recorder.
Data is saved in a SQLite DB and video in an MKV file.
Each data item is timestamped so the records can be replayed with the right timing.
My problem is playing back the video in sync with the data.
At the moment I have created a custom GstClock that starts from 0 and then
advances in sync with the data timestamps.
So if data arrives slowly from the DB source, the video also plays slowly, and so on.
If I stop the replay, the data time is frozen and stops updating the
GstClock, so the video stops too.
It works, but it seems a rather hacky approach 🙂.
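For reference, the behaviour described above can be modelled in plain Python (all names here are hypothetical; a real implementation would subclass GstSystemClock, or hand the clock to the pipeline with gst_pipeline_use_clock()):

```python
# Model of a replay clock that only advances when new data
# timestamps arrive from the DB source; frozen otherwise.
class DataDrivenClock:
    """Reports the latest data timestamp, rebased to start at 0."""

    def __init__(self):
        self.base = None    # first data timestamp seen
        self.latest = None  # most recent data timestamp

    def on_data(self, data_timestamp):
        """Called whenever a data record is read from the DB."""
        if self.base is None:
            self.base = data_timestamp
        self.latest = data_timestamp

    def now(self):
        """Current replay time; stands still while no data arrives."""
        if self.latest is None:
            return 0.0
        return self.latest - self.base
```

This also shows why EOS stalls: once `on_data()` stops being called, `now()` never moves again, so nothing downstream that waits on the clock can make progress.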
Moreover, I want to send an EOS to the pipeline when the data stops, because I
want to flush all the data in the pipeline (an MKV to RTP streaming pipeline) so
that the last video image matches the stop time. But the EOS doesn't flow
because the clock is standing still.
Any pointers on video-data synchronization?
Just my 2 cents:
I had experimented with something where frame-specific metadata was stored
within the video file as H.264 SEI data. While parsing the H.264 video, the
SEI data was extracted and attached to the frame buffer as a custom GstMeta
object. This way, each frame had its own metadata which could be used or
ignored as needed, and pipeline interaction was mostly unchanged. In my
case, the custom data was read by the appsink. Perhaps consider doing
something like this, and just insert your data using a custom filter.
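To illustrate the extraction side of that idea, here is a rough plain-Python sketch, assuming an Annex-B byte stream and ignoring emulation-prevention bytes (in a real pipeline this would live in a custom element or pad probe rather than standalone code):

```python
import re

def find_nal_units(stream: bytes):
    """Yield (nal_type, payload) for each NAL unit in an Annex-B stream.

    Start codes are 0x000001 (a leading extra 0x00 from a 4-byte start
    code ends up as a stripped trailing byte of the previous NAL).
    """
    positions = [m.end() for m in re.finditer(b"\x00\x00\x01", stream)]
    for i, start in enumerate(positions):
        end = positions[i + 1] - 3 if i + 1 < len(positions) else len(stream)
        nal = stream[start:end].rstrip(b"\x00")
        if nal:
            yield nal[0] & 0x1F, nal[1:]

def extract_sei_payloads(stream: bytes):
    """SEI NAL units have nal_unit_type == 6; return their raw payloads."""
    return [payload for t, payload in find_nal_units(stream) if t == 6]
```

The payload still needs SEI message parsing (payload type/size bytes) before the custom metadata can be recovered; this only shows where in the stream it sits.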
I cannot attach data to video because I can get data without video.
More precisely, I can get zero to many video streams.
So data and video must be independent, just correlated by time when replaying.
I understand that the video and the data are separate; I was suggesting that
you append the data from the DB to the video frames as they are being
decoded, using a custom filter element. However, I don't think that I
understand your situation; can you explain or share your pipeline?
Instead of starting or stopping the GstClock, would it be better to just
allow the pipeline to play, then read whatever the current video time is and
use that to select/show data? Preloading/buffering data from the DB should
prevent slowdown or bad synchronization due to a slow response from the DB,
and then there should be no issue pausing or stopping the video because you
never write the clock values.
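A sketch of that suggestion, assuming the data has been preloaded into a time-sorted list and the current playback time comes from the pipeline (e.g. via gst_element_query_position()); names are made up for the example:

```python
import bisect

class ReplayDataSource:
    """Preloaded, time-indexed records; look up the one current
    at any playback time instead of driving a custom clock."""

    def __init__(self, records):
        # records: [(timestamp_sec, payload), ...] sorted by timestamp
        self.times = [t for t, _ in records]
        self.payloads = [p for _, p in records]

    def at(self, playback_time):
        """Return the latest record at or before playback_time, or None."""
        i = bisect.bisect_right(self.times, playback_time) - 1
        return self.payloads[i] if i >= 0 else None
```

Pausing or seeking then needs no special handling: whatever time the pipeline reports, the matching record is found by binary search.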
When I say video and data are separate, I mean that I can receive data to
store even without video. I cannot attach data to video because videos are
not always present. Moreover, I can receive many videos, but the data is not
tied to one video or another.
To better understand:
data = the positions of many aircraft, updated @60Hz
videos = the scene visible from an aircraft + the onboard instruments view.
The recording phase is straightforward: each data item is saved with a timestamp.
Each video is saved as MKV, with some metadata in the DB recording the
start time of each file, so I can compare it with the data timestamps and
play the right video at the right time.
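That file lookup could be sketched like this (hypothetical names; assumes the DB metadata gives each file's start time and duration in the same timebase as the data timestamps):

```python
def video_segment_at(replay_t, files):
    """files: [(start_time, duration, path)] sorted by start_time.
    Return (path, offset_into_file) for the file covering replay_t,
    or None when no video exists at that time."""
    for start, dur, path in files:
        if start <= replay_t < start + dur:
            return path, replay_t - start
    return None
```

The offset would then become the seek position into the MKV when starting (or resuming) playback of that file.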
In the replay phase I need to send video over RTP and data over HLA.
Those two systems are separate, and I was looking for a clean/correct way to
implement the synchronisation.
Another problem is trick play when coupled with RTP. Reverse
playback also sends reversed RTP timestamps, so I get timestamp errors from
many standard players (ffplay, VLC).
But if I display the RTP stream with a GStreamer pipeline I get good
results. It seems that GStreamer ignores the timestamps or something. Is
there a way to force RTP timestamps to always increase, even in reverse playback?
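One option would be to rewrite the RTP timestamps after payloading (e.g. in a pad probe on the payloader's src pad) so they follow the packet send order instead of the decreasing media PTS. A plain-Python model of that remapping, assuming the usual 90 kHz video clock, a constant frame rate, and 32-bit wraparound (all hypothetical for the sketch):

```python
CLOCK_RATE = 90000  # common RTP clock rate for video

def remap_rtp_timestamps(pts_list, fps=30):
    """During reverse trick play the buffer PTS decrease; derive the
    outgoing RTP timestamps from send order instead, so they always
    increase on the wire (modulo 2**32)."""
    step = CLOCK_RATE // fps
    return [(i * step) % (2 ** 32) for i, _ in enumerate(pts_list)]
```

The receiver then sees a normal monotonic stream; the "reverse" nature stays purely a property of the frame content, which may be exactly what strict players like ffplay and VLC need.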