How to set PTS for appsrc buffer with pausing and playing conditions.
I have a camera that feeds frames into an appsrc element. Each frame has an
associated timestamp (from the camera), so I thought it was logical to set
the buffer PTS to the frame's timestamp (minus the initial timestamp to have
it start at 0).
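A minimal sketch of that timestamping scheme (pure Python; the class and method names are hypothetical, and the actual appsrc push code is omitted):

```python
class CameraTimestamper:
    """Maps absolute camera timestamps (in ns) to buffer PTS values
    that start at 0, as described above. Hypothetical helper, not
    GStreamer API."""

    def __init__(self):
        self.first_ts = None  # timestamp of the first frame seen

    def pts_for(self, camera_ts_ns):
        # Remember the first camera timestamp, then report every
        # later frame relative to it.
        if self.first_ts is None:
            self.first_ts = camera_ts_ns
        return camera_ts_ns - self.first_ts
```

The resulting value would be assigned to the buffer's PTS before pushing it into appsrc.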
This works fine until I PAUSE the pipeline and then set it back to PLAYING. In
that case I find that the queue right before my sink fills up as soon as I
switch back to PLAYING.
After looking at the documentation, I see that the pipeline's running_time
does not change when the pipeline is in the PAUSED state. I therefore
suspect that the buffer is being assigned a timestamp that is much later
than it should be (camera timestamp is incrementing while running_time does
not), and the frames are waiting before the sink for their presentation time.
1. Is this analysis correct?
2. What is the BEST way to remedy this? I could potentially keep track of
the PAUSE and PLAY times and adjust the buffer PTS accordingly so I can
still use the camera's timestamps, but I'm worried about the effect this may
have after many pauses and plays. Are there any other standard ways to do this?
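For reference, the workaround described in question 2 could be sketched like this: accumulate the total time spent paused and subtract it from the camera-relative timestamp, so the PTS keeps tracking the pipeline's running_time. This is a sketch under the assumption that the pause/play times are measured on the same clock as the camera timestamps; all names are hypothetical, not GStreamer API.

```python
class PauseAwareTimestamper:
    """Sketch: subtract accumulated paused time from camera-relative
    timestamps so the PTS follows running_time across PAUSE/PLAY cycles.
    Hypothetical helper; assumes pause/play instants share the camera's
    clock and time unit (ns)."""

    def __init__(self):
        self.first_ts = None    # first camera timestamp seen
        self.paused_ns = 0      # total time spent in PAUSED so far
        self.pause_start = None # when the current pause began, if any

    def on_pause(self, now_ns):
        # Record when the pipeline went to PAUSED.
        self.pause_start = now_ns

    def on_play(self, now_ns):
        # Add the length of the pause that just ended to the running total.
        if self.pause_start is not None:
            self.paused_ns += now_ns - self.pause_start
            self.pause_start = None

    def pts_for(self, camera_ts_ns):
        if self.first_ts is None:
            self.first_ts = camera_ts_ns
        # Camera-relative timestamp, minus all time spent paused.
        return camera_ts_ns - self.first_ts - self.paused_ns
```

The concern in the question still applies: any measurement error in the pause/play instants accumulates across many cycles, which is why an answer confirming a more standard approach would be welcome.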