I'm trying to create a pipeline that will take periodic snapshots from an RTSP stream (IP camera) using the H.264 decoder hardware in the Raspberry Pi.
From this pipeline:
gst-launch-1.0 rtspsrc location=$cam user-id=$user user-pw=$pw latency=500 \
! rtph264depay \
! h264parse \
! omxh264dec ! videoscale \
! videorate drop-only=true max-rate=1 ! video/x-raw,framerate=1/30 ! jpegenc \
! multifilesink location=$HOME/data/snapshot-%04d.jpg
I get this error:
...
Progress: (request) Sent PLAY request
ERROR: from element /GstPipeline:pipeline0/GstOMXH264Dec-omxh264dec:omxh264dec-omxh264dec0: Internal data stream error.
Additional debug info:
gstomxvideodec.c(2886): gst_omx_video_dec_loop (): /GstPipeline:pipeline0/GstOMXH264Dec-omxh264dec:omxh264dec-omxh264dec0:
stream stopped, reason not-negotiated
Execution ended after 0:00:00.682983356
Setting pipeline to PAUSED ...
...
Based on my research, it looks like videorate can't handle caps with framerate=0/1 — which, as I understand it, indicates a variable frame rate. Is that what's going on here?
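One variation I'm planning to try (untested — this is just my guess that the decoder's raw output format isn't something jpegenc can negotiate directly) is inserting videoconvert after the decoder, the same way the working display pipeline below does:

```shell
# Same failing pipeline, but with videoconvert added between the decoder
# and the scale/rate/encode stages, in case the "not-negotiated" error
# comes from a raw pixel-format mismatch at jpegenc.
gst-launch-1.0 rtspsrc location=$cam user-id=$user user-pw=$pw latency=500 \
  ! rtph264depay \
  ! h264parse \
  ! omxh264dec ! videoconvert ! videoscale \
  ! videorate drop-only=true max-rate=1 ! video/x-raw,framerate=1/30 ! jpegenc \
  ! multifilesink location=$HOME/data/snapshot-%04d.jpg
```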
I'm confused as to why this is happening, because I can use this pipeline to display the video feed on the Raspberry Pi desktop:
gst-launch-1.0 rtspsrc location=$cam user-id=$user user-pw=$pw latency=500 ! \
rtph264depay ! h264parse ! omxh264dec ! videoconvert ! videoscale ! autovideosink
I can also use this pipeline to capture .jpg snapshots from the RTSP stream using the main processor (but it uses 106% of the CPU):
gst-launch-1.0 rtspsrc location=$cam user-id=$user user-pw=$pw latency=500 \
! rtph264depay \
! avdec_h264 ! videoscale \
! videorate drop-only=true max-rate=1 ! video/x-raw,framerate=1/30 ! jpegenc \
! multifilesink location=$HOME/data/snapshot-%04d.jpg
I'm also wondering: is there a way to get the omxh264dec plugin itself to output a downsampled frame rate and/or resolution, or does that have to be done on the general-purpose processor?
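For the resolution part, the best I've come up with so far is forcing a downscale with a caps filter after videoscale — which still does the scaling on the CPU, not in the decoder. The 640x360 size here is just an example, not something I've verified works with the omx decoder path:

```shell
# Hypothetical sketch: downscale on the CPU via videoscale plus a caps
# filter, then rate-limit and encode as before.
gst-launch-1.0 rtspsrc location=$cam user-id=$user user-pw=$pw latency=500 \
  ! rtph264depay ! h264parse ! omxh264dec ! videoconvert \
  ! videoscale ! video/x-raw,width=640,height=360 \
  ! videorate drop-only=true max-rate=1 ! video/x-raw,framerate=1/30 \
  ! jpegenc ! multifilesink location=$HOME/data/snapshot-%04d.jpg
```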