I am having a problem with a pipeline and could use some assistance.
The application will eventually be embedded, but for now it is running on Linux on my PC. The application has a number of viewports. Each viewport allows a client to attach to it and provide:
- a jpeg image
- an H.264 I-frame
- an RTP/H.264 video
- a miracast video, which is also RTP/H.264 but has some special issues.
If no client is connected to a particular viewport, it displays an "idle" jpeg image.
Basically, the client opens a connection to the viewport and specifies the type of image or video it will send. The client can then send images or start a video playing. Our pipeline looks like this:
filesrc -> jpegparse -> jpegdec -> videoconvert -> input-selector -> xvimagesink
appsrc -> jpegparse -> queue -> jpegdec -> videoconvert -> (input-selector)
appsrc -> h264parse -> avdec_h264 -> videoconvert -> (input-selector)
udpsrc -> rtpjitterbuffer -> rtph264depay -> avdec_h264 -> queue -> videoconvert -> (input-selector)
udpsrc -> (elements of RTP/H.264 miracast sub-pipeline) -> (input-selector)
I will show the internals of the miracast sub-pipeline in a minute.
All of these feed into a single input-selector and then into the sink. When the app starts, the input selector is set to take its input from the idle sub-pipeline pad. If a client connects and wants to send jpeg images, the input-selector is changed. Then, when the client sends an image, it is pushed to the appsrc connected to the jpeg sub-pipeline. If I disconnect the miracast pipeline, this works fine. I have a test client which can connect and send jpeg images. Each time a jpeg image is sent, the viewport displays the new image. If the client disconnects, the viewport switches back and displays the idle image.
The miracast pipeline is similar to the H.264 video pipeline. However, the device sending the video sometimes sends it in the wrong orientation: the viewport may be in landscape orientation, while the device sends a landscape video embedded in a portrait frame with black borders. The gstreamer pipeline then scales this portrait frame to fit, adding more black borders, and the result is a small image surrounded by borders. To detect this, we add a videocrop element and some supporting elements that let us detect the black border.
Here is the miracast pipeline:
udpsrc -> rtpjitterbuffer -> rtph264depay -> avdec_h264 -> tee -> queue -> videoconvert -> videocrop -> (input-selector)
                                                           tee -> queue -> videoconvert -> appsink
A GST_PAD_PROBE_TYPE_BUFFER probe is added to the appsink "sink" pad. Every few frames, the callback gets the image buffer and tests for the presence of a black border. If one is found, the videocrop properties are updated to change how the image is cropped.
The problem I am seeing is that if I connect the miracast sub-pipeline into the complete pipeline, the pipeline stops working. On startup, the idle image is displayed, and the client can connect and start sending jpeg images. However, only the first jpeg image is displayed; subsequent images are not.
One thing I notice: when I use GST_DEBUG_BIN_TO_DOT_FILE() to look at the pipeline, even though the pipeline has been set to PLAYING, the appsink is still in its transition from READY to PAUSED. I think this makes sense: that branch has not seen any data, so the appsink has not prerolled. However, why should that affect the rest of the pipeline?
As a bit of background, we are using the input-selector approach because our previous attempt was too unstable and crashed too often. In that attempt, we had individual pipelines from src (appsrc or udpsrc) to sink (waylandsink on an embedded board), and when a connection arrived with a new image or video type, we destroyed the old pipeline and created a new one. We found it frequently crashed deep inside waylandsink. The current design is an attempt to build a pipeline that does not need to create and destroy sinks.
Any help would be appreciated.
gstreamer-devel mailing list