Issues with rtspsrc bearing both audio and video


amararama
Hi all,

I'm having an issue converting an RTSP source that carries both embedded audio
and video.

Specifically, I cannot seem to get the audio out of the source.

I have been able to get video from it using the below pipeline:

gst-launch-1.0 rtspsrc location=rtsp://192.168.50.160/whp name=src \
    src. ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert \
    ! x264enc bitrate=10000 ! rtph264pay ! udpsink host=192.168.50.164 port=8004

However, if I try to do the same thing with the audio stream, the pipeline
picks the wrong output, tries to convert the video into audio, and fails.
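rtspsrc exposes one pad per stream, and a bare `src.` link grabs whichever pad appears first; a caps filter on the link can pin a branch to the audio stream. A minimal sketch, assuming the camera sends AAC (the depayloader/decoder must match your camera's actual codec, which `gst-discoverer-1.0` will tell you):

```shell
# Inspect what the source actually offers (the codec name appears in
# each stream's caps):
gst-discoverer-1.0 rtsp://192.168.50.160/whp

# Pin the branch to the audio stream with a caps filter on the rtspsrc
# pad.  rtpmp4gdepay/avdec_aac below assume AAC audio -- substitute the
# elements that match your camera's codec.
gst-launch-1.0 rtspsrc location=rtsp://192.168.50.160/whp name=src \
    src. ! "application/x-rtp,media=audio" ! rtpmp4gdepay ! aacparse \
    ! avdec_aac ! audioconvert ! autoaudiosink
```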

I looked into adding queues to the pipeline to see whether a queue would pick
up the audio and send it through; however, with the pipeline below, the
pipeline never entered the PLAYING state:

gst-launch-1.0 rtspsrc location=rtsp://192.168.50.160/whp latency=100 name=src \
    src. ! queue ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert \
    ! x264enc bitrate=10000 ! rtph264pay ! udpsink host=192.168.50.164 port=8004 \
    src. ! queue ! fakesink
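With two bare `src.` links, gst-launch cannot tell which branch should get which stream, so the audio branch may never be linked and preroll stalls. A sketch of the same pipeline with caps filters pinning each branch (the audio branch still ends in fakesink, as above; `tune=zerolatency` is an assumption that helps x264enc preroll on live sources):

```shell
# Caps filters route video RTP packets to the video branch and audio
# RTP packets to the audio branch, so both queues receive data.
gst-launch-1.0 rtspsrc location=rtsp://192.168.50.160/whp latency=100 name=src \
    src. ! "application/x-rtp,media=video" ! queue ! rtph264depay ! h264parse \
    ! avdec_h264 ! videoconvert ! x264enc tune=zerolatency bitrate=10000 \
    ! rtph264pay ! udpsink host=192.168.50.164 port=8004 \
    src. ! "application/x-rtp,media=audio" ! queue ! fakesink
```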

The attached dot file is a dump of the queued pipeline; as you can see, the
state change stalls inside the rtspsrc element while the rest of the pipeline
waits to reach PLAYING.

<http://gstreamer-devel.966125.n4.nabble.com/file/t379612/I7VSc.png>

I then attempted to pull out the audio separately, using the same method that
rtspsrc uses internally:

gst-launch-1.0 --gst-debug=2 udpsrc uri=udp://0.0.0.0:54218 port=54218 \
    caps="application/x-rtp, media=(string)audio, payload=(int)97" \
    ! .recv_rtp_sink rtpsession .recv_rtp_src ! fakesink
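One likely issue with the manual approach: rtpsession only does RTP session bookkeeping, and the caps are missing fields (notably `clock-rate`) that the downstream RTP elements need. A hedged sketch; the clock-rate and encoding-name values below are placeholders and must be copied from the camera's SDP (visible in a `GST_DEBUG=rtspsrc:6` log):

```shell
# -v prints the negotiated caps so you can confirm packets arrive.
# clock-rate=48000 and encoding-name=MPEG4-GENERIC are assumptions --
# replace them with the values from your camera's SDP.
gst-launch-1.0 -v udpsrc port=54218 \
    caps="application/x-rtp, media=(string)audio, payload=(int)97, clock-rate=(int)48000, encoding-name=(string)MPEG4-GENERIC" \
    ! rtpjitterbuffer ! fakesink dump=true
```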

but that didn't give me anything worthwhile either.

My ideal scenario is to get the queued pipeline working. However, beyond the
fact that it never seems to enter the PLAYING state, the path the audio should
take never gets picked up by the fakesink: where the video gets its own
proxy(?) pads as part of rtpssrcdemux0, rtpssrcdemux1 gets no such pads, so
the fakesink never connects to the rest of the pipeline.

What am I doing wrong? I assume I am missing something fundamental here, as I
imagine rtspsrc should be able to demux both streams and send them out to a
pipeline that I can then manipulate.
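For completeness, a sketch of an alternative that sidesteps the codec question entirely: uridecodebin picks depayloaders and decoders automatically, and caps-restricted branches (videoconvert only accepts raw video, audioconvert only raw audio) separate the two decoded streams:

```shell
# uridecodebin exposes one decoded pad per stream; gst-launch links
# each pending branch to the pad whose caps it accepts.
gst-launch-1.0 uridecodebin uri=rtsp://192.168.50.160/whp name=d \
    d. ! queue ! videoconvert ! x264enc bitrate=10000 ! rtph264pay \
    ! udpsink host=192.168.50.164 port=8004 \
    d. ! queue ! audioconvert ! fakesink
```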

Help is much appreciated.




_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel