On Wednesday, 31 October 2018 at 06:45 -0500, michi1994 wrote:
> Dear helpful people,
> I am very new to the whole GStreamer thing, so I would be happy
> if you could help me.
> I need to stream a near-zero-latency video signal from a webcam to a server
> and then be able to view the stream on a website.
> The webcam is attached to a Raspberry Pi 3, because there are
> space constraints on the mounting platform. As a result of using the Pi I
> really can't transcode the video on the Pi itself. Therefore I bought a
> Logitech C920 webcam, which can output a raw h264 stream.
> So far I have managed to view the stream on my Windows machine, but didn't
> manage to get the whole website part working.
> My "achievements":
> gst-launch-1.0 -e -v v4l2src device=/dev/video0 !
> video/x-h264,width=1920,height=1080,framerate=30/1 ! rtph264pay pt=96
> config-interval=5 mtu=60000 ! udpsink host=192.168.0.132 port=5000
> My understanding of this command is: get the signal of video device 0, which
> is an h264 stream with a certain width, height and framerate. Then pack it
> into an RTP packet with a high enough mtu to avoid artefacts, encapsulate
> the RTP packet into a UDP packet and stream it to an IP+port.
> gst-launch-1.0 -e -v udpsrc port=5000 ! application/x-rtp,
> payload=96 ! rtpjitterbuffer ! rtph264depay ! avdec_h264 ! fpsdisplaysink
> sync=false text-overlay=false
> My understanding of this command is: receive a UDP packet on port 5000.
> The caps say there is an RTP packet inside. I don't know what
> rtpjitterbuffer does, but it reduces the latency of the video a bit.
> rtph264depay says that inside the RTP is an h264-encoded stream. To get the
> raw data, which fpsdisplaysink understands, we need to decode the h264
> signal with avdec_h264.
> My next step was to change the receiver sink to a local TCP sink and play
> that stream with the following HTML5 tag:
> <video width=320 height=240 autoplay>
>   <source src="http://localhost:#port#">
> </video>
Only WebRTC will let you render a low-latency stream in your browser.
It is a tad more complicated and will require you to write some code. You
also need a very recent GStreamer (probably git master), as support for
this is really new. This repository shows briefly how this works and how
to set it up.
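
To give a rough idea (this is only a sketch, not something you can paste
into gst-launch-1.0: webrtcbin needs an application around it that handles
the SDP offer/answer and ICE candidate exchange with the browser, and the
exact caps below are my assumption based on your sender pipeline), the
media part for your camera would look roughly like:

  webrtcbin name=sendonly
    v4l2src device=/dev/video0
      ! video/x-h264,width=1920,height=1080,framerate=30/1
      ! h264parse
      ! rtph264pay config-interval=-1
      ! application/x-rtp,media=video,encoding-name=H264,payload=96
      ! sendonly.

You would build this pipeline from your application code, connect to the
webrtcbin signals to exchange the session description and ICE candidates
over some signaling channel (e.g. a WebSocket), and the browser's <video>
element then plays the stream with very low latency.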