I am able to stream video data from a target board to a host system using RTP/RTSP. I need to send custom data with each frame to the server. Is it possible to integrate this into the GStreamer pipeline, adding the data on the client side and extracting it on the server side?
You could do something like this (I did this in GStreamer 0.10):
Use an MPEG2 transport stream as the container.
Customize mpegtsmux to accept a new kind of caps (e.g. private/x-mydata), or abuse the existing private/teletext caps.
You obviously need to program your pipeline in Python/C/whatever. This is not going to work from the command line.
Mux your video stream into MPEG2 TS and request an extra sink pad with forced caps for your custom data.
Now, connect an appsrc element to this newly requested pad.
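As a rough sketch, the sending side could look like the pipeline description below (the video-branch elements, host, and port are my own placeholder choices, and the private/x-mydata branch only works once mpegtsmux has been patched as described above; the string would be handed to Gst.parse_launch()/gst_parse_launch() from your program, not to gst-launch):

```python
# Illustrative only -- mpegtsmux must first accept private/x-mydata,
# and the appsrc needs buffers pushed from application code.
SEND_PIPELINE = (
    "v4l2src ! x264enc ! mpegtsmux name=mux "       # video branch (assumed elements)
    "! rtpmp2tpay ! udpsink host=127.0.0.1 port=5000 "
    "appsrc name=datasrc caps=private/x-mydata "    # custom-data branch
    "! mux."                                        # requests an extra sink pad on the muxer
)
```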
For each video frame, push your custom data into the appsrc (as a GstBuffer) and make sure that its timestamp matches the video frame's timestamp.
It is important that the timestamps of the video and data buffers correspond to each other and that you insert custom data for every video frame. Otherwise, you will lose synchronisation and your pipeline could stall.
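The timestamp bookkeeping can be sketched in pure Python like this (the 30 fps default is an assumption; the actual GStreamer buffer fields are noted in the comments):

```python
GST_SECOND = 1_000_000_000  # GStreamer timestamps are in nanoseconds

def frame_pts(frame_index, fps_num=30, fps_den=1):
    # PTS of video frame `frame_index`. The custom-data buffer you push
    # into appsrc for this frame must carry exactly the same value
    # (buffer.pts in 1.0, GST_BUFFER_TIMESTAMP in 0.10).
    return frame_index * GST_SECOND * fps_den // fps_num

# At 30 fps, frame 30 lands exactly on the one-second mark:
assert frame_pts(30) == GST_SECOND
```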
You can use the rtpmp2tpay element to payload your stream and send it via RTP/RTSP. Use rtpmp2tdepay to depayload it at the receiving side and tsdemux to demultiplex.
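The receiving chain, again only as a sketch (the udpsrc caps, port, and video-branch elements are assumptions; the probe on the demuxed data pad still has to be attached from code):

```python
# Illustrative receiving pipeline description for Gst.parse_launch().
RECV_PIPELINE = (
    'udpsrc port=5000 caps="application/x-rtp,media=video,'
    'clock-rate=90000,encoding-name=MP2T" '
    "! rtpmp2tdepay ! tsdemux name=demux "  # data pad gets a buffer probe from code
    "demux. ! decodebin ! autovideosink"    # video branch (assumed elements)
)
```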
Now, the tsdemux element will spawn a private/x-mydata or private/teletext pad when it demuxes your stream. Add a buffer probe to this pad and you can parse each custom data packet as it arrives.
Be aware that the private/teletext caps could add some padding bytes here and there, and that it probably makes life easier if your custom data is <= 188 - 14 bytes (the MPEG2 TS packet size minus headers).
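One way to cope with both the size limit and the padding is a small length-prefixed framing for the custom data. This is just a sketch: the 4-byte magic, the header layout, and the 14-byte overhead figure taken from above are assumptions, not part of any GStreamer API:

```python
import struct

TS_PACKET = 188
HEADERS = 14                        # overhead figure assumed above
MAX_PAYLOAD = TS_PACKET - HEADERS   # 174 bytes of usable space

MAGIC = b"MYD0"  # arbitrary marker so padding bytes can be skipped

def pack(data: bytes) -> bytes:
    # Prepend magic + 2-byte big-endian length so the receiver can
    # find the real payload inside a possibly padded buffer.
    if len(MAGIC) + 2 + len(data) > MAX_PAYLOAD:
        raise ValueError("custom data too large for a single TS packet")
    return MAGIC + struct.pack(">H", len(data)) + data

def unpack(buf: bytes) -> bytes:
    # Tolerates padding bytes before and after the framed data.
    start = buf.index(MAGIC)
    (length,) = struct.unpack_from(">H", buf, start + len(MAGIC))
    body = start + len(MAGIC) + 2
    return buf[body : body + length]

# Round-trips even when the muxer pads the packet:
assert unpack(b"\xff\xff" + pack(b"hello") + b"\xff" * 10) == b"hello"
```

The sender would call pack() on its per-frame data before pushing it into the appsrc; the buffer probe on the receiving side would call unpack() on each arriving buffer.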