I am trying to set up an audio pipeline, carried over UDP over WiFi, with as consistent (and hopefully minimal) latency as possible. The goal is a constant 40 ms of latency. To that end I have specified a buffer time on alsasink, and added an rtpjitterbuffer to cap the maximum latency, preferring short bursts of silence over latency jitter. My audio source comes in over I2S, playing a WAV file with 1 ms bursts of sine wave every 100 ms, so I can measure the total latency on a scope.
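For reference, the pipelines look roughly like the following. This is a sketch, not my exact command lines: the device names, address, port, and audio caps are placeholders. (Note that rtpjitterbuffer's `latency` property is expressed in milliseconds, while alsasink's `buffer-time` is in microseconds.)

```shell
# Sender (sketch): capture from ALSA, payload as raw L16 RTP, send over UDP.
# Device, host, and caps are placeholders for illustration.
gst-launch-1.0 -v alsasrc device=hw:0 ! audioconvert \
    ! audio/x-raw,format=S16BE,rate=48000,channels=2 \
    ! rtpL16pay ! udpsink host=192.168.1.50 port=5004

# Receiver (sketch): cap jitter at the rtpjitterbuffer (latency is in ms)
# and give alsasink a 40 ms buffer (buffer-time is in microseconds).
gst-launch-1.0 -v udpsrc port=5004 \
    caps="application/x-rtp,media=audio,encoding-name=L16,clock-rate=48000,channels=2" \
    ! rtpjitterbuffer latency=28 drop-on-latency=true \
    ! rtpL16depay ! audioconvert ! alsasink buffer-time=40000
```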
Without the rtpjitterbuffer, the pipeline does not follow the same path through clock_convert_external, so the start and stop values below are not 0, and the render function does not drop the buffer.
I suspect I am doing something wrong with latency and/or buffers, but I have tried a wide range of experiments that have not led me to a better understanding of the problem. I set the latency on rtpjitterbuffer to 28 because the reported latency of the pipeline is 12 ms, and I am trying to get it to a steady 40 ms. Setting it to 28000000 nanoseconds does not work either.
This is GStreamer 1.12.2, built with Yocto and running on a BeagleBone Black with a WiFi cape.