We’re using the Streaming Engine to build a system that takes video and audio from the browser via WebRTC and delivers it onward to a third-party receiver as MPEG-TS. We need it to be ultra-low latency - think real-time communications.
Functionally, we’re looking good. But the latency is too high. To keep the MPEG-TS UDP stream consistent, we’ve found that we need to push everything through a Streaming Engine transcode. But that’s adding about 600 ms (even if we just use pass-through). So we end up at about 1200-1300 ms end-to-end latency, roughly double what we need.
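One thing that helped us narrow down where the 600 ms goes was timestamping the UDP datagrams as they arrive at the receiver and looking at the inter-arrival gaps: a smooth cadence with a long startup delay suggests the transcoder is deliberately buffering to pace the stream, whereas bursty arrivals point elsewhere. A minimal sketch of that probe (the port number and packet count here are just placeholders, not anything from the Streaming Engine itself):

```python
import socket
import statistics
import time


def interarrival_stats(arrival_times):
    """Return (mean_gap_ms, max_gap_ms) for a list of arrival
    timestamps given in seconds."""
    gaps = [b - a for a, b in zip(arrival_times, arrival_times[1:])]
    return (1000 * statistics.mean(gaps), 1000 * max(gaps))


def probe_udp(port=5000, packets=500):
    """Timestamp the first `packets` datagrams arriving on `port`
    (hypothetical values - substitute your MPEG-TS output port)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    times = []
    while len(times) < packets:
        sock.recv(2048)  # 7 x 188-byte TS packets fit in one datagram
        times.append(time.monotonic())
    mean_ms, max_ms = interarrival_stats(times)
    print(f"mean gap {mean_ms:.2f} ms, max gap {max_ms:.2f} ms")
```

If the max gap is far larger than the mean, the sender is bursting rather than pacing, which is usually why a smoothing/transcode stage gets inserted in the first place.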
I’ve just about exhausted the options I’ve been able to find for improving the latency. So I have two questions:
- Is that extra 600 ms from the transcoder about what I should expect?
- Is there a trick that I’ve missed to reduce the latency?
For reference, we’re running in a dev environment on an AWS Ubuntu t3.large. I don’t believe we’re anywhere near worrisome CPU utilisation.