We are using Elecard's G4 SDK to encode live audio and video and then send the stream to WMS for redistribution. Audio is AAC-LC at 48 kbps; video is H.264 Baseline profile, level 3.0, at 600 kbps, 640x360 @ 30 fps.
When we use JWPlayer (RTMP), the audio lags behind the video by 4 to 6 seconds, and the lag is not consistent.
When we use iOS (HLS), the audio lags behind the video by 6 to 10 seconds.
I saw the article about troubleshooting live streaming and have applied all of its recommendations; however, the audio delay persists.
What does the value of sortBufferSize mean? Is it milliseconds? Packets? Megabytes? Obviously, with a delay as large as 6 seconds, a 750 ms buffer will not help.
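For reference, this is roughly what my Application.xml looks like right now. I am assuming sortBufferSize belongs in the Properties block under Streams, next to sortPackets, and the 750 is the value I currently have set:

    <Streams>
        <StreamType>live</StreamType>
        <Properties>
            <!-- Enable sorting of incoming live stream packets -->
            <Property>
                <Name>sortPackets</Name>
                <Value>true</Value>
                <Type>Boolean</Type>
            </Property>
            <!-- Sort buffer size; the unit is what I am asking about -->
            <Property>
                <Name>sortBufferSize</Name>
                <Value>750</Value>
                <Type>Integer</Type>
            </Property>
        </Properties>
    </Streams>

If that property belongs somewhere else, please let me know.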
I tried looking for a document that explains these values but can't find one. Can you provide a document that explains all of the options in the Application.xml file?
Do you have any ideas or suggestions for determining where the audio delay is coming from?
The stream is currently live: