Wowza Community

How to reduce delay in HLS?

Hi,

We want to deliver live streaming to iOS devices, where the input streams come from Flash (audio codec: Nellymoser; video codec: Sorenson Spark/H.263/H.264) and the output streams must be HLS (audio codec: AAC; video codec: H.264).

We have faced two problems:

1 - How do we convert the Nellymoser audio codec to AAC?

The Wowza Transcoder AddOn does not support Nellymoser as an input codec,

so we are using FFmpeg to transcode the input stream from Flash (audio: Nellymoser; video: Sorenson Spark/H.263/H.264) into the required output stream (audio: AAC; video: H.264).

FFmpeg command used:

ffmpeg -i rtmp://[Wowza IP Address]:1935/DemoApp/testing -c:a libvo_aacenc -ar 44100 -ab 48k -c:v libx264 -preset ultrafast -tune zerolatency -r 24 -g 48 -keyint_min 48 -f flv rtmp://[Wowza IP Address]:1935/DemoApp/testingios
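
For clarity, the parts of that command that matter for the chunking discussion are (our reading of the flags):

-c:a libvo_aacenc -ar 44100 -ab 48k                # audio transcoded to AAC, 44.1 kHz, 48 kbps
-c:v libx264 -preset ultrafast -tune zerolatency   # H.264 with x264's lowest-latency settings
-r 24 -g 48 -keyint_min 48                         # 24 fps, GOP of 48 frames = one key frame every 2 seconds
-f flv rtmp://[Wowza IP Address]:1935/DemoApp/testingios   # push the transcoded stream back into Wowza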

2 - How do we support HLS for iOS devices?

To support HLS, we have configured our Application.xml following the steps in https://www.wowza.com/docs/how-to-configure-apple-hls-packetization-cupertinostreaming
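
The relevant part of that configuration is the packetizer list in the <Streams> section of Application.xml. Ours looks roughly like this (trimmed to the packetizers that show up in our logs below):

<Streams>
    <StreamType>live</StreamType>
    <LiveStreamPacketizers>cupertinostreamingpacketizer, smoothstreamingpacketizer, sanjosestreamingpacketizer</LiveStreamPacketizers>
</Streams>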

This fulfilled our requirement, but we are facing a significant delay in HLS (around 10 to 15 seconds). We need to reduce this delay.

We have worked on the keyframe interval as well, and we have tuned our packetization settings: cupertinoChunkDurationTarget = 2000 on the Wowza side and -r 24 -g 48 -keyint_min 48 on the FFmpeg side. But there is still a delay on the iOS side.
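
The cupertinoChunkDurationTarget value is set as a property on the LiveStreamPacketizer container in Application.xml, roughly like this (only the relevant fragment shown):

<LiveStreamPacketizer>
    <Properties>
        <Property>
            <Name>cupertinoChunkDurationTarget</Name>
            <Value>2000</Value>
            <Type>Integer</Type>
        </Property>
    </Properties>
</LiveStreamPacketizer>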

How can we resolve or reduce the delay in HLS?

Thanks in advance.

Hi,

The steps you have taken are all correct, and reducing “cupertinoChunkDurationTarget” will lower the delay from the default, which is around 30 seconds.
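
As a rough rule of thumb (the exact figure depends on the player and network), the Apple player buffers about three chunks before it starts, so:

3 chunks x 10 s (the stock cupertinoChunkDurationTarget of 10000) ≈ 30 s behind live
3 chunks x  2 s (cupertinoChunkDurationTarget of 2000)            ≈  6 s, plus encoding and network time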

When looking in the logs you should be able to see the duration of the chunks created for Apple HLS. As you mentioned, they should be 2 seconds, depending on the keyframe interval.

If you can bring the keyframe interval down to 1 second, “cupertinoChunkDurationTarget” can be set to 1 second as well, which will further reduce the delay.
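
That is the same cupertinoChunkDurationTarget property, just halved, for example:

<Property>
    <Name>cupertinoChunkDurationTarget</Name>
    <Value>1000</Value>
    <Type>Integer</Type>
</Property>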

The log line should look something like this:

LiveStreamPacketizerCupertino.endChunkTS[live/_definst_/myStream_160p]: Add chunk: id:1 mode:TS[H264,MP3] a/v/k:102/200/8 duration:8000

Obviously the chunk duration will equal the keyframe interval if “cupertinoChunkDurationTarget” is set to 1 second.
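
To read that line (how I interpret the fields):

id:1              the chunk number
mode:TS[H264,MP3] the codecs packetized into the MPEG-TS chunk
a/v/k:102/200/8   audio, video and key frame counts in the chunk
duration:8000     chunk length in milliseconds, i.e. an 8-second chunk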

Regards,

Jason

Are you referring to start time or latency, or both? Start time is how long it takes for a user to start seeing video after starting playback. Latency is how far behind the stream is compared to the encoder.

Can you show several of those lines? Wowza logs 10 of them (it continues chunking, of course, but only logs the first 10).

The one you show is 2 seconds in duration and has one key frame, which is right according to your FFmpeg command, which specifies a key frame every 2 seconds ( -r 24 -g 48 -keyint_min 48 ), and indicates cupertinoChunkDurationTarget is “2000”. If you change -g to 24 you will have a key frame every second; then you can make cupertinoChunkDurationTarget 1000, which should yield chunks about like this: a/v/k:14/8/1 duration:1000.
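
For example, the adjusted command would look something like this (only -g and -keyint_min changed from your command; -keyint_min is lowered to match the new GOP):

ffmpeg -i rtmp://[Wowza IP Address]:1935/DemoApp/testing -c:a libvo_aacenc -ar 44100 -ab 48k -c:v libx264 -preset ultrafast -tune zerolatency -r 24 -g 24 -keyint_min 24 -f flv rtmp://[Wowza IP Address]:1935/DemoApp/testingios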

Richard

The Cupertino settings are good in that you are getting a consistent 2-second chunk with 1 key frame. You can do the same with Smooth Streaming following this guide (or you might want to remove the smoothstreamingpacketizer if you are not using a Silverlight player).
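
If you drop Smooth Streaming, the packetizer list in Application.xml would be reduced to something like this (keep sanjosestreamingpacketizer only if you need Flash HTTP playback):

<LiveStreamPacketizers>cupertinostreamingpacketizer, sanjosestreamingpacketizer</LiveStreamPacketizers>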

How are you testing? Is it local or remote? Is it an iOS device, or HLS desktop playback with JW Player or VLC? What is the bitrate of the stream (it is not evident in your ffmpeg command)?

You can go down to a 1-second key frame interval and chunkDurationTarget 1000. You can reduce the bitrate of the stream. If you are testing remotely, you can take network factors out of the picture to see what the delay is without them. Understand that iOS devices need 3 chunks before playback, so if the bitrate is very high and the device is remote on a slower connection, it still has to download 3 large chunks.
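
To put rough numbers on that (ignoring player-specific behaviour):

3 chunks x 2 s/chunk = 6 s of buffered media before playback starts, plus encode, packetize and download time, which lands in the range you are describing
3 chunks x 1 s/chunk = 3 s of buffered media with a 1-second key frame interval and chunkDurationTarget 1000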

Richard

Hi Jason,

Thank you for your reply…

We are getting this:

INFO server comment - LiveStreamPacketizerCupertino.endChunkTS[DemoApp/_definst_/testingios]: Add chunk: id:8 mode:TS[H264,AAC] a/v/k:28/16/1 duration:2000

The keyframe interval is 1 second, so “cupertinoChunkDurationTarget” can be set to 1 second.

We are still facing a 9-10 second delay. How do we achieve a lower delay in HLS?

Hi Richard,

Wowza logs:

INFO server comment - LiveStreamPacketizerCupertino.endChunkTS[DemoApp/_definst_/testing_ios]: Add chunk: id:1 mode:TS[H264,AAC] a/v/k:0/20/1 duration:2000
INFO server comment - LiveStreamPacketizerCupertino.endChunkTS[DemoApp/_definst_/testing_ios]: Add chunk: id:2 mode:TS[H264,AAC] a/v/k:25/20/1 duration:2500
INFO server comment - LiveStreamPacketizerSmoothStreaming.handlePacket[DemoApp/_definst_/testing_ios]: Fragment durations: [2.0,2.5,2.0]
INFO server comment - LiveStreamPacketizerSmoothStreaming.flushPendingVideo: Bitrate[DemoApp/_definst_/testing_ios]: 203316
INFO server comment - LiveStreamPacketizerSmoothStreaming.addFragment[DemoApp/_definst_/testing_ios]: Add chunk: type:video id:0 count:20 duration:2000
INFO server comment - LiveStreamPacketizerCupertino.endChunkTS[DemoApp/_definst_/testing_ios]: Add chunk: id:3 mode:TS[H264,AAC] a/v/k:29/20/1 duration:2000
INFO server comment - LiveStreamPacketizerSanJose.endChunkTS[DemoApp/_definst_/testing]: Add chunk: id:1 a/v/k:116/90/3 duration:8153
INFO server comment - LiveStreamPacketizerCupertino.endChunkTS[DemoApp/_definst_/testing_ios]: Add chunk: id:4 mode:TS[H264,AAC] a/v/k:28/20/1 duration:2000
INFO server comment - LiveStreamPacketizerSmoothStreaming.addFragment[DemoApp/_definst_/testing_ios]: Add chunk: type:video id:1 count:20 duration:2500
INFO server comment - LiveStreamPacketizerSmoothStreaming.flushPendingAudio: Bitrate[DemoApp/_definst_/testing_ios]: 48319
INFO server comment - LiveStreamPacketizerSmoothStreaming.addFragment[DemoApp/_definst_/testing_ios]: Add chunk: type:audio id:0 count:87 duration:2020
INFO server comment - LiveStreamPacketizerSanJose.endChunkTS[DemoApp/_definst_/testing_ios]: Add chunk: id:1 a/v/k:336/101/5 duration:10500
INFO server comment - LiveStreamPacketizerCupertino.endChunkTS[DemoApp/_definst_/testing_ios]: Add chunk: id:5 mode:TS[H264,AAC] a/v/k:29/20/1 duration:2000
INFO server comment - LiveStreamPacketizerSmoothStreaming.addFragment[DemoApp/_definst_/testing_ios]: Add chunk: type:video id:2 count:20 duration:2000
INFO server comment - LiveStreamPacketizerSmoothStreaming.addFragment[DemoApp/_definst_/testing_ios]: Add chunk: type:audio id:1 count:87 duration:2020
INFO server comment - LiveStreamPacketizerCupertino.endChunkTS[DemoApp/_definst_/testing_ios]: Add chunk: id:6 mode:TS[H264,AAC] a/v/k:29/20/1 duration:2000
INFO server comment - LiveStreamPacketizerSmoothStreaming.addFragment[DemoApp/_definst_/testing_ios]: Add chunk: type:video id:3 count:20 duration:2000
INFO server comment - LiveStreamPacketizerSmoothStreaming.addFragment[DemoApp/_definst_/testing_ios]: Add chunk: type:audio id:2 count:87 duration:2020

Here the keyframe interval is 2 seconds, so “cupertinoChunkDurationTarget” is set to 2 seconds.

We are still facing a 9-10 second delay. How do we achieve a lower delay in HLS?