
Live streaming H.264 to Flash with zero buffer (low latency)

I still haven't found any out-of-the-box solution for smooth playback with a zero buffer on the Flash client.

However, with Flash 10.1 it gets much better than with earlier versions.

The actual minimum buffer can now be lowered to a stunning 0.5 s (it used to be 3 s / 64 frames), but it still grows over time and eventually reaches 5 s and more, so it requires a special mechanism to reset it once in a while.

But for real low latency this is still unusable, so the only acceptable approach is to use a zero buffer.

In this case, with Flash 10.1, playback is smooth roughly 50% of the time; the other 50% it becomes choppy, with fps around 15-20.

After analysing this, I found out that playback is smooth as long as audio packets arrive after video packets (without audio it is impossible to achieve smooth playback), so obviously Flash buffers the video until it receives the corresponding audio packet to synchronize with.

This way the buffer is not zero, but it is still low enough (0.1-0.3 s) to be considered low latency.

The random behaviour is probably due to Wowza handling audio and video packets separately and leaving the timing to the OS socket handling…

So from this perspective the solution is quite obvious: the stream server should be able to ensure that the audio packet is sent after the corresponding video packet, at a configurable time/buffer distance.

So my only question is: HOW TO DO THIS?

You should not even notice an audioLag of anything less than 200 milliseconds.

Charlie

I don't know about that conclusion. Maybe. I don't think there is such a configuration option. I think the best way to achieve smooth playback with a 0 buffer is to use a bitrate that is well below network capacity, and to use FLV files.

Richard

Have you tried adding a very small sort buffer as suggested here (#1):

http://www.wowza.com/community/t/-/57

Try a very small buffer to see if that will keep the video and audio packet delivery in sync. That is what it is supposed to do.
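
If I recall the property names correctly, the sorter is turned on with two entries in Streams/Properties, something like this (sortBufferSize is in milliseconds; 250 is just an example value):

<Property>
	<Name>sortPackets</Name>
	<Value>true</Value>
	<Type>Boolean</Type>
</Property>
<Property>
	<Name>sortBufferSize</Name>
	<Value>250</Value>
	<Type>Integer</Type>
</Property>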

Charlie

You can add a fixed delay to the audio channel by adding the following property to Streams/Properties (the value is in milliseconds):

<Property>
	<Name>audioLag</Name>
	<Value>75</Value>
	<Type>Integer</Type>
</Property>

Charlie

There is no way to tap into the sorting mechanism through the API. It is buried deep in the live stream receiver code.

Yes, sorting is applied to incoming RTP streams as well. A jitter buffer will also add latency, so if you have both a jitter buffer and sorting enabled, each will add its own latency.

Charlie

This has nothing to do with bandwidth. I've done lots of testing and troubleshooting and I am standing by the conclusion above.

I've tried to play with the passthrough module to somehow delay audio packets, but haven't found the right method to do it.
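
The closest I got was attaching a live packet listener from a custom module, but as far as I can tell that only lets me observe the incoming packets and their timecodes, not hold any of them back. A minimal sketch of what I mean (class and method names are how I understand the Wowza Java API, so treat them as approximate):

import com.wowza.wms.amf.AMFPacket;
import com.wowza.wms.logging.WMSLoggerFactory;
import com.wowza.wms.module.ModuleBase;
import com.wowza.wms.stream.IMediaStream;
import com.wowza.wms.stream.IMediaStreamLivePacketNotify;
import com.wowza.wms.vhost.IVHost;

public class ModuleInterleaveLogger extends ModuleBase
{
	public void onStreamCreate(IMediaStream stream)
	{
		// The listener is notification-only: it can log the audio/video interleaving,
		// but it cannot hold packets back or reorder them.
		stream.addLivePacketListener(new IMediaStreamLivePacketNotify()
		{
			private long lastVideoTimecode = -1;

			public void onLivePacket(IMediaStream source, AMFPacket packet)
			{
				if (packet.getType() == IVHost.CONTENTTYPE_VIDEO)
					lastVideoTimecode = packet.getTimecode();
				else if (packet.getType() == IVHost.CONTENTTYPE_AUDIO && lastVideoTimecode >= 0)
				{
					// <= 0 means this audio packet arrived after video with an equal or later
					// timecode (the smooth case); > 0 means audio is running ahead of video.
					long offset = packet.getTimecode() - lastVideoTimecode;
					WMSLoggerFactory.getLogger(ModuleInterleaveLogger.class).info(
						"interleave [" + source.getName() + "]: audio minus last video timecode = " + offset + " ms");
				}
			}
		});
	}
}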

The first thing that comes to my mind is to have a separate flushInterval parameter for audio and for video, if those are in separate queues of course.
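
Something along these lines, with hypothetical property names just to illustrate the request, using a larger interval for audio so that audio tends to leave the server after the corresponding video:

<Property>
	<Name>audioFlushInterval</Name>
	<Value>100</Value>
	<Type>Integer</Type>
</Property>
<Property>
	<Name>videoFlushInterval</Name>
	<Value>25</Value>
	<Type>Integer</Type>
</Property>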

Please investigate this further. I see a lot of people asking for smooth low-latency live streaming in Flash.

I've tried that, and also the de-jitter buffer, and neither helped.

Can I somehow use the same mechanism (by overriding some method) that is used for packet sorting to delay only the audio packets?

Or could you provide some extra parameter for this in a new version?

This will only add an actual lag to the audio, meaning video and audio will be out of sync on the client.

We just want the audio packets to arrive a bit later without messing with the sync, so that the Flash client buffers the video until it receives the corresponding audio packet and then plays both with the original sync.

However, I could try to use packet sorting in combination with a negative audio lag (if that is even supported?), which should do what I want: sorting will make sure synced packets arrive at the same time, but the client will have to delay (buffer) the video packets because of the negative audio lag.
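
In config terms that would simply be the audioLag property from above with a negative value, on top of the sorting properties, assuming a negative value is accepted at all:

<Property>
	<Name>audioLag</Name>
	<Value>-200</Value>
	<Type>Integer</Type>
</Property>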

But then again, video and audio will not be in sync, which is not a solution.

But it’s a good proof of concept.

OK, since there probably is no negative audioLag, I've tried videoLag instead, but it did not help.

As far as I understand, packet sorting ensures that synced video and audio packets are sent at the same time (or at least put into the socket queue together).

So instead of sorting together packets with

audioSyncValue == videoSyncValue

you could make the existing sorting a bit more flexible and sort together packets with

audioSyncValue + AUDIO_PACKET_DELAY == videoSyncValue + VIDEO_PACKET_DELAY

where AUDIO_PACKET_DELAY and VIDEO_PACKET_DELAY are configurable stream properties.

This should not be too much work. Maybe I could even try it myself (by decompiling the code) if you could tell me where this sorting is implemented.
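
In the meantime, the closest server-side workaround I can think of without touching the sorter is to republish the incoming stream with the audio packets held back for a configurable number of milliseconds while keeping the original timecodes, so sync is preserved and the client ends up buffering the video itself. A rough sketch of the idea (the Publisher class and the listener interface are how I understand the Wowza Java API, so the exact method names may be off; the audioPacketDelay property and the "_delayed" name suffix are just made up for the example, and cleanup/error handling is omitted):

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

import com.wowza.wms.amf.AMFPacket;
import com.wowza.wms.application.IApplicationInstance;
import com.wowza.wms.module.ModuleBase;
import com.wowza.wms.stream.IMediaStream;
import com.wowza.wms.stream.IMediaStreamLivePacketNotify;
import com.wowza.wms.stream.publish.Publisher;
import com.wowza.wms.vhost.IVHost;

public class ModuleDelayedAudioRepublish extends ModuleBase
{
	private IApplicationInstance appInstance = null;
	private int audioPacketDelay = 100; // hypothetical Streams/Properties entry, in milliseconds
	private final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();

	public void onAppStart(IApplicationInstance appInstance)
	{
		this.appInstance = appInstance;
		this.audioPacketDelay = appInstance.getProperties().getPropertyInt("audioPacketDelay", this.audioPacketDelay);
	}

	public void onStreamCreate(IMediaStream stream)
	{
		stream.addLivePacketListener(new IMediaStreamLivePacketNotify()
		{
			private Publisher publisher = null;

			public void onLivePacket(IMediaStream source, AMFPacket packet)
			{
				String name = source.getName();
				if (name == null || name.length() == 0 || name.endsWith("_delayed"))
					return; // skip our own republished output streams

				if (publisher == null)
				{
					// Server-side republish of the same stream under a new name.
					publisher = Publisher.createInstance(appInstance.getVHost(),
						appInstance.getApplication().getName(), appInstance.getName());
					publisher.publish(name + "_delayed", "live");
				}

				final byte[] data = packet.getData();
				final int size = packet.getSize();
				final long timecode = packet.getTimecode(); // keep original timecodes so A/V stay in sync

				if (packet.getType() == IVHost.CONTENTTYPE_AUDIO)
				{
					// Hold the audio packet back, then forward it unchanged: it leaves the server
					// after the matching video packet, but with the same timecode as before.
					scheduler.schedule(new Runnable()
					{
						public void run()
						{
							publisher.addAudioData(data, size, timecode);
						}
					}, audioPacketDelay, TimeUnit.MILLISECONDS);
				}
				else if (packet.getType() == IVHost.CONTENTTYPE_VIDEO)
				{
					publisher.addVideoData(data, size, timecode);
				}
				else
				{
					publisher.addDataData(data, size, timecode);
				}
			}
		});
	}
}

The client would then play the "_delayed" copy of the stream instead of the original one.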

But if this works, the Wowza team should buy me a couple of beers at IBC 2010 :slight_smile:

I will need some assistance here.

Can you tell me where this packet sorting is implemented, and whether there is some method I can override to implement my own mechanism?

And a second question:

If sorting is enabled, is it also applied to the incoming MediaCaster stream (e.g. when using RTSP)?

The reason I am asking is that I noticed that when I enabled sorting (250 ms buffer), one of the streams constantly had a larger latency in the client (approx. a 2 second difference from the other one). And this latency was on Wowza, not on the client.