1. Start pulling the stream using MediaCasterStreamMap.acquire(). The stream uses a 10-second keyframe interval for the purposes of the test.
2. Start a thread that tests for a keyframe using getLastKeyFrame(), sleeping for 20 ms between tests.
3. Log a message when getLastKeyFrame() returns an AMFPacket rather than null.
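The polling loop in steps 2 and 3 can be sketched generically. This is not the actual test harness: the Supplier stands in for the Wowza stream's getLastKeyFrame() call (which returns an AMFPacket or null), and the class name and timings are illustrative.

```java
import java.util.function.Supplier;

public class KeyframeWaiter {
    static final long POLL_INTERVAL_MS = 20; // sleep between tests, per step 2

    // Poll the source until it yields a non-null packet or timeoutMs elapses.
    // Returns the elapsed milliseconds on success, or -1 on timeout.
    public static long waitForKeyframe(Supplier<Object> source, long timeoutMs)
            throws InterruptedException {
        long start = System.currentTimeMillis();
        while (System.currentTimeMillis() - start < timeoutMs) {
            if (source.get() != null) {
                return System.currentTimeMillis() - start; // step 3: keyframe seen
            }
            Thread.sleep(POLL_INTERVAL_MS);
        }
        return -1;
    }

    public static void main(String[] args) throws InterruptedException {
        // Simulate a source that only produces a keyframe after ~100 ms.
        final long t0 = System.currentTimeMillis();
        long elapsed = waitForKeyframe(
                () -> System.currentTimeMillis() - t0 >= 100 ? new Object() : null,
                5000);
        System.out.println("keyframe available after ~" + elapsed + " ms");
    }
}
```

In the real test, the Supplier would wrap the stream object returned by MediaCasterStreamMap.acquire(), and the log message would replace the println.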
When pulling from FMS, getLastKeyFrame() always returns an AMFPacket within 1000 ms (which includes the time it takes for the MediaCaster to get hooked up to FMS), but when pulling from Wowza, it can take upwards of 12000 ms.
So, am I correct in assuming that Wowza just doesn't send any video data over the stream until a keyframe is available on the source stream, whereas FMS will generate a keyframe even if there isn't one immediately available on the source?
FMS is most likely going back in time to grab the keyframe from an internal buffer. This adds latency to the stream. When using the live stream repeater, Wowza Media Server does not do this: we do not go back in time to grab a previous keyframe; we instead wait for the next keyframe, so that we do not inject latency into the origin/edge connection. We do go back in time if the connection is not origin/edge, and that is based on the player's NetConnection.bufferTime.
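A back-of-envelope model (my own reading of the answer above, not anything from the Wowza API) makes the tradeoff concrete: with a uniform keyframe interval, waiting for the next keyframe costs up to a full interval of startup delay, while rewinding to the previous keyframe instead injects up to a full interval of stream latency. With the 10-second interval used in the test, the wait-for-next policy predicts up to ~10 s before any video arrives, roughly consistent with the ~12 s observed once connection setup is included.

```java
public class KeyframeTradeoff {
    // Worst-case wait for the NEXT keyframe: nearly a full interval
    // (the keyframe was sent just before we attached).
    public static long worstCaseStartupDelayMs(long keyframeIntervalMs) {
        return keyframeIntervalMs;
    }

    // Worst-case latency injected by rewinding to the PREVIOUS keyframe:
    // also up to a full interval of already-elapsed media.
    public static long worstCaseInjectedLatencyMs(long keyframeIntervalMs) {
        return keyframeIntervalMs;
    }

    public static void main(String[] args) {
        long interval = 10_000; // the 10-second test interval from this thread
        System.out.println("wait for next keyframe: up to "
                + worstCaseStartupDelayMs(interval) + " ms startup delay");
        System.out.println("rewind to last keyframe: up to "
                + worstCaseInjectedLatencyMs(interval) + " ms injected latency");
    }
}
```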
I take it there's nothing exposed in the IMediaCasterNetConnection interface that would allow me to send a value on the repeater connection to get Wowza to include a bufferTime property in the request?
Yes, that's correct: there is nothing exposed in the IMediaCasterNetConnection interface that would let you send a value on the repeater connection to get Wowza to include a bufferTime property in the request.
Let's look at this from the opposite direction. Do you see any clear path on the origin side to force the origin to start sending video data from the last buffered keyframe when the client is a live stream repeater?