I have an interesting scenario: some live feeds from cameras that are 'started' on demand.
In a nutshell:
A call is made from the client (in this case an iPhone) to the server.
The server starts a specific stream from a live source (an IP camera).
First the transcoder (GStreamer/ffmpeg) starts, and its output is then piped to a stream that is started on the fly through jmxcl.
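To make the flow concrete, the server-side start is roughly along these lines (a minimal sketch: the camera URL, RTMP URL, and ffmpeg flags are illustrative placeholders, and in my actual setup the Wowza-side stream is started separately via jmxcl):

```java
import java.io.IOException;

public class TranscoderLauncher {

    /**
     * Launches ffmpeg to pull the camera feed and push it to Wowza.
     * All URLs here are placeholders for illustration only.
     */
    public static Process startTranscoder(String cameraUrl,
                                          String wowzaRtmpUrl) throws IOException {
        ProcessBuilder pb = new ProcessBuilder(
                "ffmpeg",
                "-i", cameraUrl,      // e.g. rtsp://camera.local/stream1
                "-c", "copy",         // passthrough; swap in real transcode flags
                "-f", "flv",
                wowzaRtmpUrl);        // e.g. rtmp://wowza.local/live/myStream
        pb.redirectErrorStream(true); // merge stderr into stdout for logging
        return pb.start();
    }
}
```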
This all works (hurrah!), except there is obviously some lag between when the call is made to start the show and when the stream is ready to be played: sometimes as little as 5 seconds, sometimes as much as 15.
Instead of trying to guesstimate when to hand the stream URL to the iOS player framework, is there some call I can make to Wowza to find out that the stream is 'ready' and passing data?
Do you think jmxcl's getIOInByteRate would be useful in this instance? Or is there some other way I can detect that the stream is 'ready'?
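What I have in mind is the server polling something like a per-stream byte rate until it goes above zero, and only then returning the playback URL to the iPhone. A rough sketch of the kind of thing I mean (the JMX service URL, MBean object name, and attribute name are placeholders, not confirmed Wowza names; I would browse the MBean tree in jconsole to find the real ones):

```java
import javax.management.MBeanServerConnection;
import javax.management.ObjectName;
import javax.management.remote.JMXConnector;
import javax.management.remote.JMXConnectorFactory;
import javax.management.remote.JMXServiceURL;

public class StreamReadyCheck {

    /**
     * Polls a per-stream byte-rate attribute over JMX and returns true once
     * data is flowing. The bean and attribute names passed in are placeholders:
     * look up the real ones for your server and stream in jconsole.
     */
    public static boolean waitForStream(String jmxUrl, String beanName,
                                        String attribute, int maxSeconds)
            throws Exception {
        JMXConnector connector =
                JMXConnectorFactory.connect(new JMXServiceURL(jmxUrl));
        try {
            MBeanServerConnection mbsc = connector.getMBeanServerConnection();
            ObjectName stream = new ObjectName(beanName);
            for (int i = 0; i < maxSeconds; i++) {
                Number rate = (Number) mbsc.getAttribute(stream, attribute);
                if (rate != null && rate.doubleValue() > 0) {
                    return true;    // bytes are arriving: safe to hand out the URL
                }
                Thread.sleep(1000); // poll once per second
            }
            return false;           // gave up: stream never started sending data
        } finally {
            connector.close();
        }
    }
}
```

The server would call this right after kicking off the transcoder, and only hand the URL to the iOS player on a true result.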
If it is not ready and I pass the URL to the iOS media player, I just get a broken video icon.