Wowza Community

Synchronize two live streams going to two flash players.

I have two cameras shooting the same lecture. They are set up to stream from Wowza as two HLS streams (h264, aac). I have a website with two Flash players (based on OSMF: http://osmfhls.kutu.ru/). If I start both players only after the window.onload event, they will start synchronized most of the time, but there is no guarantee, and they will almost certainly drift out of sync after a while.

Is there a way to make sure the streams are kept in sync? (I can change some of the setup if required, but adaptive bitrate is a must).

If you need tight timing control, HLS is not the solution; even with caching disabled, there can be a lot of drift…

I’d suggest you stick with RTMP and use JWplayer; they have a special adaptive bitrate function for RTMP…

There is a lot involved in trying to do something like this: injecting cuepoints into each stream server-side in some way that can be used to sync them later (a server timecode or markers added by user action, perhaps), reading them in Flash ActionScript, and using that info to sync in the client. We don’t have any examples of this at the moment, but it would obviously involve monitoring these cuepoints in each stream, then probably using NetStream.seek() commands to make adjustments, e.g. to catch up streams that have fallen behind. I’m sure there are many approaches.

To read it in Flash you have to have a callback that matches the signature of the cuepoint. Referring to the example cuepoint in the cuepoint injection article linked above: it is named “onTextData”, it is packaged in an AMFDataMixedArray, and it contains three fields or properties: text, language and trackid.

// Server-side (Wowza module): build the cuepoint and send it on the live stream.
AMFDataMixedArray data = new AMFDataMixedArray();
data.put("text", new AMFDataItem(text));         // e.g. a marker label or timestamp
data.put("language", new AMFDataItem(language)); // e.g. "eng"
data.put("trackid", new AMFDataItem(trackid));
stream.sendDirect("onTextData", data);           // "onTextData" must match the client-side callback name

On the Flash side you need a callback with matching name and Object (corresponding to the AMFDataMixedArray), which will contain the text, language and trackid data. This is created using the NetStream:

var nsPlayClientObj:Object = new Object();
var nsPlay:NetStream = new NetStream(nc); // nc is an already connected NetConnection
nsPlay.client = nsPlayClientObj;          // data events are dispatched to this client object
nsPlayClientObj.onTextData = function(obj:Object):void
{
	trace(obj.text);
	trace(obj.language);
	trace(obj.trackid);
};
nsPlay.play("syncme.mp4");
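
As a rough, untested sketch of what that monitoring-and-seek idea could look like in ActionScript 3: it assumes the server injects the same wall-clock timestamp in milliseconds into both streams as the “text” field of the onTextData cuepoint shown above; the stream names, 500 ms threshold and 2-second check interval are only placeholders.

import flash.events.TimerEvent;
import flash.net.NetStream;
import flash.utils.Timer;

// nc is an already connected NetConnection; stream names are placeholders.
var nsA:NetStream = new NetStream(nc);
var nsB:NetStream = new NetStream(nc);

var lastTsA:Number = NaN, lastTsB:Number = NaN; // last server timestamp (ms) seen on each stream
var timeAtTsA:Number = 0, timeAtTsB:Number = 0; // NetStream.time when that timestamp arrived

function makeClient(isA:Boolean):Object
{
	var client:Object = new Object();
	client.onTextData = function(obj:Object):void
	{
		var ts:Number = Number(obj.text); // assumes "text" carries a millisecond timestamp
		if (isA) { lastTsA = ts; timeAtTsA = nsA.time; }
		else     { lastTsB = ts; timeAtTsB = nsB.time; }
	};
	return client;
}

nsA.client = makeClient(true);
nsB.client = makeClient(false);
nsA.play("camera1");
nsB.play("camera2");

// Every couple of seconds, estimate which wall-clock moment each player is currently
// showing and jump the lagging one forward (only possible if it has enough buffered).
var syncTimer:Timer = new Timer(2000);
syncTimer.addEventListener(TimerEvent.TIMER, function(e:TimerEvent):void
{
	if (isNaN(lastTsA) || isNaN(lastTsB)) return;
	var posA:Number = lastTsA / 1000 + (nsA.time - timeAtTsA);
	var posB:Number = lastTsB / 1000 + (nsB.time - timeAtTsB);
	var drift:Number = posA - posB; // seconds; positive means B is behind
	if (Math.abs(drift) > 0.5)
	{
		var lagging:NetStream = (drift > 0) ? nsB : nsA;
		lagging.seek(lagging.time + Math.abs(drift));
	}
});
syncTimer.start();

In practice, seeking a live stream only works within what is buffered, so adjusting bufferTime or briefly pausing the leading stream may be gentler alternatives; this is just one possible approach.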

If you are going to mark cuepoints by user action, you need a Flash client that plays the live stream with NetStream.bufferTime set to 0, plus controls (a button, data inputs) and ActionScript methods that use NetConnection.call() to call server-side callback functions. The ServerSideModules example that ships with Wowza in the /examples folder is a working reference for all of these methods.
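
A rough, untested sketch of the client side of that: the button and the server-side method name "markSyncPoint" are made up; the method itself would be implemented in a Wowza module, as in the ServerSideModules example.

import flash.events.MouseEvent;
import flash.net.Responder;

nsPlay.bufferTime = 0; // minimal buffering so the marker lands as close to real time as possible

// markButton is assumed to exist on the stage; "markSyncPoint" is a hypothetical
// server-side method that would call stream.sendDirect("onTextData", ...) on each live stream.
markButton.addEventListener(MouseEvent.CLICK, function(e:MouseEvent):void
{
	nc.call("markSyncPoint", new Responder(
		function(result:Object):void { trace("marker set: " + result); },
		function(status:Object):void { trace("marker failed"); }
	), "lecture-sync-marker");
});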

Richard

Yeah, RTMP on iOS is not possible at all, AFAIK… But since one of the great things about Wowza is that it makes your streams available on all protocols, why not use HLS for iOS/Android/whatever else, and RTMP for computer-based playback?

I think other Flash players like Flowplayer also have a multi-bitrate system that reads Wowza’s SMIL files, but I don’t really have experience with them; maybe someone else can chime in.

As for keeping the streams in sync… this is far beyond my own skills, but I suspect some kind of JavaScript to make the players talk to each other would be a good approach. Either that, or implementing a UDP-based stream (with UDP, the player shows whatever it receives and never asks for retransmissions), but don’t ask me how to set that up, haha.

Anyway I hope someone more knowledgeable will reply, I am sure there’s a way to achieve what you need :slight_smile:

You can inject cuepoints in the streams while they are recording, a timestamp that you can use during playback in the client to correct sync.

How to inject cue points or metadata

There is not an easy way to do this.
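
For a rough idea only (untested), a server-side module along these lines could push the same server timestamp into every live stream in the application on a timer, following the onTextData pattern shown earlier; the class name, interval and field values are placeholders.

package com.mycompany.sync; // package and class names are placeholders

import java.util.List;
import java.util.Timer;
import java.util.TimerTask;

import com.wowza.wms.amf.AMFDataItem;
import com.wowza.wms.amf.AMFDataMixedArray;
import com.wowza.wms.application.IApplicationInstance;
import com.wowza.wms.module.ModuleBase;
import com.wowza.wms.stream.IMediaStream;

public class ModuleInjectSyncTimecode extends ModuleBase
{
	private Timer timer;

	public void onAppStart(final IApplicationInstance appInstance)
	{
		// Every 5 seconds, push the same server clock value into every live
		// stream in the application, so players can compare their positions.
		timer = new Timer();
		timer.scheduleAtFixedRate(new TimerTask()
		{
			public void run()
			{
				String now = String.valueOf(System.currentTimeMillis());

				List<IMediaStream> streams = appInstance.getStreams().getStreams();
				for (IMediaStream stream : streams)
				{
					AMFDataMixedArray data = new AMFDataMixedArray();
					data.put("text", new AMFDataItem(now));
					data.put("language", new AMFDataItem("eng"));
					data.put("trackid", new AMFDataItem("1"));
					stream.sendDirect("onTextData", data);
				}
			}
		}, 5000, 5000);
	}

	public void onAppStop(IApplicationInstance appInstance)
	{
		if (timer != null)
			timer.cancel();
	}
}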

Salvadore

I should have said, I also have an HTML5 fallback for iOS devices (and other non-Flash environments). RTMP is not an option for those devices, I believe (but please tell me if I’m mistaken). However, I only need one player for these devices.

Is JWplayer the only player with an adaptive bitrate function for RTMP?

I don’t need the streams to be tightly in sync. But they need to be within a threshold (maybe 500ms), and more importantly not drift further apart over time.

Is there not some way to sync up the two players, using timing information injected by Wowza?

…why not use HLS for iOS/Android/whatever else, and RTMP for computer-based playback?

I still need to test this option, but it sounds like a good way to go, thanks for your help :slight_smile:

You can inject cuepoints in the streams while they are recording, a timestamp that you can use during playback in the client to correct sync.

How to inject cue points or metadata

There is not an easy way to do this.

I have created a Wowza module to inject cuepoints, but I haven’t tested whether it actually works. To achieve this I will have to recompile my Flash player with code to somehow fetch those cuepoints, and then make them available to the JavaScript interface to seek the stream to that point? I don’t really know much about ActionScript/Flash/Flex stuff. Maybe I’ll look into this some more tomorrow. If you, or anyone else, have any info on how to use these cuepoints in the grind-player (or a different player that supports HLS), that would be great!
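
Something roughly like this is what I have in mind for the Flash-to-JavaScript bridge, reusing the nsPlay / nsPlayClientObj names from the snippet earlier in the thread (the JavaScript function names are made up, and none of this is tested):

import flash.external.ExternalInterface;

// Forward every received cuepoint timestamp to the page.
nsPlayClientObj.onTextData = function(obj:Object):void
{
	if (ExternalInterface.available)
		ExternalInterface.call("onSyncCuePoint", obj.text); // page-side function name is made up
};

// Let the page nudge this player to a given position (in seconds).
ExternalInterface.addCallback("seekTo", function(seconds:Number):void
{
	nsPlay.seek(seconds);
});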

Thank you,

Rune.

Hi,

the example suggested (injecting cue points) is for live streams, though they can be preserved in recordings. You can also re-stream a VOD file as a pseudo-live stream using something like the StreamPublisher module. Though as mentioned, the actual implementation for getting your streams in sync may be non-trivial.
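
For reference, the StreamPublisher schedule lives in a streamschedule.smil file in the application’s content folder, roughly along these lines (stream and file names here are only placeholders):

<smil>
	<head></head>
	<body>
		<stream name="camera1"></stream>
		<playlist name="pl1" playOnStream="camera1" repeat="true" scheduled="2015-01-01 00:00:00">
			<video src="mp4:lecture-cam1.mp4" start="0" length="-1"/>
		</playlist>
	</body>
</smil>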

Regards,

Paul

Hi All,

Was there anything further on this thread?

Thanks!

Hi,

The injected metadata should be included in the livestream recordings as long as you enable metadata capture in your recording options. In the Manager UI recording method, this is the Record data option. This metadata will be accessible to any RTMP client.

Michelle

Hello

Does this solution also work in my case?

Description: I have two or more videos that need to be in sync on the client side.

I’m totally new to Wowza.

The videos are encoded in H.264 and published via RTMP to the Wowza server on the same host and machine.

So the final goal is to play these videos on the same computer in 3 different players, totally in sync.

Would the injection procedure work for this?

Is the ID or timecode injected on the Wowza server side? Or is there something similar which I can add to the H.264 stream generated by the encoder machine?

Excuse my bad English.

Thanks in advance.

F

Hi,

the example suggested (injecting cue points) is for live streams, though they can be preserved in recordings. You can also re-stream a VOD file as a pseudo-live stream using something like the StreamPublisher module. Though as mentioned, the actual implementation for getting your streams in sync may be non-trivial.

Regards,

Paul

Thanks for the answer; I know, unfortunately, that it is not trivial. My coder will do the job :slight_smile: In my case the streams will be live streams.

I’m searching the net to find out whether this is possible, and in your experience, is it possible to inject the cue point metadata into the captured H.264 video directly? Will it be transparent to the Wowza server/client via RTMP?

Thanks

F