Wowza Community

Inject Cuepoint

I added a note to the post:

Note this requires Wowza Server 2.0.0.15 (patch 15) or higher

This is a Wowza Application Module. You need the Wowza IDE to compile this to a jar file in the Wowza lib folder, then you will be adding a Module reference to an Application.xml Modules list. The Wowza IDE is here:
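For reference, a Module entry in Application.xml looks like the following. The Name and Description values are arbitrary labels; the Class must be the fully qualified class name of your compiled module (here assumed to be com.wowza.example.module.ModuleInjectCuePoint, the class used later in this thread):

```xml
<Module>
	<Name>ModuleInjectCuePoint</Name>
	<Description>Injects cue points into a stream</Description>
	<Class>com.wowza.example.module.ModuleInjectCuePoint</Class>
</Module>
```

Add the entry at the end of the existing Modules list in Application.xml.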

http://wowza.com/ide.html

Download the IDE and read the included guide to get started.

Richard

Doesn’t seem like a good idea; it would not be faster. Cue points are data stored with the stream (video, audio, data). Shared objects are stored on the server. There is also a way to store data in an external file that is also called cue points, but that’s not the kind of cue point generally discussed around streaming media.

Richard

I don’t think so, not that I know of, not with Wowza. That’s the advantage of the external method of cue points (an XML file) used by many player systems: they are editable.

Richard

JW Player and Flowplayer each have a captions plugin:

http://www.longtailvideo.com/addons/plugins/84/Captions?q=

http://flowplayer.org/plugins/flash/captions.html

Richard

I thought your issue was editing cue points after the fact, i.e., in the recorded video.

Richard

I don’t think there is, as I said. You can use an XML solution when you play the recording.
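To make the external-XML idea concrete, here is a minimal sketch of reading cue points from a caption file on the player side of things, in Java. The `<captions><caption time="...">` schema and the `CaptionFile` class are my own invention for illustration; real player plugins (JW Player, Flowplayer) each define their own caption file formats.

```java
// Sketch: reading cue points from an external XML file instead of from the
// stream itself. Because the captions live in a plain text file rather than
// being muxed into the recording, fixing a subtitle later is just an edit.
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class CaptionFile {
    public static class Caption {
        public final double time;  // seconds from the start of the recording
        public final String text;
        Caption(double time, String text) { this.time = time; this.text = text; }
    }

    // Parse <captions><caption time="1.5">Hello</caption>...</captions>
    public static List<Caption> parse(String xml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        NodeList nodes = doc.getElementsByTagName("caption");
        List<Caption> result = new ArrayList<>();
        for (int i = 0; i < nodes.getLength(); i++) {
            Element el = (Element) nodes.item(i);
            result.add(new Caption(Double.parseDouble(el.getAttribute("time")),
                                   el.getTextContent()));
        }
        return result;
    }
}
```

The player then fires each caption when playback reaches its time value.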

Captionate works only with FLV files under 500 MB, and they don’t mention editing, just injecting. I’ve never used it.

Richard

var ns:NetStream = new NetStream(nc);
ns.addEventListener(NetStatusEvent.NET_STATUS, statusHandler);

var nsClientObj:Object = new Object();
nsClientObj.setCaption = function(caption:String):void
{
	trace(caption);
};
ns.client = nsClientObj;

Where “setCaption” is the cuepoint name.

Richard

This is ActionScript, used in the Flash CS authoring tool or the Flex/Flash Builder IDE.

Do you want our list of independent developers? If so, send an email to support@wowza.com with a link back to this thread.

Richard

I’m using Wowza 3.0 and sendDirect() is not working with VOD streams.

I had to change it to send() to get it to work.

Referring to the example code:

package com.wowza.example.module;

import com.wowza.wms.amf.*;
import com.wowza.wms.client.*;
import com.wowza.wms.module.*;
import com.wowza.wms.request.*;
import com.wowza.wms.stream.*;

public class ModuleInjectCuePoint extends ModuleBase
{
	public void setCaption(IClient client, RequestFunction function, AMFDataList params)
	{
		// method body omitted in the original post
	}
}

Where do I put this code, and how do I compile it?

Thanks for the help.

I don’t quite understand!

With Wirecast I’m sending a live stream through Wowza. Our translators sit in different countries and watch the live stream. They translate simultaneously and want to inject their translations as subtitles into the live stream. Each language has a separate JW Player.

How can my translators send their text to Wowza?

Do I need to write a desktop application that connects to Wowza (how do I connect?) and uses cue points to inject the text into the live stream?

Another question is how to separate the different languages injected into a single live stream.

What is this bit of code doing?

// serialize the AMF data (the cue point payload) to a byte array
byte[] dataData = amfList.serialize();
int size = dataData.length;
synchronized(stream)
{
	// stamp the data packet with the latest audio/video timecode
	// so it lines up with the live stream
	long timecode = Math.max(stream.getAudioTC(), stream.getVideoTC());
	stream.setDataTC(timecode);
	stream.setDataSize(size);
	// append the packet to the stream's data channel
	stream.addDataData(dataData, 0, size);
}

I think

stream.sendDirect("setCaption", data);

should be

stream.send("setCaption", data);

In the Wowza IDE and in the documentation I can’t find a sendDirect method. Can you provide me with a link?

What about the idea of using shared objects instead of cue points for captions?

The clients would see the captions faster than with cue points, but on the other hand the captions will be lost unless I save them on the server in a separate file. I would use a different shared object for each language.

Do you think this is feasible?

Once cue points are injected into a live stream, is it possible to edit and change them afterwards?

We need to correct the subtitle text if there are errors.

Can you give me some hints on how to do it with external XML files?

Maybe a link?

I know those plugins, but we can’t use them for live broadcasts.

At the moment I have implemented a solution with shared objects, but the problem is that the subtitles arrive 20 seconds ahead of the video. The stream is buffering too much.

During a live broadcast we do subtitles in 6 languages. I rewrote the captions plugin to fit our needs, and I also wrote a tool for our translators.

The tool instantly sends the text to the viewers using shared objects, and at the end of the broadcast an XML file containing the subtitles is saved.

If I switch from shared objects to cue points, is it possible to change the cue points after the broadcast?

Or should I simply drop the cue points and use the XML file?
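That end-of-broadcast save step could be sketched like this in Java. The `<captions>` schema and the `CaptionWriter` class are my own invention for illustration, not part of Wowza or any plugin; the point is that text is escaped so a subtitle can’t break the markup:

```java
// Sketch: saving the collected subtitles to an XML file at the end of the
// broadcast, so they can be corrected and replayed later with the recording.
import java.util.LinkedHashMap;
import java.util.Map;

public class CaptionWriter {
    // time in seconds (relative to broadcast start) -> subtitle text
    public static String toXml(Map<Double, String> captions) {
        StringBuilder sb = new StringBuilder("<captions>\n");
        for (Map.Entry<Double, String> e : captions.entrySet()) {
            sb.append("  <caption time=\"").append(e.getKey()).append("\">")
              .append(escape(e.getValue())).append("</caption>\n");
        }
        return sb.append("</captions>\n").toString();
    }

    // escape the XML special characters in the subtitle text
    private static String escape(String s) {
        return s.replace("&", "&amp;").replace("<", "&lt;").replace(">", "&gt;");
    }
}
```

One such file per language keeps the six translations cleanly separated.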

I don’t quite understand how to read the cue points from a live stream.

You wrote:

// set up netstream for the callback on the live stream
nsPlayClientObj:Object = new Object();
nsPlayClientObj.setCaption = function(caption:String):void
{
	trace(caption);
}
nsPlay.client = nsPlayClientObj;

Can you be more specific?

I made a new NetStream and added a handler:

ns = new NetStream(nc);
ns.addEventListener(NetStatusEvent.NET_STATUS, statusHandler);

But where should I place your code so that it reads the cue points when one arrives?