
Thread: No onTextData calls received for Captions in Pseudo-Live stream

  1. #1


    Hi,
    I am running a pseudo-live stream based on a playlist, as described here.

    I have TTML files alongside the videos, and they are properly loaded and parsed by the "vodcaptionproviderttml" TimedTextProvider; however, the caption data is never sent to my Flash clients connected over RTMP.

    When I put a breakpoint inside 'getLanguageRenditionAMF()', I can see that the TTML is being loaded and processed. On that call 'trackIndex' was 99; I also hit the same breakpoint twice with the same data on trackIndex=2.



    This is about as far as the provider gets. There are no calls to any of the other methods past that point.

    I have added breakpoints & trace statements into my Flash client for the onTextData method, and it is never called.

    Is there a configuration step I am missing?
    Could this be because the Application/Streams/StreamType is set to 'live'? Do I need to perform some trickery to get the VODTimedTextProvider to operate on a 'live' stream?

    Thanks!
    Chris

  2. #2

    Hi Chris,
    Currently there is no mechanism to ingest the VOD caption files into a pseudo-live stream as you have been attempting to do.

    You would need to inject the caption data into the live stream using a module such as the example ModulePublishOnTextData found here, and then convert the injected onTextData into CEA-608 captions, if required, using ModuleOnTextDataToCEA608, also referenced on that page.
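
    In essence, "injecting" just means sending an onTextData AMF data event on the live stream itself. A minimal sketch of that (the wrapper class and method names here are illustrative only, not part of either module):

    import com.wowza.wms.amf.*;
    import com.wowza.wms.stream.*;
    
    // Illustrative sketch: send one caption as an onTextData event on a live stream.
    // Assumes you already hold an IMediaStream reference, e.g. inside a module callback.
    public class OnTextDataSketch
    {
    	static void sendCaption(IMediaStream stream, String text)
    	{
    		AMFDataObj amfData = new AMFDataObj();
    		amfData.put("text", new AMFDataItem(text));        // the caption text
    		amfData.put("language", new AMFDataItem("eng"));   // ISO 639-2 language code
    		amfData.put("trackid", new AMFDataItem(99));       // track id the player listens for
    
    		stream.sendDirect("onTextData", amfData);          // inject the data event
    		((MediaStream)stream).processSendDirectMessages(); // flush it immediately
    	}
    }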

    Daren

  3. #3


    Hi Daren,
    Thank you for the help; I was afraid that might be the case.

    Since I am able to load the TTML data, and it is processed properly, could I use that Timed Text provider to broadcast the onTextData calls?

    The example code shows loading text from a raw file and converting it to AMF - but would I be able to use com.wowza.wms.timedtext.vod.model.VODCaptionProviderTTML.getTimedTextAMF() to retrieve the caption data for a given time frame?

    I cannot find documentation on these APIs, so I am kind of flying blind. Any direction you can provide will be much appreciated! (Once I get it working I'd be happy to write up an article on it if you like)

    Thanks again
    Chris

  4. #4

    Chris,

    Take a look at this module as a starting place. It converts SRT to onTextData, so it will have to be modified to use a TTML file instead.

    package test;
    
    import java.nio.charset.Charset;
    import java.util.*;
    
    import com.wowza.wms.amf.*;
    import com.wowza.wms.application.IApplicationInstance;
    import com.wowza.wms.livestreamrecord.model.*;
    import com.wowza.wms.media.h264.H264SEIMessages;
    import com.wowza.wms.media.model.*;
    import com.wowza.wms.module.ModuleBase;
    import com.wowza.wms.stream.*;
    import com.wowza.wms.timedtext.model.*;
    
    // Module created by brian and scott to:
    // 1. read SRT data and inject it as CEA-608 data into a live stream created by the stream demo publisher
    // 2. create a VOD asset with the CEA-608 data in it
    public class ModulePublishSRTAsOnTextData extends ModuleBase
    {
    	public class MySEIListener implements IMediaStreamH264SEINotify
    	{
    		TimedTextEntry currCaption = null;
    		
    		public void onVideoH264Packet(IMediaStream stream, AMFPacket packet, H264SEIMessages seiMessages)
    		{
    			String text = null;
    			boolean sendEvent = false;
    			long currTime = packet.getAbsTimecode();
    			
    			if (!hasSrtFile())
    				return;
    			
    			TimedTextEntry caption = getCaption(currTime);
    			// set text to current active caption
    			if (caption != null && caption != currCaption)
    			{
    				text = caption.getText();
    				sendEvent = true;
    			}
    			// if we have an event, send it
    			if (sendEvent)
    			{
    				sendTextDataMessage(stream, text);
    				this.currCaption = caption;
    				getLogger().info("------- packet Time="+currTime+" "+text);
    			}
    		}
    	}
    	
    	public class MyMediaStreamListener implements IMediaStreamActionNotify3
    	{
    		private Map<String, ILiveStreamRecord> recorders = new HashMap<String, ILiveStreamRecord>();
    
    
    		public void onPublish(IMediaStream stream, String streamName, boolean isRecord, boolean isAppend)
    		{
    			IApplicationInstance appInstance = stream.getStreams().getAppInstance();
    			
    			if (!stream.isTranscodeResult())
    			{
    				// read the .srt file for this stream if it exits
    				List<TimedTextEntry> list = simpleSRTParse(appInstance, stream);
    				setTimedTextList(list);
    				if (hasSrtFile())
    					startRecording(stream, streamName);
    			}
    		}
    
    		public void onUnPublish(IMediaStream stream, String streamName, boolean isRecord, boolean isAppend)
    		{
    			// clear the list
    			setTimedTextList(null);
    			stopRecording(stream, streamName);
    		}
    
    		public void onMetaData(IMediaStream stream, AMFPacket metaDataPacket)
    		{
    		}
    
    		public void onPauseRaw(IMediaStream stream, boolean isPause, double location)
    		{
    		}
    
    		public void onPause(IMediaStream stream, boolean isPause, double location)
    		{
    		}
    
    		public void onPlay(IMediaStream stream, String streamName, double playStart, double playLen, int playReset)
    		{
    		}
    
    		public void onSeek(IMediaStream stream, double location)
    		{
    		}
    
    		public void onStop(IMediaStream stream)
    		{
    		}
    
    		public void onCodecInfoVideo(IMediaStream stream, MediaCodecInfoVideo codecInfoVideo)
    		{
    		}
    
    		public void onCodecInfoAudio(IMediaStream stream, MediaCodecInfoAudio codecInfoAudio)
    		{
    		}
    		
    		private void startRecording(IMediaStream stream, String streamName)
    		{
    			//create a livestreamrecorder instance to create .mp4 files
    			ILiveStreamRecord recorder = new LiveStreamRecorderMP4();
    		    recorder.setRecordData(false);
    		    recorder.setStartOnKeyFrame(true);
    		    recorder.setVersionFile(true);
    		       
    			// add it to the recorders list
    			synchronized (recorders)
    			{
    				ILiveStreamRecord prevRecorder = recorders.get(streamName);
    				if (prevRecorder != null)
    					prevRecorder.stopRecording();
    				recorders.put(streamName, recorder);
    			}
    			// start recording, create 1 minute segments using default content path
    			//System.out.println("--- startRecordingSegmentByDuration for 60 minutes");
    			//recorder.startRecordingSegmentByDuration(stream, null, null, 60*60*1000);
    			// start recording, create 1MB segments using default content path
    			//System.out.println("--- startRecordingSegmentBySize for 1MB");
    			//recorder.startRecordingSegmentBySize(stream, null, null, 1024*1024);
    			// start recording, create new segment at 1:00am each day.
    			//System.out.println("--- startRecordingSegmentBySchedule every \"0 1 * * * *\"");
    			//recorder.startRecordingSegmentBySchedule(stream, null, null, "0 1 * * * *");
    			
    		    // start recording, using the default content path, do not append (i.e. overwrite if file exists)
    			getLogger().info("--- startRecording");
    			String filePath = "C:\\temp\\"+streamName+"-cc.mp4";
    			recorder.startRecording(stream, filePath, false);
    		    //recorder.startRecording(stream, false);
    		    
    		    // log where the recording is being written
    			getLogger().info("startRecording[" + stream.getContextStr() + "]: new Recording started:" + recorder.getFilePath());
    
    		}
    
    		private void stopRecording(IMediaStream stream, String streamName)
    		{
    			ILiveStreamRecord recorder = null;
    			synchronized (recorders)
    			{
    				recorder = recorders.remove(streamName);
    			}
    			
    			if (recorder != null)
    			{
    				// grab the current path to the recorded file
    				String filepath = recorder.getFilePath();
    				
    				// stop recording
    				recorder.stopRecording();
    				getLogger().info("stopRecording[" + stream.getContextStr() + "]: File Closed:" + filepath);
    			}
    			else
    			{
    				getLogger().info("stoprecording[" + stream.getContextStr() + "]: streamName:" + streamName + " stream recorder not found");
    			}
    		}
    	}
    
    	// local vars
    	private List<TimedTextEntry> timedTextList = null;
    	private boolean charsetTest = false;
    	private final Charset UTF8_CHARSET = Charset.forName("UTF-8"); 
    	private boolean foundSrt = false;
    
    	// app startup processing
    	public void onAppStart(IApplicationInstance appInstance)
    	{
    		getLogger().info("ModulePublishSRTAsOnTextData.onAppStart["+appInstance.getContextStr()+"]");
    		
    		String onTextDataFile = "${com.wowza.wms.context.VHostConfigHome}/content/ontextdata.txt";
    		//publishInterval = appInstance.getProperties().getPropertyInt("publishOnTextDataPublishInterval", publishInterval);
    		//onTextDataFile = appInstance.getProperties().getPropertyStr("publishOnTextDataFile", onTextDataFile);
    		charsetTest = appInstance.getProperties().getPropertyBoolean("publishOnTextCharsetTest", charsetTest);
    
    		Map<String, String> pathMap = new HashMap<String, String>();
    		pathMap.put("com.wowza.wms.context.VHost", appInstance.getVHost().getName());
    		pathMap.put("com.wowza.wms.context.VHostConfigHome", appInstance.getVHost().getHomePath());
    		pathMap.put("com.wowza.wms.context.Application", appInstance.getApplication().getName());
    		pathMap.put("com.wowza.wms.context.ApplicationInstance", appInstance.getName());
    	}
    	
    	// hookup stream listeners
    	public void onStreamCreate(IMediaStream stream)
    	{
    		stream.addClientListener(new MyMediaStreamListener());
    		stream.addVideoH264SEIListener(new MySEIListener());
    	}
    	
    	// save the timedTextList
    	private void setTimedTextList(List<TimedTextEntry> list)
    	{
    		this.timedTextList = list;
    	}
    	
    	// find and parse .srt file for the specified stream
    	private List<TimedTextEntry> simpleSRTParse(IApplicationInstance appInstance, IMediaStream stream)
    	{	
    		List<TimedTextEntry> list = null;
    		String extension = ITimedTextConstants.TIMED_TEXT_READER_EXTENSION_SRT;
    		String fileName = stream.getName()+"."+extension;
    		String contentPath = stream.getStreamFileForRead().getParent();  // get stream content path
    		
    		// create and configure a MediaReaderItem for use with TimedTextReaderFactory
    		MediaReaderItem mri = new MediaReaderItem(ITimedTextConstants.TIMED_TEXT_READER_EXTENSION_SRT, ITimedTextConstants.DEFAULT_TIMED_TEXT_READER_SRT);
    		mri.setFileExtension(ITimedTextConstants.TIMED_TEXT_READER_EXTENSION_SRT);
    		// create a TimedTextReader for the .srt file associated with this stream
    		ITimedTextReader reader = TimedTextReaderFactory.getInstance(appInstance, mri, contentPath, fileName, extension);
    		
    		if (reader != null)
    		{
    			reader.open();
    			TimedTextRepresentation tt = reader.getTimedText();
    			reader.close();
    			if (tt != null)
    			{
    				TimedTextLanguageRendition rend = tt.getLanguageRendition(Locale.getDefault().getISO3Language());
    				// get the list of TimedTextItems
    				list = rend.getTimedText();
    				this.foundSrt = true;
    			}
    			else
    			{
    				getLogger().info("--- No srt file found for "+contentPath+"\\"+stream.getName());
    			}
    		}
    		//dumpTimedTextList(list);
    		return list;
    	}
    	
    	// send OnTextData event
    	private void sendTextDataMessage(IMediaStream stream, String text)
    	{
    		try
    		{
    			AMFDataObj amfData = new AMFDataObj();
    			
    			amfData.put("text", new AMFDataItem(text));
    			amfData.put("language", new AMFDataItem("eng"));
    			amfData.put("trackid", new AMFDataItem(99));
    							
    			stream.sendDirect("onTextData", amfData);
    			((MediaStream)stream).processSendDirectMessages();				
    		}
    		catch(Exception e)
    		{
    			getLogger().error("ModulePublishSRTAsOnTextData.sendTextDataMessage["+stream.getContextStr()+"]: "+e.toString());
    			e.printStackTrace();
    		}
    	}
    
    	// get the caption active during the time passed in
    	private TimedTextEntry getCaption(long time)
    	{
    		TimedTextEntry entry = null;
    		Iterator<TimedTextEntry> itr = this.timedTextList.iterator();
    		while(itr.hasNext())
    		{
    			TimedTextEntry tte = itr.next();
    			if (tte.getStartTime() <= time && time < tte.getEndTime())
    			{
    				entry = tte;
    				break;
    			}
    		}
    		return entry;
    	}
    	
    	private boolean hasSrtFile()
    	{
    		return this.foundSrt;
    	}
    
    	// dump the list of TimedTextEntries created from .srt file
    	private void dumpTimedTextList(List<TimedTextEntry> list)
    	{
    		Iterator<TimedTextEntry> itr = list.iterator();
    		getLogger().info("--- TimedTextList ----");
    		while(itr.hasNext())
    		{
    			TimedTextEntry tte = itr.next();
    			getLogger().info("s:"+tte.getStartTime()+", "+tte.getText()+", e:"+tte.getEndTime());
    		}
    		getLogger().info("--- ------------ ----");
    	}
    	
    }
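
    To load the module, it has to be added to the <Modules> list in the application's Application.xml. A typical entry (the class name below assumes the test package used above) would look like:

    <Module>
    	<Name>ModulePublishSRTAsOnTextData</Name>
    	<Description>Publish SRT captions as onTextData</Description>
    	<Class>test.ModulePublishSRTAsOnTextData</Class>
    </Module>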
    Richard

  5. #5


    Thank you Richard! That was exactly what I needed!

  6. #6

    Hello Richard, I have the same scenario: a playlist, but using SRT files. I used your module, and I can see in the Wowza log that the captions are loaded, but they never appear in my stream. Do you know if I missed something?
    I have activated the 'OnData events in live streams' option for my application.
    I'm using Eclipse to step through your code, and it seems to run fine.

    Thank you!

  7. #7


    I had to do some tweaking to the code to get it to work with a playlist. Each individual caption file (SRT/TTML) contains its own time codes, which start at 0, but when running a playlist only the first video actually begins at timecode=0.

    The second video actually begins at a timecode equal to the duration of the first video.

    I had to keep track of the time at which each video change occurs, and use that to calculate the offset to apply to each video's caption file.

    I'll see if I can track down the code.
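
    The gist, as a rough sketch (names are illustrative, not the actual code): remember the absolute stream time at which the current playlist item started, and subtract it before looking up captions, since each caption file's timeline restarts at 0.

    	// Illustrative sketch of the offset fix, building on getCaption() from
    	// Richard's module above. Caption files restart at 0 for every playlist
    	// item, so convert absolute stream time to time-within-the-current-item.
    	private long currentItemStartTime = 0; // update whenever a new playlist item begins
    
    	private TimedTextEntry getCaptionWithOffset(long absoluteStreamTime)
    	{
    		long timeWithinItem = absoluteStreamTime - currentItemStartTime;
    		return getCaption(timeWithinItem);
    	}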
