
Thread: Access frame data as they are transcoded

  1. #1


    I am trying to access the raw frame data during transcoding. I have the transcoder set up and working, and I have built a server module project in Eclipse which is being run, but I cannot get onAfterDecodeFrame to fire. Here is the code I have so far:

    class FrameGrabber implements ITranscoderVideoDecoderNotify
    {
    	public void onAfterDecodeFrame(TranscoderSessionVideo sessionVideo, TranscoderStreamSourceVideo sourceVideo, long frameCount) {
    		getLogger().info("Frame: " + String.valueOf(frameCount));
    	}
    }
    When I receive and record an RTSP stream there are no errors in the log, but I do not see the frame count in the logs as I was expecting. I think I need to add it to the appInstance with something like addLiveStreamVideoDecoderListener, but no such method exists. How can I get this code to run after each frame is decoded? What am I missing?

    Many thanks.

  2. #2


    Hi,

    To get this to happen you need to add all the listeners, chained starting from the appInstance:

    Code:
        appInstance.addLiveStreamTranscoderListener(new MyTranscoderCreateNotifier());

        class MyTranscoderCreateNotifier implements ILiveStreamTranscoderNotify
        {
            public MyTranscoderCreateNotifier()
            {
            }

            public void onLiveStreamTranscoderCreate(ILiveStreamTranscoder liveStreamTranscoder, IMediaStream stream)
            {
                ((LiveStreamTranscoder)liveStreamTranscoder).addActionListener(new MyTranscoderActionNotifier());
            }

            public void onLiveStreamTranscoderDestroy(ILiveStreamTranscoder liveStreamTranscoder, IMediaStream stream)
            {
            }

            public void onLiveStreamTranscoderInit(ILiveStreamTranscoder liveStreamTranscoder, IMediaStream stream)
            {
            }
        }

        class MyTranscoderActionNotifier extends LiveStreamTranscoderActionNotifyBase
        {
            public MyTranscoderActionNotifier()
            {
            }

            public void onInitStop(LiveStreamTranscoder liveStreamTranscoder)
            {
                while (true)
                {
                    TranscoderStream transcoderStream = liveStreamTranscoder.getTranscodingStream();
                    if (transcoderStream == null)
                        break;
                    TranscoderSession transcoderSession = liveStreamTranscoder.getTranscodingSession();
                    TranscoderSessionVideo transcoderVideoSession = transcoderSession.getSessionVideo();
                    transcoderVideoSession.addFrameListener(new MyTranscoderVideoDecoderNotifyBase());
                    break;
                }
            }
        }

        class MyTranscoderVideoDecoderNotifyBase extends TranscoderVideoDecoderNotifyBase
        {
            public MyTranscoderVideoDecoderNotifyBase()
            {
            }

            public void onBeforeScaleFrame(TranscoderSessionVideo sessionVideo, TranscoderStreamSourceVideo sourceVideo, long frameCount)
            {
            }

            public void onBeforeDecodeFrame(TranscoderSessionVideo sessionVideo, TranscoderStreamSourceVideo sourceVideo, long frameCount)
            {
            }

            public void onAfterDecodeFrame(TranscoderSessionVideo sessionVideo, TranscoderStreamSourceVideo sourceVideo, long frameCount)
            {
            }
        }
    Be warned though: anything you process inside these callbacks must keep up with decoding, otherwise things will stall and the transcoder's output will be left waiting for your processing.
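    A common way to honour that warning is to do nothing in the callback except hand the frame off to a worker thread and return immediately. This is a minimal sketch of that pattern in plain Java; the class and method names are illustrative, not part of the Wowza API:

    ```java
    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;

    public class FrameWorkQueue {
        // Bounded queue: if analysis falls behind, frames are dropped
        // rather than stalling the transcoder.
        private final BlockingQueue<Long> pending = new LinkedBlockingQueue<>(64);
        private final Thread worker;

        public FrameWorkQueue() {
            worker = new Thread(() -> {
                try {
                    while (true) {
                        long frameCount = pending.take(); // blocks until a frame arrives
                        process(frameCount);
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt(); // shut down cleanly
                }
            }, "frame-worker");
            worker.setDaemon(true);
            worker.start();
        }

        // Call this from onAfterDecodeFrame; it never blocks the decoder thread.
        public boolean submit(long frameCount) {
            return pending.offer(frameCount); // false if the queue is full (frame dropped)
        }

        // Override with the expensive work (file I/O, analysis, etc.).
        protected void process(long frameCount) {
        }
    }
    ```

    With this in place the decoder callback stays cheap even if the per-frame work occasionally takes longer than one frame interval.
    
    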

    Andrew.

  3. #3


    Thanks very much Andrew, I have got this working!

    I would now like to dump the current frame data to a file (ideally in YUV or some other image format). I have looked at sessionVideo and sourceVideo and see popScalerDecodedFrame and grabFrame, but these return Java objects and I can't see how to access the raw image data.

    What is the best way to access the image data of the current frame? I assume onAfterDecodeFrame is the best place to do this.

    Many thanks for your help.

  4. #4


    Hi,

    In onBeforeScaleFrame you could call grabFrame with a class which implements ITranscoderFrameGrabResult, for example:

    Code:
        sourceVideo.grabFrame(new CreateShot(), 640, 480);
    This creates a new snapshot of the frame at the given size. When the grab completes, you can convert the video frame into a BufferedImage:

    Code:
        class CreateShot implements ITranscoderFrameGrabResult
        {
            public CreateShot()
            {
            }

            public void onGrabFrame(TranscoderNativeVideoFrame videoFrame)
            {
                BufferedImage image = TranscoderStreamUtils.nativeImageToBufferedImage(videoFrame);
                if (image == null)
                {
                    // error message here
                }
                else
                {
                    // Do something with the result
                }
            }
        }
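    To actually dump the grabbed frame to disk (the original question), the BufferedImage returned by nativeImageToBufferedImage can be written out with the standard javax.imageio API. A small sketch; the helper class and file naming are my own, not part of Wowza:

    ```java
    import java.awt.image.BufferedImage;
    import java.io.File;
    import java.io.IOException;
    import javax.imageio.ImageIO;

    public class FrameWriter {
        // Writes one decoded frame as a PNG file.
        // Returns false if the image is null or no PNG writer is available.
        public static boolean saveFrame(BufferedImage image, File outFile) throws IOException {
            if (image == null)
                return false;
            return ImageIO.write(image, "png", outFile);
        }
    }
    ```

    Inside onGrabFrame you would call something like `FrameWriter.saveFrame(image, new File("/tmp/frame-" + frameCount + ".png"))`, keeping in mind the earlier warning that slow disk I/O here can stall the transcoder.
    
    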
    Andrew.

  5. #5


    Thanks again Andrew, this was very helpful and I am now able to access data from the individual frames!

    Are these frames from the original video received in the stream, or the frames produced as a result of transcoding? I don't actually need the resulting transcoded video; I just need to analyse the frames received before the video has finished streaming.

    Is there a way to disable the encoding process performed by the transcoder? I have tried PassThru in the transcoder template, but then onBeforeScaleFrame no longer gets called (which is understandable). I would like to avoid using the resources needed to re-encode if possible, as this is expensive and of no use in this context. Is there a way to get the transcoder to just decode the frames from the stream without then encoding them?

    Finally, is it possible to get rtp-live-record to store the H.264 data in an FLV container by default rather than an MP4? I am already receiving RTMP streams from a Flash client which get saved as FLV files, and I would like to avoid having different file types produced by different clients.

    Thanks again for your help, it is very much appreciated!

  6. #6


    Hi,

    With this method you are hooking the decoding side, hence TranscoderVideoDecoderNotifyBase, so these are the source frames as they are decoded.

    There may be a way to disable the encoding, but it would likely require very low-level API calls and I am not sure it is possible; you would have to dig around in the template controls, i.e. ILiveStreamTranscoderActionNotify. By default the transcoder checks whether any encodes are in use and then does the decoding; if there aren't any, it doesn't decode at all.

    The simplest solution I can suggest is to do a very small encode, say a frame size of 200x200 with a bitrate of around 150k. Do not go too low, otherwise CPU usage goes the other way as the transcoder libraries try very hard to hit the lower bitrate.
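    A minimal Encode block along those lines might look like this in the transcoder template. This is a sketch based on the Wowza 3.x transrate.xml format; element names and defaults vary between versions, so check the template files shipped with your install rather than copying this verbatim:

    ```xml
    <Encode>
        <Enable>true</Enable>
        <Name>tiny</Name>
        <StreamName>mp4:${SourceStreamName}_tiny</StreamName>
        <Video>
            <Codec>H.264</Codec>
            <Implementation>default</Implementation>
            <FrameSize>
                <FitMode>fit-height</FitMode>
                <Width>200</Width>
                <Height>200</Height>
            </FrameSize>
            <Profile>baseline</Profile>
            <Bitrate>150000</Bitrate>
        </Video>
        <Audio>
            <Codec>PassThru</Codec>
        </Audio>
    </Encode>
    ```

    The point is only to keep one real video encode active so the decode path (and therefore the frame callbacks) stays enabled, while keeping its cost as small as practical.
    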

    You cannot really save H.264 in an FLV container, so no, that is not possible.

    Andrew.

  7. #7


    Thanks for this; producing a small transcoded video is working well. I was also able to produce an FLV file using the transcoder and PassThru encoding, which, if I understand correctly, doesn't touch the video data and simply repackages it in a different container.

    So I have things nearly working as required, but I have one last problem. I am recording a short 5-second video at 15 fps, so I should get about 75 frames. However, the transcoder only gives me about 60 frames, whereas when I extract the frames from the recorded video file using FFmpeg I get more like 75.

    I did a bit of investigation and started logging the frameCount of onBeforeScaleFrame and onAfterDecodeFrame, which produced interesting results. onAfterDecodeFrame is called 15 times before the first onBeforeScaleFrame, but they both end at the same time, which means onBeforeScaleFrame is called 15 times fewer than onAfterDecodeFrame and explains the 15 missing frames. So I changed to using onAfterDecodeFrame, which produces the same number of frames as FFmpeg does, but the first 15 frames are exactly the same. Looking at the logs, onBeforeDecodeFrame is called 15 times with a sequential frameCount, then my method logs being called 15 times with a non-sequential frameCount, and then they are called alternately as before.

    I can understand that onBeforeScaleFrame may need to wait until the next keyframe before it can best process the set of previous frames, which would explain why it starts 15 frames in, but I would then expect it to carry on for an additional 15 frames after onAfterDecodeFrame has finished, and it doesn't. It feels like it is stopped once the stream finishes, when it should actually be called 15 more times because it starts 15 frames later.

    I can also understand that onAfterDecodeFrame may need to wait until the next keyframe before it can best decode the set of previous frames, which would explain why the first 15 frames all get exported at once (in whatever order they return) after the first 15 frames have been received. However, I would not expect them all to be identical, so it feels like just the first or last frame of that set is being used for all 15. This would also imply a burst of frames after each keyframe, but this does not happen after the second keyframe.

    I wondered if my image export was using too many resources and stalling the transcoder, as you suggested, so I commented out all code except for logging, but I still see exactly the same effects. Shouldn't onBeforeScaleFrame and onAfterDecodeFrame be called the same number of times for each stream?

    Either way I do it, I am losing 1 second, which is 20% of the video, so it is a problem. Could I be experiencing a bug in Wowza? Why did you suggest using onBeforeScaleFrame rather than onAfterDecodeFrame? How can I ensure that I get access to every frame of the video during transcoding?

    Also, is there any way to detect that this is the last frame of the video in the transcoder?

    Thanks again for your help.

  8. #8


    I just upgraded Wowza from 3.6.3 to 3.6.4 in case there was a bugfix that might have helped, but I still get exactly the same behaviour.

    Still struggling to work out a way to access all frames during the transcoding...

  9. #9


    Hi,

    You will have issues with such a small reference timescale. It may be keyframes, it may be how the frames are encoded; it is hard to say. If the stream is being published and then unpublished, it may be doing so before the buffer is flushed. I suspect that, due to the time the transcoder takes to start up, 200-300 ms (perhaps more), you lose the initial packets, or your expectation of the initial packets, which may be the cause.

    There is no way to detect the last frame in the transcoder, but you could use getPlayPackets() on the IMediaStream object to count the packets left.

    Andrew.

  10. #10


    Thanks for your reply Andrew. This got me thinking and I did some more tests. I streamed a video in which each frame was easy to identify, and I can confirm that we are not missing frames at the beginning. Frame 1 produced by FFmpeg exactly matched frame 1 produced by onAfterDecodeFrame and onBeforeScaleFrame. Frame 2 from FFmpeg exactly matches frame 2 from onBeforeScaleFrame, and exactly matches frame 16 from onAfterDecodeFrame, so the first 15 repeated frames produced by onAfterDecodeFrame are repeats of frame 1. So we are not losing any data from the beginning of the stream, which is good news.

    I tried an experiment and added Thread.sleep(5000) into onLiveStreamTranscoderDestroy, and this time many more frames were produced, but the last 15 were all like this (which is not what was recorded):

    [attached image: Frame43]

    The last 15 are all similar but not exactly the same, and it didn't produce the same number of frames as FFmpeg even counting the strange red frames, but I think this confirms that the buffer is being flushed or the transcoding is being stopped too early.

    Is there any way to stop the buffer being flushed or ensure that transcoding continues all the way to the end of the stream?

    I feel this could be considered a bug, as we have not lost any data but the last second of frames is not being transcoded. I also don't think onAfterDecodeFrame repeating the first frame 15 times would be considered expected behaviour, particularly as it does produce the correct number of frames.

    This is currently a show stopper for our project so any help you can give will be very much appreciated.
