
Thread: Reducing latency

  1. #1

    Reducing latency

    I'm working on a project for a new client. This guy has a Wowza server running on an EC2 instance. His streams are audio-only, and he's quite concerned about getting his latency (from microphone input on the encoder to headphone output on the player) as low as possible - ideally under 1/2 second.

    I've followed the advice in http://www.wowza.com/forums/content....re-to-playback, and when I get my hands on an Apple device to test on I'll also implement http://www.wowza.com/forums/content....s-(Flash-HTTP). So far, encoding with FMLE and playing back in the OSMF player, I can get my latency down to around a second. I don't see a difference between sending video plus audio and sending audio only.

    Besides the two articles mentioned above and the discussions that go with them, is there anything more I should be looking at to get my stream latency down? I am using the live-record-lowlatency stream type.

  2. #2


    One second is great for sanjose playback. If you are going to get any better, it will probably have to be RTMP.

    Richard

  3. #3


    Quote Originally Posted by rrlanham
    One second is great for sanjose playback. If you are going to get any better, it will probably have to be RTMP.

    Richard
    I don't have a problem with that. I suspect the client will be dealing mostly with Flash Player users, so that should work; it's how I'm doing my testing currently. Does it sound reasonable to aim for 1/2-second latency with RTMP streaming? If so, what additional things should I be looking at?

  4. #4


    You have already read the guide on low latency streaming. I don't have other info.

    Richard

  5. #5


    Ok, thanks.

  6. #6


    RTMP is a TCP-based protocol.
    That means you will only get latency under 1/2 second on an ideal network, e.g. a local network or an optical link.
    Zero buffering on the playback side can reduce latency a bit, but in general this problem is not solved for RTMP.

    Wikipedia:

    Difference between RTMP and RTMFP

    The principal difference is how the protocols communicate over the network. RTMFP is based on User Datagram Protocol (UDP), whereas RTMP is based on Transmission Control Protocol (TCP). UDP-based protocols have some specific advantages over TCP-based protocols when delivering live streaming media, such as decreased latency and overhead, and greater tolerance for dropped/missing packets, at the cost of decreased reliability.
    You can also read this article for an explanation of why a fixed 1/2-second latency might not be reachable on an arbitrary network.

    If you just want to reduce latency over RTMP, there are some settings, but they do not guarantee 1/2-second latency everywhere:
    1. NetStream.bufferTime = 0 on the client side (see the player sketch after the notes below)
    2. Wowza/conf/Streams.xml on the server side:

    <Stream>
        <Name>my-low-latency</Name>
        <Description>my-low-latency</Description>
        <ClassBase>com.wowza.wms.stream.live.MediaStreamLive</ClassBase>
        <ClassPlay>com.wowza.wms.stream.live.MediaStreamLivePlay</ClassPlay>
        <Properties>
            <Property>
                <Name>maxliveaudiolatency</Name>
                <Value>8000</Value>
            </Property>
            <Property>
                <Name>instantOn</Name>
                <Value>false</Value>
                <Type>Boolean</Type>
            </Property>
            <Property>
                <Name>flushInterval</Name>
                <Value>20</Value>
                <Type>Integer</Type>
            </Property>
            <Property>
                <Name>onFlushNotifyClients</Name>
                <Value>true</Value>
                <Type>Boolean</Type>
            </Property>
            <Property>
                <Name>disableLowBandwidthThrottling</Name>
                <Value>false</Value>
                <Type>Boolean</Type>
            </Property>
            <Property>
                <Name>behindDropDFrames</Name>
                <Value>3000</Value>
                <Type>Integer</Type>
            </Property>
            <Property>
                <Name>behindDropPFrames</Name>
                <Value>3000</Value>
                <Type>Integer</Type>
            </Property>
            <Property>
                <Name>behindDropKFrames</Name>
                <Value>3000</Value>
                <Type>Integer</Type>
            </Property>
            <Property>
                <Name>behindDropAudio</Name>
                <Value>3000</Value>
                <Type>Integer</Type>
            </Property>
        </Properties>
    </Stream>
    Pay attention to:
    1) maxliveaudiolatency = 8000
    2) flushInterval: a smaller flushInterval gives lower latency but increases CPU utilization.
    3) onFlushNotifyClients = true
    4) behindDropAudio = 3000

    You can play with these parameters to get lower latency. It will cost you CPU overhead and some quality degradation.
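
    Here is the client-side part of point 1 as a minimal AS3 frame-script sketch. The server URL and stream name are placeholders, not values from this thread; the only setting that matters for latency here is bufferTime = 0.

    import flash.events.NetStatusEvent;
    import flash.net.NetConnection;
    import flash.net.NetStream;

    var nc:NetConnection = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
    nc.connect("rtmp://example.com/live");   // placeholder server/application URL

    function onStatus(e:NetStatusEvent):void {
        if (e.info.code == "NetConnection.Connect.Success") {
            var ns:NetStream = new NetStream(nc);
            ns.bufferTime = 0;               // no client-side buffer: lowest latency, no jitter protection
            ns.play("myStream");             // placeholder live stream name
        }
    }

    For an audio-only stream this is enough; audio plays directly from the NetStream without a Video object.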
    Last edited by Alex.TBS; 07-19-2012 at 12:16 AM.

  7. #7


    FYI,

    I get .5sec (or less) video latency using:
    1. Stock un-tuned Wowza server.
    2. Wowza VideoChat example with the video window reduced to 180p.

    The round-trip ping latency between my computer and the server is 140ms, which is rather high. So, if your ping latency is lower, you should be able to achieve even better results. I think 100-200ms is possible.

  8. #8


    Alex, your linked article is probably incorrect. If lost packets increased stream latency, we would see latency grow over time on connections with packet loss, and after much testing on such a link I know that is not the case. Selective Acknowledgement (SACK) in TCP is enabled by default, which mitigates the issue. Also, I'm sure most streaming players ignore lost packets rather than increasing playback latency.

    I'm guessing the increased latency of TCP over UDP comes from TCP's initial handshake, which UDP doesn't have.

    Of course, my uninformed speculation could be wrong...

  9. #9


    randall,

    In your example, you wrote:
    "The round-trip ping latency between my computer to the server is 140ms, which is rather high. So, if your ping latency is lower, you should be able to achieve even better results. I think 100-200ms is possible."

    Yes, it is possible to have low latency when the RTT is 140ms, but the loss rate needs to be ~0 for such results. Try a ping command like: $ ping -l 96 -n 1500 host
    to simulate a stream of small packets. If you have low latency and good enough quality, your ping loss rate should be 0 across two to five thousand test packets.

    RTT and loss rate both affect the resulting latency and quality.
    Yes, you can set bufferTime to 0, but it resolves the latency problem by dropping the packets that have fallen behind, and you will see freezes and lags.
    For example, with a UDP stream at 50 packets per second and a 1% loss rate, you lose 5 packets (1%) over a 10-second stream.
    But if an RTMP stream has fallen 1 second (or even 4 seconds) behind, you have to drop 500*1/10 = 50 packets (10% of the stream) just to get back to zero latency.
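
    Spelled out for both lag figures, assuming the same 50 packet/s, 10-second, 1%-loss example:

    \[
    \begin{aligned}
    \text{packets in a 10 s stream} &= 50\ \text{pkt/s} \times 10\ \text{s} = 500\\
    \text{lost to UDP at } 1\% &= 0.01 \times 500 = 5\ \text{packets}\ (1\%)\\
    \text{dropped to erase 1 s of lag} &= 50\ \text{pkt/s} \times 1\ \text{s} = 50\ \text{packets}\ (10\%)\\
    \text{dropped to erase 4 s of lag} &= 50\ \text{pkt/s} \times 4\ \text{s} = 200\ \text{packets}\ (40\%)
    \end{aligned}
    \]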
    Last edited by Alex.TBS; 07-20-2012 at 06:12 AM.

  10. #10


    Ok, I think I see your point now. For regular streaming, a single dropped packet on a high-latency connection will pause the stream for at least the connection's round-trip latency while waiting for TCP retransmission, whereas UDP just loses one packet. You're right, but if you have 1-4 seconds of latency you're probably not doing live chat, so with a regular stream you can set buffers to mitigate this effect, and I'm not seeing the benefit of UDP there.

    Now let's imagine we're doing a "real-time" live chat on an average connection with 140ms round-trip latency. Say our bitrate gives us a packet every 50ms, and one packet is dropped. Once again, with TCP the receiver must wait for retransmission:

    After the loss:
    50ms later, the next packet arrives and triggers a duplicate ACK
    ~70ms after that, the sender receives the DUP-ACK and retransmits
    ~70ms after that, the receiver gets the retransmission

    So we have maybe a ~200ms glitch in our TCP stream, but with UDP we only have a 50ms glitch. Sound right?
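
    Summing those steps, with the one-way delay taken as half the 140ms RTT:

    \[
    \text{TCP glitch} \approx 50 + 70 + 70 = 190\ \text{ms} \approx 200\ \text{ms}, \qquad \text{UDP glitch} \approx 50\ \text{ms}
    \]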

    That makes sense for audio, where packets are discrete, but with H.264 baseline-profile video we have P-frames that are built from previous frames. With UDP a dropped frame causes corruption for, on average, a little more than half the keyframe interval, whereas with TCP the stream can be rebuilt correctly, which ideally means a frozen picture for ~200ms instead of green blocking/corruption for ~2 seconds with a 4-second keyframe interval. So I think UDP might be better for low-latency voice applications, but TCP would be better for video, and especially better for recording.
