Guide: Do You Need Low-Latency Streaming?

Financial trading. Gaming. Sports. News. User-generated content (UGC) and social networking. You name the market, and live-streaming video is all over it. What’s more, audiences are clamoring to create and consume live video that is highly interactive, allowing viewers to communicate both with broadcasters and with each other—and app and service providers are rushing to meet this growing demand.

In light of the increased desire for interactive streaming experiences, one complaint is common from video streamers and viewers: latency is too high. This issue prompts a number of questions from providers: What causes latency? How can it be reduced? And what counts as ultra low latency?

To better answer these questions, we’ve updated our guide on the four primary latency scenarios and the streaming use cases that align with them.

Ready to learn more? Download Live-Streaming Video: Latency Scenarios for Every Use Case.


What Is the Cause of Latency?

The term “latency” describes the delay between when a live stream is captured on camera and when it appears on a viewer’s screen, sometimes referred to as the “glass-to-glass” measurement.

A significant amount of delay may be involved when streaming live video, especially as compared to traditional TV broadcasts. Even though “live” cable feeds have five to 10 seconds of latency, viewers perceive the broadcast signal as being instantly available, in real time, as soon as they turn on their TV—and they expect live-streaming video to perform the same way.

For any streaming workflow, there are a number of factors that can cause latency, including:

  • Encoding and/or transcoding settings.
  • Distribution methods.
  • Network conditions and routing.
  • The video capture workflow (even at a camera level).
  • The players and hardware being used to view live streams.

Many of the streaming technologies commonly used today introduce anywhere from a few seconds to over one minute of latency. For some use cases, this may not be a problem—but for others, it can be catastrophic.
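One way to reason about these factors is to model the total glass-to-glass delay as the sum of per-stage contributions. The following is a minimal sketch in Python; the stage names mirror the factors listed above, but the millisecond values are illustrative assumptions, not measurements.

```python
# Hypothetical per-stage latency budget (milliseconds) for a live stream.
# The stage names mirror the factors above; the values are illustrative only.
latency_budget_ms = {
    "capture": 100,          # camera and capture workflow
    "encode": 500,           # encoder settings (look-ahead, B-frames, packaging)
    "network_ingest": 150,   # contribution link to the origin
    "distribution": 2000,    # CDN / segment-based delivery
    "player_buffer": 3000,   # player buffering before playback starts
}

def glass_to_glass_ms(budget):
    """Total end-to-end ('glass-to-glass') latency is the sum of the stages."""
    return sum(budget.values())

total = glass_to_glass_ms(latency_budget_ms)
print(f"Estimated glass-to-glass latency: {total / 1000:.2f} s")  # → 5.75 s
```

Even with modest per-stage numbers, the stages add up to several seconds, which is why a single tweak rarely fixes latency on its own.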


What Is Low Latency Live Streaming?

Streaming at or near real time is crucial to the user experience for a number of use cases. For example, game-streaming services and two-way conferencing platforms rely on low latency to deliver authentic interactions. For live-streaming news and sports apps, latency must keep pace with cable and satellite TV broadcasts to deliver current coverage and prevent spoilers for viewers.

But all of this presents a challenge for the streaming industry. A balance must be struck between three conflicting constraints, only two of which can be met effectively: low latency, high resolution and all-conditions playback. To complicate things further, universal support for Flash Player in browsers—which has long been the leader in delivering low-latency audio and video streams—is fading fast.

So, what are the alternatives? Luckily, there are several options, depending on your workflow. These include:

  • Traditional streaming protocols (Real-Time Streaming Protocol, or RTSP; Real-Time Messaging Protocol, or RTMP).
  • HTTP-based adaptive streaming protocols (Adobe’s HTTP Dynamic Streaming, or HDS; Apple’s HTTP Live Streaming, or HLS; Microsoft’s Smooth Streaming; Dynamic Adaptive Streaming over HTTP, or MPEG-DASH).
  • Technologies such as WebRTC, Secure Reliable Transport (SRT) and WebSockets.


How Do I Fix My Latency?

There are a number of solutions available for reducing latency, depending on your streaming workflow. The right solution for you will depend on the amount of latency you can live with.

For example, in an HLS-based workflow that doesn’t depend on real-time interactivity, you can use Wowza Streaming Engine™ software to tune HLS streaming for lower-latency delivery. For low-latency streaming to smaller audiences and/or where high-resolution video isn’t a requirement, using a traditional streaming protocol such as RTSP or RTMP may suffice.
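As a rough illustration of why tuning helps: HLS players typically buffer several segments before starting playback, so end-to-end latency scales with segment duration. The sketch below uses the common three-segment rule of thumb; the function name, default values and encode-delay figure are assumptions for illustration, not Wowza-specific settings.

```python
def estimate_hls_latency_s(segment_duration_s, buffered_segments=3, encode_delay_s=1.0):
    """Rough HLS latency estimate: encoder delay plus the segments a player
    typically buffers before playback begins (~3 segments by convention)."""
    return encode_delay_s + segment_duration_s * buffered_segments

# Default 6-second segments vs. tuned 2-second segments:
print(estimate_hls_latency_s(6))  # → 19.0
print(estimate_hls_latency_s(2))  # → 7.0
```

Shrinking segments from six seconds to two cuts the estimate from roughly 19 seconds to roughly 7, at the cost of more requests and less encoding efficiency.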

But when your use case relies on interactivity, you need ultra low latency streaming delivery—which relies on specialized technology.


What Is Considered Ultra Low Latency?

“Ultra low latency” means streaming with just a few seconds of latency (or less). To achieve ultra low latency delivery, you need streaming technology that’s built for speed.

Technologies such as WebRTC, QUIC and low-latency HTTP streaming using MPEG-DASH or CMAF delivery can provide ultra low latency streaming for certain scenarios. The solution that’s right for you will depend on factors such as your audience size and network conditions.
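One way to make the choice concrete is to map a target latency to the candidate technologies named in this article. This is a minimal sketch; the tier thresholds are assumptions, and as noted, real decisions also depend on audience size and network conditions.

```python
def suggest_technologies(target_latency_s):
    """Map a target glass-to-glass latency to candidate delivery technologies.
    The tiers and thresholds are illustrative assumptions, not a specification."""
    if target_latency_s < 1:
        return ["WebRTC"]  # sub-second, interactive use cases
    if target_latency_s < 5:
        return ["WebRTC", "SRT", "Low-Latency HLS / CMAF", "QUIC-based delivery"]
    if target_latency_s < 30:
        return ["Tuned HLS", "MPEG-DASH", "RTMP", "RTSP"]
    return ["Standard HLS", "MPEG-DASH"]

print(suggest_technologies(0.5))  # → ['WebRTC']
```

For example, a two-way conferencing app (sub-second target) lands in the WebRTC tier, while a news app that only needs to keep pace with TV broadcasts can use tuned segment-based delivery.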

Our guide—adapted from content created by two of the foremost experts in streaming media, Chris Knowlton and Timothy Siglin—will tell you all you need to know about delivery options and the use cases they’re best suited for. Download the guide here.




About Holly Regan