What Is HLS (HTTP Live Streaming)?

September 6, 2019

With Flash set to reach the end of its life in a few short months, Apple's HTTP Live Streaming (HLS) protocol has become the preferred way to deliver streaming video to users. HLS is an adaptive, HTTP-based protocol used for transporting video and audio data from media servers to viewers' screens. Whether you're watching live video in an app on your phone or on-demand content on your TV, chances are that HLS streaming is involved. This is especially likely if you're using an Apple device.

Videos delivered using HTTP are not technically "streams." Rather, they're progressive downloads sent via regular web servers. The industry has shifted in favor of these protocols because they deliver the best possible video quality and viewer experience no matter the connection, software, or device. MPEG-DASH and Apple's HLS are the two most common HTTP-based protocols.

In our 2019 Low-Latency Report, more than 45% of participants indicated that they use the HLS protocol for content distribution. The popularity of HLS over its alternatives can be chalked up to playback compatibility and quality of experience: macOS, Windows, Android, and Linux devices can all play streams delivered using HLS.

 

Apple HLS Snapshot

Summary:
  • HLS is a protocol developed for delivering live and on-demand streaming content to Apple devices that leverages HTTP technology for scalability and adaptive bitrate streaming.

Features:
  • Closed captions
  • Fast forward and rewind
  • Alternate audio and video
  • Fallback alternatives
  • Timed metadata
  • Ad insertion
  • Content protection

 

Apple HLS Technical Specifications

  • Audio Codecs: AAC-LC, HE-AAC+ v1 & v2, MP3
  • Video Codecs: H.265, H.264
  • Playback Compatibility: Great (all Google Chrome browsers; Android, Linux, Windows, macOS, and iOS devices; several set-top boxes, smart TVs, and other players)
  • Benefits: Adaptive bitrate and widely supported
  • Drawbacks: Quality of experience is prioritized over low latency
  • Latency: 6-30 seconds (lower latency only possible when tuned)
  • Variant Formats: Low-Latency HLS (see below), PHLS (Protected HTTP Live Streaming)

 

Adaptive Bitrate Streaming With HLS

To deliver the highest-quality stream possible to everyone watching — including those with small screens and poor connections — HLS streaming dynamically adapts the resolution to each individual’s circumstances. Called adaptive bitrate streaming, this allows broadcasters to deliver high-quality streams to users with outstanding bandwidth and processing power, while also accommodating those lacking in the speed and power departments.

Rather than creating one live stream at one bitrate, a transcoder (usually located in the server) is used to create multiple streams at different bitrates and resolutions. The media server then sends the highest-resolution stream possible for each viewer’s screen and connection speed.

Creating multiple renditions of a single stream helps prevent buffering or stream interruptions. Plus, as a viewer’s signal strength goes from two bars to three, the stream dynamically adjusts to deliver a superior rendition.
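The player-side selection logic can be sketched in a few lines. This is a simplified illustration rather than an actual player implementation; the rendition ladder and the 0.8 bandwidth headroom factor are assumptions made for the example:

```python
# Hypothetical rendition ladder (bitrates in kbps) produced by the transcoder.
RENDITIONS = [
    {"resolution": "416x234", "bitrate": 145},
    {"resolution": "640x360", "bitrate": 365},
    {"resolution": "768x432", "bitrate": 730},
    {"resolution": "1280x720", "bitrate": 3000},
    {"resolution": "1920x1080", "bitrate": 7800},
]

def select_rendition(measured_kbps, renditions=RENDITIONS, headroom=0.8):
    """Pick the highest-bitrate rendition that fits within the measured
    bandwidth, leaving some headroom to absorb throughput dips."""
    budget = measured_kbps * headroom
    affordable = [r for r in renditions if r["bitrate"] <= budget]
    if not affordable:
        # Even the lowest rung exceeds the budget; serve it anyway
        # rather than stalling playback entirely.
        return min(renditions, key=lambda r: r["bitrate"])
    return max(affordable, key=lambda r: r["bitrate"])
```

As a viewer's measured throughput rises or falls between segment downloads, the player simply re-runs this decision and requests the next segment from the matching rendition.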

 

Adaptive Bitrate Streaming to Multiple Devices

 

How HLS Works

Unlike most HTTP-based protocols, which use the ISO Base Media File Format, HLS uses the MPEG-2 Transport Stream container. Apple has also announced support for the CMAF fragmented .mp4 format, but this is still being adopted.

In other words, HLS streams are delivered in the .ts format, while protocols like DASH almost uniformly use .mp4 containers. These .ts files typically contain H.264 encoded video and AAC encoded audio.

As described above, these are broken up into segments of data (also called chunks or packets) at the media server rather than being delivered as a continuous flow of information. This is what allows the player to shift between different renditions depending on available resources.
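As a rough sketch, a media playlist for a short six-second-segment stream might look like the following; the segment names and durations are illustrative, since real playlists are generated by the segmenter:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:6.0,
segment0.ts
#EXTINF:6.0,
segment1.ts
#EXTINF:6.0,
segment2.ts
#EXT-X-ENDLIST
```

Each `#EXTINF` entry gives a segment's duration, and the player fetches the listed .ts files in order.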

 

16:9 Aspect Ratio    H.264/AVC Bitrate (kbps)    Framerate
416 x 234            145                         ≤30 fps
640 x 360            365                         ≤30 fps
768 x 432            730                         ≤30 fps
960 x 540            2000                        ≤30 fps
1280 x 720           3000                        same as source
1280 x 720           4500                        same as source
1920 x 1080          6000                        same as source
1920 x 1080          7800                        same as source

HLS Encoding Targets

 

.M3U8 Manifest File

These segments are indexed into a media playlist so that the video player understands how to organize the data. A master .m3u8 playlist file must also be created — think of this as the index of indexes — to instruct the player on how to jump between the variant-specific playlists. This is also referred to as the manifest file. Anyone delivering the stream can then distribute the content by embedding the .m3u8 reference URL in a web page or creating an application that downloads the file.
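A minimal master playlist tying three renditions together might look like the following sketch; the bandwidth values, resolutions, and paths are illustrative:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=145000,RESOLUTION=416x234
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=3000000,RESOLUTION=1280x720
mid/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=7800000,RESOLUTION=1920x1080
high/index.m3u8
```

The player reads the `BANDWIDTH` and `RESOLUTION` attributes to decide which variant-specific playlist to load.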

HLS Streaming Workflow

Source: Apple

 

Segment Size and Latency

Up until 2016, Apple recommended using ten-second segments for HLS. The specification also required three segments to be loaded before playback could begin. By sticking with the ten-second recommendation, broadcasters would start out with 30 seconds of latency based on segment size alone. Apple has since decreased the default segment size to six seconds, but that still means that a ‘live’ stream might lag almost 20 seconds behind.

A popular way to decrease this latency is to reduce the size of segments. Shorter chunks enable faster download times, thereby speeding things up. But that’s not the only route to speedy streaming with HLS. Apple recently announced the specs for an extension called Apple Low-Latency HLS.
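The arithmetic behind those latency figures is straightforward: because the player buffers a fixed number of segments before playback begins, segment duration sets a floor on delay. A quick sketch, using the three-segment buffering requirement described above:

```python
def minimum_startup_latency(segment_duration_s: float, segments_buffered: int = 3) -> float:
    """Latency floor imposed by segmenting alone: the player must
    download this many full segments before playback can begin."""
    return segment_duration_s * segments_buffered

print(minimum_startup_latency(10))  # 10-second segments: 30 seconds behind live
print(minimum_startup_latency(6))   # 6-second segments: 18 seconds behind live
```

Real-world latency is higher still, since encoding, network transit, and player buffering beyond the minimum all add their own delay on top of this floor.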

 

Apple Low-Latency HLS

Apple designed this extension to drive latency down at scale. The protocol relies on HTTP/2 PUSH delivery combined with shorter media chunks. Unlike standard HLS, Apple Low-Latency HLS doesn’t yet support adaptive bitrate streaming — but it is on the roadmap.

  • Playback Compatibility: Any players that aren’t optimized for Low-Latency HLS can fall back to standard (higher-latency) HLS behavior
  • Benefits: Low latency meets HTTP-based streaming
  • Drawbacks: As an emerging spec, vendors are still implementing support
  • Latency: 3 seconds or less

Note that Apple Low-Latency HLS should not be confused with the open-source Low-Latency HLS solution (LHLS) developed by Periscope. The main difference between the two is the delivery method: unlike Apple's extension, Periscope's version uses chunked transfer encoding.

Because leaders across the industry are working to support Apple’s extension, it will likely become a preferred technology for live sports, e-gaming, and interactive streaming.

 

HLS Scalability

Unlike the RTMP protocol used in conjunction with Flash player, HLS can easily scale for delivery using ordinary web servers across a global content delivery network (CDN). By sharing the workload across a network of servers, CDNs accommodate viral viewership spikes and larger-than-expected live audiences. They also help improve viewer experience by caching audio and video segments.

By comparison, CDN support for RTMP is rapidly declining. RTMP also requires the use of a dedicated streaming server, making it more resource-heavy to deploy.

 

HLS Hardware and Software

For instructions on how to configure your HLS stream, check out Apple’s recommendations.

Most content distributors choose to send their encoded video to the server using RTMP and then transcode the video to support HLS once it reaches the server. Encoders like the Wowza ClearCaster™ appliance can be used to do this and more. It's also wise to deliver video in additional formats (such as MPEG-DASH) to make sure that viewers across a broader range of devices can view the content. A cloud-based service like Wowza Streaming Cloud™ or streaming server software like Wowza Streaming Engine™ can be used for live transcoding.

 

HTML5 Player

Lastly, viewers will need either a compatible device or an HTML5 player. HLS has become the de facto standard in the wake of Adobe Flash’s decline, meaning that most devices and browsers already have this functionality built in. Anyone using Wowza Streaming Cloud or Wowza Streaming Engine also can use the embeddable Wowza Player for free.

 

When Not to Use HLS

Any use case requiring sub-one-second delivery — such as web conferencing, real-time device control for cameras and drones, or situational awareness — requires a protocol like WebRTC (Web Real-Time Communications). Even Apple Low-Latency HLS comes with an inherent delay that’s unacceptable for some scenarios.

 

When to Use HLS

Because HLS is currently the most widely used protocol for media streaming, it’s a safe bet for the majority of broadcasts. Anyone streaming to connected devices should at least consider it — especially when broadcasting live events and sports, where quality is key. Latency is worth considering, but as support is implemented for Apple Low-Latency HLS, sub-three-second delivery should become more common. This will make it suitable for interactive streaming, online gambling, e-gaming, and more.

When streaming to mobile devices, HLS is a must-have. Just consider the role that iPhones play in the cellular landscape. Many smart TVs, set-top boxes, and players also default to HLS, so broadcasters trying to reach users in their living rooms should also look to HLS. And finally, for anyone still using RTMP for delivery to Flash, it’s time to make the switch.

That said, reaching the broadest audience possible starts with accommodating additional video formats. By transcoding the stream into a variety of formats, you can ensure video scalability no matter the device.

You can use Wowza Streaming Engine for transcoding using your own servers — whether they’re on premises or in a third-party cloud platform. Alternatively, Wowza Streaming Cloud might be a better fit for those who want to get up and running quickly.