What Is HLS (HTTP Live Streaming)? [Update]

May 7, 2020 by Traci Ruether

 

Flash finally reaches its end of life in 2020, and Apple's HTTP Live Streaming (HLS) protocol has become the preferred way to deliver streaming video in its place. HLS is an adaptive, HTTP-based protocol used for transporting video and audio data from media servers to viewers' screens. Regardless of whether you're watching live video via an app on your phone or on-demand content on your TV, chances are that HLS streaming is involved. This is especially likely if you're using an Apple device.

Videos delivered using HTTP are not technically "streams." Rather, they're progressive downloads sent via regular web servers. The industry shifted in favor of HTTP-based delivery because these protocols provide the best video quality and viewer experience possible, no matter the connection, software, or device. MPEG-DASH and Apple's HLS are the two most common HTTP-based protocols.

In our 2019 Video Streaming Latency Report, more than 45% of participants indicated that they use the HLS protocol for content distribution. The popularity of HLS over its alternatives can be chalked up to playback compatibility and quality of experience: Mac, iOS, Android, Windows, and Linux devices can all play streams delivered using HLS.

 

Apple HLS Snapshot

Summary:
  • HLS, which stands for HTTP Live Streaming, is a protocol developed for delivering live and on-demand streaming content that leverages HTTP technology for scalability and adaptive bitrate streaming.

Features:
  • Closed captions
  • Fast forward and rewind
  • Alternate audio and video
  • Fallback alternatives
  • Timed metadata
  • Ad insertion
  • Content protection

 

Apple HLS Technical Specifications

  • Audio Codecs: AAC-LC, HE-AAC+ v1 & v2, xHE-AAC, Apple Lossless, FLAC
  • Video Codecs: H.265, H.264
  • Playback Compatibility: Great (all Google Chrome browsers; Android, Linux, Windows, and macOS devices; several set-top boxes, smart TVs, and other players)
  • Benefits: Adaptive bitrate, reliable, and widely supported.
  • Drawbacks: Quality of experience is prioritized over low latency.
  • Latency: While HLS traditionally delivered latencies of 6-30 seconds, the Low-Latency HLS extension has now been incorporated as a feature set of HLS, promising to deliver sub-2-second latency.

 

HLS Adoption

HLS is the most common streaming protocol in use today. According to our 2019 Video Streaming Latency Report, more than 45% of broadcasters use it. As indicated below, RTMP is also a popular choice, but most broadcasters repackage RTMP streams into the HLS protocol once they reach the streaming server. This helps ensure that the stream will play across a range of devices without requiring viewers to download any plug-ins.

Streaming Protocols Currently in Use

HLS Scalability

Unlike the RTMP protocol used in conjunction with Flash player, HLS can easily scale for delivery using ordinary web servers across a global content delivery network (CDN). By sharing the workload across a network of servers, CDNs accommodate viral viewership spikes and larger-than-expected live audiences. CDNs also help improve viewer experience by caching audio and video segments.
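
Because each audio or video segment is just a small file retrieved with an ordinary HTTP GET, any CDN edge server can cache it the same way it caches static assets. As a rough illustration (the hostname, segment name, and header values here are hypothetical), a single segment request and response might look like this:

  GET /live/stream_720p/segment1042.ts HTTP/1.1
  Host: cdn.example.com

  HTTP/1.1 200 OK
  Content-Type: video/mp2t
  Cache-Control: max-age=6

Since the response is a plain, short-lived file, edge servers can answer repeat requests for it without going back to the origin.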

By comparison, CDN support for RTMP is rapidly declining. RTMP also requires the use of a dedicated streaming server, making it more resource-heavy to deploy.

 

Adaptive Bitrate Streaming With HLS

To deliver the highest-quality stream possible to everyone watching — including those with small screens and poor connections — HLS streaming dynamically adapts the resolution to each individual’s circumstances. Called adaptive bitrate streaming, this allows broadcasters to deliver high-quality streams to users with outstanding bandwidth and processing power, while also accommodating those lacking in the speed and power departments.

Rather than creating one live stream at one bitrate, a transcoder (usually located on the media server) creates multiple streams at different bitrates and resolutions. The player then requests the highest-quality rendition that each viewer's screen and connection speed can handle.

Creating multiple renditions of a single stream helps prevent buffering or stream interruptions. Plus, as a viewer’s signal strength goes from two bars to three, the stream dynamically adjusts to deliver a superior rendition.

Adaptive Bitrate Streaming to Multiple Devices
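
The switching logic lives in the player: it measures how quickly recent segments downloaded and picks a rendition accordingly. Here is a minimal sketch of that decision in TypeScript, assuming a simple three-rung bitrate ladder (the names, numbers, and 0.8 safety factor are illustrative, not Apple's or Wowza's implementation):

  // Illustrative only: a simplified version of the rendition choice an HLS player
  // makes. Real players also weigh buffer level, screen size, and codec support.
  interface Variant {
    bandwidth: number;   // peak bits per second declared in the master playlist
    resolution: string;
    uri: string;
  }

  const ladder: Variant[] = [
    { bandwidth: 365_000,   resolution: "640x360",  uri: "low/index.m3u8" },
    { bandwidth: 2_000_000, resolution: "960x540",  uri: "mid/index.m3u8" },
    { bandwidth: 4_500_000, resolution: "1280x720", uri: "high/index.m3u8" },
  ];

  // Pick the highest-bandwidth rendition that fits within ~80% of the measured
  // throughput; fall back to the lowest rung on very poor connections.
  function pickVariant(measuredBps: number, variants: Variant[]): Variant {
    const sorted = [...variants].sort((a, b) => a.bandwidth - b.bandwidth);
    const affordable = sorted.filter((v) => v.bandwidth <= measuredBps * 0.8);
    return affordable.length > 0 ? affordable[affordable.length - 1] : sorted[0];
  }

  console.log(pickVariant(3_000_000, ladder).resolution); // -> "960x540"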

 

How HLS Works

 

HLS Container Format

Unlike most HTTP-based protocols, which use the MPEG-4 Part 14 (MP4) container format, HLS initially specified the use of MPEG-2 Transport Stream (TS) containers. This changed in 2016, when Apple announced support for the fragmented MP4 (fMP4) format. Today, fMP4 is the preferred format for all HTTP-based streaming, including MPEG-DASH and Microsoft Smooth Streaming. These files typically contain H.264/AVC-encoded video and AAC-encoded audio.

 

HLS Segmented Delivery

As described above, video streams delivered via HLS are broken up into segments of data (also called chunks or packets) at the media server rather than being delivered as a continuous flow of information. Segmented delivery allows the player to shift between different renditions depending on available resources, while also driving down latency.
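
For illustration, the media playlist for a single live rendition is just a rolling list of those segments. The segment names and durations below are hypothetical; note that a live playlist has no #EXT-X-ENDLIST tag because new segments keep being appended:

  #EXTM3U
  #EXT-X-VERSION:3
  #EXT-X-TARGETDURATION:6
  #EXT-X-MEDIA-SEQUENCE:1042
  #EXTINF:6.000,
  segment1042.ts
  #EXTINF:6.000,
  segment1043.ts
  #EXTINF:6.000,
  segment1044.ts

The player re-downloads this playlist every few seconds, fetches any new segments it lists, and lets older ones fall out of its buffer.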

 

HLS Encoding Requirements

Apple provides the encoding targets below as an example of a typical set of bit rate variants for HLS streaming.

 

Resolution (16:9)     Video Bit Rate (kbps)     Frame Rate
416 x 234             145                       ≤30 fps
640 x 360             365                       ≤30 fps
768 x 432             730                       ≤30 fps
768 x 432             1,100                     ≤30 fps
960 x 540             2,000                     Same as source
1280 x 720            3,000                     Same as source
1280 x 720            4,500                     Same as source
1920 x 1080           6,000                     Same as source
1920 x 1080           7,800                     Same as source
HLS Encoding Targets

 

.M3U8 Manifest File

HLS video segments are indexed into a media playlist so that the video player understands how to organize the data. A master .m3u8 playlist file must also be created — think of this as the index of indexes — to instruct the player on how to jump between the variant-specific playlists. This is also referred to as the manifest file. Anyone delivering the stream can then distribute the content by embedding the .m3u8 reference URL in a web page or creating an application that downloads the file.
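
As a sketch (the paths and numbers are hypothetical, loosely matching the encoding ladder above), a master playlist simply lists each rendition's bandwidth, resolution, and variant playlist location:

  #EXTM3U
  #EXT-X-STREAM-INF:BANDWIDTH=365000,RESOLUTION=640x360
  low/index.m3u8
  #EXT-X-STREAM-INF:BANDWIDTH=2000000,RESOLUTION=960x540
  mid/index.m3u8
  #EXT-X-STREAM-INF:BANDWIDTH=4500000,RESOLUTION=1280x720
  high/index.m3u8

The player reads this file first and then switches between the variant playlists it points to as network conditions change.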

HLS Streaming Workflow

Source: Apple

 

Segment Size and Latency

Up until 2016, Apple recommended using ten-second segments for HLS. The specification also required three segments to be loaded before playback could begin. By sticking with the ten-second recommendation, broadcasters would start out with 30 seconds of latency based on segment size alone. Apple eventually decreased the default segment size to six seconds, but three six-second segments still add up to roughly 18 seconds, meaning that a 'live' stream might lag almost 20 seconds behind.

A popular way to decrease the lag has been to reduce the size of segments, called ‘tuning’ HLS for low latency. Shorter chunks enable faster download times, thereby speeding things up. But that’s not the only route to speedy streaming with HLS. In 2019, Apple announced the specs for an extension called Apple Low-Latency HLS. And more recently, this extension has been incorporated into the overarching HLS standard as a feature set.

 

Apple Low-Latency HLS

Apple designed the Low-Latency HLS extension to drive latency down at scale. While the protocol originally relied on HTTP/2 PUSH delivery, this requirement has since been removed. Additionally, the Internet Engineering Task Force (IETF) recently incorporated the Low-Latency HLS extension into traditional HLS as a feature set. The significance of this is twofold: it further standardizes the new technology and puts pressure on technology providers to add support. 

  • Playback Compatibility: Any players that aren’t optimized for Low-Latency HLS can fall back to standard (higher-latency) HLS behavior
  • Benefits: Low latency meets HTTP-based streaming
  • Drawbacks: As an emerging spec, vendors are still implementing support
  • Latency: 2 seconds or less
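
The sub-two-second figure is possible because Low-Latency HLS lets the server advertise partial segments ("parts") before a full segment is finished, and lets the player ask the server to hold a playlist request open until new data exists. As a hedged sketch (tag values and file names are hypothetical, based on the published Low-Latency HLS tags), an excerpt of a low-latency media playlist might include:

  #EXT-X-SERVER-CONTROL:CAN-BLOCK-RELOAD=YES,PART-HOLD-BACK=1.0
  #EXT-X-PART-INF:PART-TARGET=0.333
  #EXTINF:6.000,
  segment1042.ts
  #EXT-X-PART:DURATION=0.333,URI="segment1043.0.ts"
  #EXT-X-PART:DURATION=0.333,URI="segment1043.1.ts"
  #EXT-X-PRELOAD-HINT:TYPE=PART,URI="segment1043.2.ts"

Players that understand these tags can start playback from the newest part instead of waiting for whole segments; players that don't simply ignore them and fall back to standard HLS behavior, as noted above.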

Do note, Apple Low-Latency HLS is not to be confused with the open-source Low-Latency HLS solution (LHLS) that Periscope made. The main difference between the two is the delivery method. Unlike Apple’s extension, Periscope’s version uses chunked transfer encoding. The video developer community has abandoned this open-source alternative in favor of Apple’s standard.

We added support for Low-Latency HLS in Wowza Streaming Engine at the end of last year. As an early adopter, Wowza continues to develop against this emerging technology, and we’re working to extend support across our entire product portfolio.

 

HLS Hardware and Software

For instructions on how to configure your HLS stream, check out Apple’s recommendations.

Most content distributors send their video to a streaming server using a first-mile protocol such as RTMP, WebRTC, or SRT, and then transcode it into HLS once it reaches the server. It's also wise to deliver video in additional formats (such as MPEG-DASH) to make sure that viewers across a broad range of devices can view the content. A cloud-based service like Wowza Streaming Cloud™ or streaming server software like Wowza Streaming Engine™ can be used for live transcoding and more.

 

HTML5 Player

Lastly, viewers will need either a compatible device or an HTML5 player. HLS has become the de facto standard in the wake of Adobe Flash’s decline, meaning that most devices and browsers already have this functionality built in.
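
Safari and most mobile devices play HLS natively, while other browsers typically rely on an HTML5 player built on Media Source Extensions. A minimal sketch using the open-source hls.js library (the stream URL is a placeholder, and hls.js is only one of several player options) looks like this:

  import Hls from "hls.js";

  const video = document.querySelector("video") as HTMLVideoElement;
  const src = "https://example.com/live/stream.m3u8"; // placeholder URL

  if (video.canPlayType("application/vnd.apple.mpegurl")) {
    // Safari, iOS, and other native HLS players can use the manifest URL directly.
    video.src = src;
  } else if (Hls.isSupported()) {
    // Everywhere else, hls.js fetches the playlists and segments and feeds
    // them to the <video> element through Media Source Extensions.
    const hls = new Hls();
    hls.loadSource(src);
    hls.attachMedia(video);
  }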

 

When Not to Use HLS

Any use case requiring sub-one-second delivery — such as web conferencing, real-time device control for cameras and drones, or situational awareness — requires a protocol like WebRTC (Web Real-Time Communications). Even Apple Low-Latency HLS comes with an inherent delay that’s unacceptable for these scenarios.

 

When to Use HLS

Because HLS is currently the most widely used protocol for media streaming, it’s a safe bet for the majority of broadcasts. Anyone streaming to connected devices should at least consider it — especially when broadcasting live events and sports, where quality is key. Latency is worth considering, but as support is implemented for the Apple Low-Latency HLS feature set, sub-two-second delivery should become more common. This will make it suitable for interactive streaming, online gambling, e-gaming, and more.

When streaming to mobile devices, HLS is a must-have. Just consider the role that iPhones play in the cellular landscape. Many smart TVs, set-top boxes, and players also default to HLS, so broadcasters trying to reach users in their living rooms should also look to HLS. And finally, for anyone still using RTMP for delivery to Flash, it’s time to make the switch.

That said, reaching the broadest audience possible starts with accommodating additional video formats. By transcoding the stream into a variety of formats, you can ensure video scalability no matter the device.

You can use Wowza Streaming Engine for transcoding using your own servers — whether they’re on premises or in a third-party cloud platform. Alternatively, Wowza Streaming Cloud might be a better fit for those who want to get up and running quickly.

 

About Traci Ruether

As a Colorado-based B2B tech writer, Traci Ruether serves as Wowza's content marketing manager. Her background is in streaming and content delivery. In addition to writing, Traci enjoys cooking, reading, gardening, and spending quality time with her fur babies.