What Is HLS (HTTP Live Streaming)? [Updated July 29, 2021]
Adobe Flash finally died last year, which is why Apple’s HTTP Live Streaming (HLS) protocol has become the preferred way to deliver streaming video to users. What is HLS and how does it work? Buckle up, you’re in for a long ride.
Table of Contents
- What Is HLS?
- Apple HLS Snapshot
- How Does HLS Work?
- HTTP Live Streaming Tools and Services
- Technical Overview of HLS Streaming
- Apple Low-Latency HLS
- HLS Alternatives
- When Not to Use HLS
- When to Use HLS
What Is HLS?
HLS is an adaptive HTTP-based format for transporting video and audio data from media servers to viewers’ screens. Regardless of whether you’re watching live streams via an app on your phone or binging on-demand content on your smart TV, the chances are that HLS streaming is involved. This is especially likely if you’re using an Apple device.
In our 2019 Video Streaming Latency Report, more than 45% of participants indicated that they use the HLS protocol for content distribution. The popularity of HLS over its alternatives can be chalked up to playback compatibility and quality of experience. This is because iOS, Android, macOS, Windows, and Linux devices can all play streams delivered using HLS.
With HLS, content distributors are able to ensure great viewing experiences across a wide range of devices, while relying on a content delivery network (CDN) for global delivery. Scale and quality have traditionally come at the sacrifice of speedy delivery, but that’s all changing with Apple’s release of Low-Latency HLS.
Apple HLS Snapshot
How Does HLS Work?
Typical RTMP to HLS Workflow
HLS video streams are broken up into segments of data (also called chunks or packets) rather than being delivered as a continuous flow of information. This departure from how streams were traditionally delivered enables a higher-quality stream to reach a greater number of viewers. That said, it also pushes latency higher, so most content distributors contribute their streams using the Real-Time Messaging Protocol (RTMP) and then repackage them for HLS delivery once they reach the media server.
Transcoding for Adaptive Bitrate Streaming
To deliver the highest-quality stream possible to everyone watching — including those with small screens and poor connections — HLS streaming dynamically adapts the resolution to each individual’s circumstances. Called adaptive bitrate streaming, this allows broadcasters to deliver high-quality streams to users with outstanding bandwidth and processing power, while also accommodating those lacking in the speed and power departments.
Rather than creating one live stream at one bitrate, a transcoder (usually located in the media server) is used to create multiple streams at different bitrates and resolutions. The server then sends the highest-resolution stream possible for each viewer’s screen and connection speed.
Creating multiple renditions of a single stream helps prevent buffering or stream interruptions. Plus, as a viewer’s signal strength goes from two bars to three, the stream dynamically adjusts to deliver a superior rendition.
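The switching logic behind adaptive bitrate streaming can be sketched in a few lines. The ladder values and the `pick_rendition` helper below are illustrative only; production players use smoothed bandwidth estimates and buffer-based heuristics rather than a single measurement:

```python
# Illustrative sketch of adaptive bitrate rendition selection.
# The ladder and safety factor are hypothetical examples, not any
# particular player's algorithm.

RENDITIONS = [
    (416, 234, 145),     # width, height, bitrate in kbps
    (640, 360, 365),
    (1280, 720, 3000),
    (1920, 1080, 6000),
]

def pick_rendition(measured_kbps, safety_factor=0.8):
    """Choose the highest rendition whose bitrate fits within a
    conservative fraction of the measured bandwidth."""
    budget = measured_kbps * safety_factor
    best = RENDITIONS[0]  # always fall back to the lowest rung
    for rendition in RENDITIONS:
        if rendition[2] <= budget:
            best = rendition
    return best

print(pick_rendition(4000))  # a 4 Mbps connection -> (1280, 720, 3000)
print(pick_rendition(100))   # a weak connection -> (416, 234, 145)
```

As the viewer's measured bandwidth rises or falls between segment downloads, the player simply re-runs a selection like this and requests the next segment from a different rendition.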
Delivery and Scaling
Unlike the RTMP protocol used in conjunction with Flash player, HLS can easily scale for delivery using ordinary web servers across a global content delivery network (CDN). By sharing the workload across a network of HTTP servers, CDNs accommodate viral viewership spikes and larger-than-expected live audiences. CDNs also help improve viewer experience by caching audio and video segments.
By comparison, RTMP requires the use of dedicated streaming servers, making it more resource-heavy to deploy.
HTTP Live Streaming Tools and Services
HLS Streaming Server
As stated above, most content distributors use a streaming server to ingest content delivered via RTMP, WebRTC, or SRT and then repackage the video using HLS once it reaches the server. It’s also wise to deliver video in additional formats (such as MPEG-DASH) to make sure that viewers across a broad range of devices can view the content. A cloud-based service like Wowza Streaming Cloud™ or streaming server software like Wowza Streaming Engine™ is essential for this conversion process, as well as for transcoding the stream for adaptive bitrate delivery.
Content Delivery Network (CDN)
By connecting servers across the globe, CDNs create superhighways that truncate the time it takes to deliver video streams from origin to end user. For anyone streaming to a large number of viewers or geographically distributed area, CDNs are critical to reliable content delivery. The Wowza Streaming Cloud service automatically leverages the Wowza CDN to scale live streams on demand. Additionally, it’s available as an add-on stream target for Wowza Streaming Engine subscriptions. Akamai, Fastly, and Microsoft Azure are also great options for HLS streaming — all of which are available as add-on stream targets using Wowza’s product portfolio.
Lastly, viewers will need either a compatible device or an HTML5 player. HLS has become the de facto standard in the wake of Adobe Flash’s decline, meaning that most devices and browsers already have this functionality built in. For our list of the best HTML5 video players, check out this blog.
Technical Overview of HLS Streaming
So you’ve had a basic overview of how it works, but what about the details? From encoding requirements to segment size, let’s dig in.
- Audio Codecs: AAC-LC, HE-AAC+ v1 & v2, xHE-AAC, Apple Lossless, FLAC
- Video Codecs: H.265, H.264
- Playback Compatibility: Great (all Google Chrome browsers; Android, Linux, Windows, and macOS devices; several set-top boxes, smart TVs, and other players)
- Benefits: Adaptive bitrate, reliable, and widely supported.
- Drawbacks: Quality of experience is prioritized over low latency.
- Latency: While HLS traditionally delivered latencies of 6-30 seconds, the Low-Latency HLS extension has now been incorporated as a feature set of HLS, promising to deliver sub-2-second latency.
Unlike most HTTP-based protocols, which use the MPEG-4 Part 14 (MP4) container format, HLS initially specified the use of MPEG-2 Transport Stream (TS) containers. This changed in 2016, when Apple announced support for the fragmented MP4 (fMP4) format. Today, fMP4 is the preferred format for all HTTP-based streaming (including MPEG-DASH and Microsoft Smooth Streaming). These video files typically contain AVC/H.264 encoded video and AAC encoded audio.
Apple provides the below encoding targets as an example of typical sets of bit rate variants when streaming with HLS. For greater detail on how to configure your HLS stream, check out Apple’s recommendations.
| 16:9 Aspect Ratio | H.264/AVC Bit Rate (kbps) | Frame Rate |
| --- | --- | --- |
| 416 x 234 | 145 | ≤30 fps |
| 640 x 360 | 365 | ≤30 fps |
| 768 x 432 | 730 | ≤30 fps |
| 768 x 432 | 1,100 | ≤30 fps |
| 960 x 540 | 2,000 | same as source |
| 1280 x 720 | 3,000 | same as source |
| 1280 x 720 | 4,500 | same as source |
| 1920 x 1080 | 6,000 | same as source |
| 1920 x 1080 | 7,800 | same as source |
.M3U8 Manifest File
HLS video segments are indexed into a media playlist so that the video player understands how to organize the data. A master .m3u8 playlist file must also be created — think of this as the index of indexes — to instruct the player on how to jump between the variant-specific playlists. This is also referred to as the manifest file. Anyone delivering the stream can then distribute the content by embedding the .m3u8 reference URL in a web page or creating an application that downloads the file.
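To make the structure concrete, a minimal master playlist might look like the following. The variant URIs, BANDWIDTH values, and codec strings here are illustrative examples, not output from any particular encoder:

```m3u8
#EXTM3U
#EXT-X-VERSION:4
#EXT-X-STREAM-INF:BANDWIDTH=365000,RESOLUTION=640x360,CODECS="avc1.4d401e,mp4a.40.2"
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2000000,RESOLUTION=960x540,CODECS="avc1.4d401f,mp4a.40.2"
mid/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=6000000,RESOLUTION=1920x1080,CODECS="avc1.640028,mp4a.40.2"
high/index.m3u8
```

Each #EXT-X-STREAM-INF entry advertises one rendition, and the URI beneath it points to that rendition's media playlist, which in turn lists the individual segments the player downloads.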
Segment Size and Latency
As described above, video streams delivered via HLS are broken up into segments of data at the media server. Segmented delivery allows the player to shift between different renditions depending on available resources, while also reducing buffering and other interruptions.
Up until 2016, Apple recommended using ten-second segments for HLS. The specification also required three segments to be loaded before playback could begin. By sticking with the ten-second recommendation, broadcasters would start out with 30 seconds of latency based on segment size alone. Apple eventually decreased the default segment size to six seconds, but that still meant that a ‘live’ stream might lag almost 20 seconds behind.
A popular way to decrease the lag has been to reduce the size of segments, called ‘tuning’ HLS for low latency. Shorter chunks enable faster download times, thereby speeding things up. But that’s not the only route to speedy streaming with HLS. In 2019, Apple announced the specs for an extension called Apple Low-Latency HLS. And more recently, this extension has been incorporated into the overarching HLS standard as a feature set.
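The startup-latency arithmetic above comes straight from the buffering rule: with three segments loaded before playback begins, latency from segmentation alone is at least the segment duration times three. A quick back-of-the-envelope sketch:

```python
# Back-of-the-envelope latency attributable to HLS segmenting alone:
# players have traditionally buffered three segments before starting
# playback, so shorter segments directly reduce startup lag.

def min_startup_latency(segment_seconds, segments_buffered=3):
    """Minimum latency (in seconds) contributed by segment buffering."""
    return segment_seconds * segments_buffered

print(min_startup_latency(10))  # classic 10-second segments -> 30
print(min_startup_latency(6))   # 6-second default -> 18
print(min_startup_latency(2))   # segments 'tuned' short -> 6
```

This is only the floor set by segmentation; encoding, first-mile contribution, and CDN propagation add their own delay on top.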
Apple Low-Latency HLS
Apple designed the Low-Latency HLS extension to drive latency down at scale. While the protocol originally relied on HTTP/2 PUSH delivery, this requirement has since been removed. Additionally, the Internet Engineering Task Force (IETF) recently incorporated the Low-Latency HLS extension into traditional HLS as a feature set. The significance of this is twofold: it further standardizes the new technology and puts pressure on technology providers to add support.
- Playback Compatibility: Any players that aren’t optimized for Low-Latency HLS can fall back to standard (higher-latency) HLS behavior
- Benefits: Low latency meets HTTP-based streaming
- Drawbacks: As an emerging spec, vendors are still implementing support
- Latency: 3 seconds or less
Do note, Apple Low-Latency HLS is not to be confused with the open-source Low-Latency HLS solution (LHLS) that Periscope made. The main difference between the two is the delivery method. Unlike Apple’s extension, Periscope’s version uses chunked transfer encoding. The video developer community has abandoned this open-source alternative in favor of Apple’s standard.
We added support for Low-Latency HLS in Wowza Streaming Engine at the end of last year. As an early adopter, Wowza continues to develop against this emerging technology, and we’re working to extend support across our entire product portfolio.
HLS Alternatives
The primary alternative protocols for last-mile delivery are MPEG-DASH and WebRTC. Whereas DASH is functionally quite similar to HLS but lacks support across Apple devices, WebRTC is a whole different animal for real-time delivery that wasn’t designed with scale in mind.
HLS vs. MPEG-DASH Comparison
- Proprietary vs. international: HLS is proprietary to Apple, whereas DASH is an open standard defined by MPEG.
- Playback compatibility: HLS is more widely supported than DASH due to the immense influence that Apple has on the industry at large.
- Codec requirements: Whereas HLS specifies the use of certain video codecs (H.264 and H.265) and audio codecs (detailed here), DASH is codec-agnostic. This could enable higher-quality broadcasts at lower bitrates when more advanced codecs are leveraged.
- Container format: HLS has traditionally used the MPEG-2 transport stream container format, or .ts (MPEG-TS), whereas DASH uses the MP4 format, or .mp4.
- Latency: Both protocols have traditionally lagged in terms of delivery speed, but new approaches seek to change this. For DASH, this takes the form of the Common Media Application Format (CMAF), whereas Apple now offers the Low-Latency HLS extension.
HLS vs. WebRTC Comparison
- Latency: WebRTC streams traverse the internet with sub-500-millisecond delivery, putting even Low-Latency HLS to shame.
- Proprietary vs. open-source: HLS is proprietary to Apple, whereas WebRTC is open-source and free.
- Playback compatibility: WebRTC doesn’t require additional plug-ins or software to function within most browsers, but HLS is more widely supported across mobile devices.
- Scalability: Scalability is the name of the game with HLS and the same can’t be said for WebRTC. Without a streaming platform like Wowza, WebRTC is limited to small chat-based environments.
- Quality: With WebRTC, real-time delivery is prioritized over quality.
When Not to Use HLS
Any use case requiring sub-one-second delivery — such as web conferencing, real-time device control for cameras and drones, or situational awareness — requires a protocol like WebRTC (Web Real-Time Communications). Even Apple Low-Latency HLS comes with an inherent delay that’s unacceptable for these scenarios.
When to Use HLS
Because HLS is currently the most widely used protocol for media streaming, it’s a safe bet for the majority of broadcasts. Anyone streaming to connected devices should at least consider it — especially when broadcasting live events and sports, where quality is key. Latency is worth considering, but as support is implemented for the Apple Low-Latency HLS feature set, sub-two-second delivery should become more common. This will make it suitable for interactive streaming, online gambling, e-gaming, and more.
When streaming to mobile devices, HLS is a must-have. Just consider the role that iPhones play in the cellular landscape. Many smart TVs, set-top boxes, and players also default to HLS, so broadcasters trying to reach users in their living rooms should also look to HLS. And finally, for anyone still using RTMP for delivery to Flash, it’s time to make the switch.
That said, reaching the broadest audience possible starts with accommodating additional video formats. By transcoding the stream into a variety of formats, you can ensure video scalability no matter the device.
You can use Wowza Streaming Cloud to transcode and deliver streams via our fully managed service. Alternatively, Wowza Streaming Engine might be a better fit for those who want to keep their streaming infrastructure in house.