Adaptive Bitrate Streaming: How It Works and Why It Matters (Update)
To computer scientists, video buffering refers to the act of preloading data to stream with fewer interruptions. To the rest of us, it’s mostly a punch line. In either case, nobody wants to be interrupted every few minutes while their video struggles to load. Thankfully, video technology has come a long way since the 90s, and it’s no longer necessary to wait half an hour for a fully loaded stream. Developers have envisioned new ways for players and streaming data to interact, leading to improved viewer experience and optimized quality.
Adaptive bitrate streaming (ABR) provides the best video quality and viewer experience possible – no matter the connection, software, or device. This dynamic streaming algorithm allows video playback devices to adjust to internet speed fluctuations, preventing the bitrate incongruity that leads to buffering. This technology continues to evolve as developers seek more efficient ways to deliver video streaming information.
This blog will discuss why ABR matters, how it works, and the various benefits of different ABR technology applications.
What Is Adaptive Bitrate Streaming (ABR)?
Let’s work backward, word for word. Of course, streaming is self-explanatory. You want to get your video streaming content into viewers’ hands.
Bitrate refers to the amount of data transferred per second as a video streams. This is not to be confused with “bandwidth,” which refers to the maximum amount of data that can be transmitted across a connection within a specific timeframe — although both are measured in megabits per second (Mbps). When the bitrate of a streaming video exceeds the bandwidth of the playback device’s internet connection, you get buffering. This data queue typically spells frustration and lost viewership.
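The core relationship is simple enough to state in a few lines of code. Here's a minimal sketch (function name and numbers are hypothetical, purely for illustration):

```python
# A stream buffers when its bitrate exceeds the connection's bandwidth.
# Both values are in megabits per second (Mbps).

def will_buffer(stream_bitrate_mbps: float, bandwidth_mbps: float) -> bool:
    """Return True when the player cannot download data as fast as it plays it."""
    return stream_bitrate_mbps > bandwidth_mbps

print(will_buffer(8.0, 5.0))  # an 8 Mbps stream on a 5 Mbps connection stalls -> True
print(will_buffer(3.0, 5.0))  # a 3 Mbps stream plays smoothly -> False
```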
For this next part, remember two key facts:
- internet bandwidth is not static, and
- the higher the bitrate, the higher the video quality.
Bandwidth changes when you travel, when poor weather interferes with your Wi-Fi, or just because. But when video data with a set bitrate streams to your device (also called progressive streaming), that bitrate isn’t changing. When your bandwidth drops, the video buffers. That or your video was already playing at a low bitrate, and you were viewing it at a lower quality than you could have been for at least some of that time.
That’s where the third and most crucial word comes in: adaptive. In adaptive bitrate streaming, files are chunked into segments and stored in a range of bitrates. As a video plays, the playback device requests different bitrates based on the current bandwidth. This effectively allows the bitrate to better sync with the bandwidth, allowing seamless streaming at the highest possible quality.
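At its heart, that adaptive choice is just "request the highest available bitrate that fits under the current bandwidth." A minimal sketch, with a hypothetical set of renditions:

```python
def pick_bitrate(available_kbps: list[int], bandwidth_kbps: float) -> int:
    """Choose the highest available bitrate at or below the measured bandwidth."""
    candidates = [b for b in sorted(available_kbps) if b <= bandwidth_kbps]
    return candidates[-1] if candidates else min(available_kbps)

renditions = [400, 800, 1600, 3200, 6000]  # hypothetical bitrates in kbps
print(pick_bitrate(renditions, 2500))  # -> 1600
print(pick_bitrate(renditions, 300))   # -> 400 (falls back to the lowest rung)
```

Real players layer buffer levels, throughput smoothing, and screen size on top of this, but the basic selection logic looks much like the above.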
So, what is adaptive bitrate streaming? It’s a process for packaging and delivering streaming data specifically designed to maximize quality without interruptions. The ABR algorithm is your best friend for providing a satisfying video streaming experience.
ABR vs. MBR
In defining ABR streaming, we should also explore what it’s not. Multi-bitrate streaming (MBR) sounds a lot like ABR in that it involves multiple available bitrates for playback devices to choose from. However, a playback device is locked in when it chooses one of these. In other words, it chooses what it thinks is best but can’t respond to sudden drops or increases in bandwidth. It turns out MBR lacks the “adaptive” part of ABR.
How Does Adaptive Bitrate Streaming Work?
ABR starts with your raw data and how it is prepped prior to streaming. Video is transcoded and segmented into chunks. After this, the process is up to the playback device, which requests these chunks of data according to what it can handle given the available bandwidth. Let’s look at these steps in more detail and some factors that might affect how they work.
Video Encoding and Transcoding
Encoding is the process by which raw video data is compressed and prepared for transport to a playback device. Transcoding is the process by which already compressed data is decoded, altered in some way, and re-encoded, often resulting in multiple versions of the original data. These changes are typically as follows:
- Transizing: Resizing the video frame – or resolution – to accommodate different screens.
- Transrating: Changing the bitrate of the decompressed file to accommodate different connection speeds. This can include changing the frame rate or the resolution.
Transrating and transizing overlap and are both critical to adaptive bitrate streaming. After all, a playback device can’t access data at a specific bitrate or resolution if it doesn’t exist in that form.
Great. You now have a collection of various sizes and resolutions for your video data that a playback device can access. But what if the playback device chooses incorrectly and the bitrate of the chosen file is not ideal for the available bandwidth?
This is where segmenting comes in. Also known as chunked encoding, or chunking, this is the process by which streaming data is separated into a series of non-overlapping segments before being sent to the playback device. Each chunk typically ranges in length from 2 to 10 seconds. By breaking up the data this way, it’s possible to adjust the size of data sent to a playback device mid-stream.
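Conceptually, segmenting just carves the video's timeline into fixed-length, non-overlapping windows. A sketch (the 6-second default and the sample duration are arbitrary illustrations):

```python
def segment_boundaries(duration_s: float, segment_s: float = 6.0):
    """Split a video's timeline into non-overlapping chunks of fixed length."""
    boundaries = []
    t = 0.0
    while t < duration_s:
        boundaries.append((t, min(t + segment_s, duration_s)))
        t += segment_s
    return boundaries

# A hypothetical 20-second clip cut into 6-second segments:
print(segment_boundaries(20.0))
# -> [(0.0, 6.0), (6.0, 12.0), (12.0, 18.0), (18.0, 20.0)]
```

Note the final segment is shorter than the rest; real packagers handle this trailing remainder the same way.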
Now that the data is fully prepared, the viewer’s playback device takes the wheel. Before streaming begins, the playback device downloads a manifest that describes all the available chunks and bitrates. This is a menu from which the playback device can begin streaming. Typically, a playback device will take it slow, selecting a bitrate it knows it can handle before adjusting.
After each segment, the playback device recalibrates and requests the next segment based on the new information. For example, the first segment was likely a much lower bitrate than necessary. The device will then request a higher bitrate for the next segment. If the bandwidth lessens or the device otherwise struggles to play a segment, then it will adjust downward when requesting the next one.
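This recalibration loop can be sketched with a simple smoothed throughput estimate. One common estimator is an exponentially weighted moving average of measured download speeds; the smoothing factor, ladder values, and throughput samples below are all hypothetical:

```python
def update_estimate(prev_kbps: float, measured_kbps: float, alpha: float = 0.3) -> float:
    """Blend the newest measured download speed into a smoothed bandwidth estimate."""
    return alpha * measured_kbps + (1 - alpha) * prev_kbps

ladder = [400, 800, 1600, 3200, 6000]       # available bitrates in kbps
estimate = 400.0                            # start conservatively at the lowest rung
for measured in [3000, 3500, 1200, 900]:    # hypothetical per-segment throughput
    estimate = update_estimate(estimate, measured)
    choice = max((b for b in ladder if b <= estimate), default=ladder[0])
    print(f"estimate {estimate:.0f} kbps -> request {choice} kbps segment")
```

The smoothing keeps the player from overreacting to a single fast or slow download, which is why the bitrate ramps up gradually and steps down when throughput sags.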
Encoding Ladders for Dynamic Adapting
The spectrum of transcoded files available to the playback device falls on what’s called an encoding ladder. At the top, a high-bitrate, high-framerate, high-resolution stream is output for viewers with the highest-tech setups. At the bottom of the ladder, the same video in low quality is available to viewers with smaller screens and poor bandwidth. This allows the player to smoothly shift between bitrates and resolutions as it constantly takes stock of available resources. The more nuanced your encoding ladder, the easier it will be for the playback device to optimize viewer experience.
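An encoding ladder can be modeled as a list of rungs, each pairing a bitrate with a resolution, and the player picks the highest rung that fits both its connection and its screen. All values here are hypothetical examples, not a recommended ladder:

```python
# Each rung pairs a bitrate with a resolution (hypothetical example values).
LADDER = [
    {"bitrate_kbps": 400,  "resolution": (640, 360)},
    {"bitrate_kbps": 1200, "resolution": (1280, 720)},
    {"bitrate_kbps": 3500, "resolution": (1920, 1080)},
    {"bitrate_kbps": 8000, "resolution": (3840, 2160)},
]

def best_rung(bandwidth_kbps: float, screen_height: int) -> dict:
    """Pick the highest rung that fits both the bandwidth and the screen."""
    fits = [r for r in LADDER
            if r["bitrate_kbps"] <= bandwidth_kbps
            and r["resolution"][1] <= screen_height]
    return fits[-1] if fits else LADDER[0]

print(best_rung(5000, 1080))  # plenty of bandwidth, 1080p screen -> the 1080p rung
print(best_rung(5000, 720))   # same bandwidth, smaller screen -> the 720p rung
```

Capping by screen height is why a phone on fast Wi-Fi doesn't waste bandwidth downloading a 4K rendition it can't display.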
Why We Need Adaptive Bitrate Streaming
In the early days, Real-Time Messaging Protocol (RTMP) was the go-to protocol for online streaming. Sending linear streams via RTMP enabled lightning-fast video delivery. These RTMP streams were encoded at a bitrate that was comfortably less than the bandwidth of target viewers. These were then delivered via a dedicated streaming server as a continuous stream of data.
The industry eventually shifted in favor of HTTP-based technologies. These “streams” were not technically streams. Rather, they were progressive downloads sent via web servers. This delivery method worked by downloading the video as you watched it. The content could be cached on local servers, but it didn’t optimize for screen size or connectivity.
Finally, adaptive bitrate streaming came into play. Content distribution encoded streams into many different bitrates and broke them into fragments. These multi-bitrate chunks would then be indexed in a manifest and delivered to a media player. The result? Very little buffering, fast start time, and a good experience for both high- and low-end connections.
Benefits of Adaptive Bitrate Streaming
- Maintain satisfied viewership
- Prevent lost revenue due to dropped viewership
- Reach a wider audience across devices and bandwidth availability
ABR and Protocols: Which Pairs Best?
When it comes to video streaming, protocols are a set of standards that determine how data is packaged and delivered. What protocol you use is determined in part by the devices to which you want to stream. ABR streaming works with all the most popular protocols.
Apple developed HTTP Live Streaming (HLS), and it once worked only with Apple devices. Now the protocol is device agnostic, making it a versatile choice for both live and OTT streaming. You need the H.264 or H.265 encoding formats to use this protocol.
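To make the manifest idea concrete, a skeletal HLS multivariant (master) playlist listing three rungs of an encoding ladder might look like this. The bitrates, resolutions, and file paths are made-up illustrations:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360,CODECS="avc1.64001e,mp4a.40.2"
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2400000,RESOLUTION=1280x720,CODECS="avc1.64001f,mp4a.40.2"
mid/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=6000000,RESOLUTION=1920x1080,CODECS="avc1.640028,mp4a.40.2"
high/index.m3u8
```

Each variant entry points to its own media playlist of segments, and the player switches between them as bandwidth changes.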
Sometimes referred to simply as DASH, Dynamic Adaptive Streaming over HTTP (MPEG-DASH) was developed by the Moving Picture Experts Group and was the first adaptive bitrate streaming protocol to become an international standard. MPEG-DASH does not require any specific encoding formats. However, you cannot use it natively with Apple devices.
Originally designed to work with the now discontinued Adobe Flash, HTTP Dynamic Streaming (HDS) can be set up for on-demand or live streaming. Like MPEG-DASH, it doesn’t work with Apple devices.
ABR can work with Web Real-Time Communication (WebRTC) in several ways. This can be done on the server side, where the server creates the various versions of the video data, or on the client side (a.k.a. simulcasting), where the client machine performs this task. WebRTC is well known for its ultra-low-latency streaming. However, without the help of a streaming service, it may not scale well to larger audiences. Browser support was once spotty, though Safari on iOS now supports it.
All the above protocols work with Hypertext Transfer Protocol (HTTP)-based streaming as opposed to Real-Time Messaging Protocol (RTMP). ABR was created for HTTP-based streaming. As such, it is much easier to include ABR in your HTTP-based streaming solution than in your RTMP-based one. It is technically possible to make the latter work, but it’s not recommended.
Adaptive Bitrate Streaming and Wowza
Choosing ABR for your streaming needs means choosing a protocol, transcoding your video data to build your encoding ladder, and segmenting that data into smaller chunks. Sound like a lot of work? It doesn’t have to be.
To optimize the viewing experience across various devices and connection speeds, you’ll need a transcoding solution like the Wowza Video platform or Wowza Streaming Engine software. Streaming solutions like these can take your raw video data and make it available to a wide audience.
Get started today with Wowza.
About Sydney Roy (Whalen)
Sydney works for Wowza as a content writer and Marketing Communications Specialist, leveraging roughly a decade of experience in copywriting, technical writing, and content development. When observed in the wild, she can be found gaming, reading, hiking, parenting, overspending at the Renaissance Festival, and leaving coffee cups around the house.