The Complete Guide to Live Streaming:
Packaging and Protocols

[Image: The live streaming process, visualized as residential garbage removal]

Once compressed, video streams must be packaged for delivery. The distinction between compression and packaging is subtle but relevant. For the sake of explanation, let’s picture the process in terms of residential garbage removal.

 

We start with the raw video data, which must be compressed down for delivery across the internet. An encoder allows us to do so by compressing gigabytes into megabytes. Think of the encoder as a household trash compactor and the codecs as the bags of compressed trash.

 

In order to actually transport the compressed trash (audio and video codecs) to the dump (viewer), another step is required. It’s crucial to place the bag of trash, as well as any other odds and ends (such as metadata), into a curbside trash receptacle. File container formats can be thought of in terms of these receptacles. They act as a wrapper for all streaming data so that it’s primed for delivery.

 

Finally, the contents of the trash bin are transported to the dump via an established route. Think of protocols as the established routes that garbage trucks take.

 

OK. Enough trash talk. Time to define each term for real.

What Is a Video Container Format?



Video container formats, also called wrappers, hold all the components of a compressed stream. This could include the audio codec, video codec, closed captioning, and any associated metadata such as subtitles or preview images. Common containers include .mp4, .mov, .ts, and .wmv.
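If you want to peek inside a container yourself, one quick way is to ask ffprobe (which ships with FFmpeg) to list everything the wrapper holds. The sketch below assumes ffprobe is installed and on your PATH; the file name stream.mp4 is just a placeholder for whatever file you have on hand.

```python
import json
import subprocess

# Ask ffprobe to describe everything the container wraps: video and audio
# streams, subtitle/caption tracks, and format-level metadata.
result = subprocess.run(
    ["ffprobe", "-v", "quiet", "-print_format", "json",
     "-show_format", "-show_streams", "stream.mp4"],  # placeholder file name
    capture_output=True, text=True, check=True,
)
info = json.loads(result.stdout)

print("Container:", info["format"]["format_name"])
for stream in info["streams"]:
    print(f"  {stream['codec_type']}: {stream.get('codec_name', 'unknown')}")
```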

What Is a Streaming Protocol?

 

A protocol is a set of rules governing how data travels from one device to another. For instance, the Hypertext Transfer Protocol (HTTP) deals with hypertext documents and webpages. Online video delivery uses both streaming protocols and HTTP-based protocols. Streaming protocols like Real-Time Messaging Protocol (RTMP) offer fast video delivery, whereas HTTP-based protocols can help optimize the viewing experience.

 

The protocol used can add up to 45 seconds of streaming latency.

Traditional Stateful Streaming Protocols

 

In the early days, traditional protocols such as RTSP (Real-Time Streaming Protocol) and RTMP (Real-Time Messaging Protocol) were the go-to methods for streaming video over the internet and playing it back on home devices. These protocols are stateful, which means they require a dedicated streaming server.

 

While RTSP and RTMP support lightning-fast video delivery, they aren’t optimized for great viewing experiences at scale. Additionally, fewer players support these protocols than ever before. Many broadcasters choose to transport live streams to their media server using a stateful protocol like RTMP and then transcode them for multi-device delivery.
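As a rough sketch of that hand-off, here’s one common way to do it with FFmpeg: pull the RTMP feed and repackage it into HLS segments. The RTMP URL and output path below are placeholders, and this example only remuxes the stream without re-encoding; a real workflow would typically transcode it into multiple renditions, which the next section covers.

```python
import subprocess

# Pull a live RTMP feed and repackage it as HLS segments that any ordinary
# web server or CDN can deliver to viewers. URL and paths are placeholders.
subprocess.run([
    "ffmpeg",
    "-i", "rtmp://media.example.com/live/streamkey",  # stateful ingest (assumed URL)
    "-c:v", "copy", "-c:a", "copy",                   # repackage without re-encoding
    "-f", "hls",
    "-hls_time", "6",                                 # ~6-second segments
    "-hls_list_size", "5",                            # rolling live playlist
    "/var/www/live/index.m3u8",
], check=True)
```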

 

RTMP and RTSP keep latency at around 5 seconds or less.

HTTP-Based Adaptive Streaming Protocols

 

The industry eventually shifted in favor of HTTP-based technologies. Streams deployed over HTTP are not technically “streams.” Rather, they’re sequences of small file downloads (segments) sent via regular web servers.

 

Using adaptive bitrate streaming, HTTP-based protocols deliver the best video quality and viewer experience possible — no matter the connection, software, or device. Some of the most common HTTP-based protocols include MPEG-DASH and Apple’s HLS.
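To see what adaptive bitrate looks like in practice, you can fetch an HLS master playlist and list the renditions a player can switch between. This is only a sketch: the URL below is a placeholder and the parsing is deliberately minimal.

```python
import re
import urllib.request

# Each #EXT-X-STREAM-INF line in an HLS master playlist advertises one
# bitrate/resolution option the player can adaptively switch to.
URL = "https://cdn.example.com/live/master.m3u8"  # placeholder URL

with urllib.request.urlopen(URL) as resp:
    playlist = resp.read().decode("utf-8")

for line in playlist.splitlines():
    if line.startswith("#EXT-X-STREAM-INF"):
        bandwidth = re.search(r"BANDWIDTH=(\d+)", line)
        resolution = re.search(r"RESOLUTION=(\d+x\d+)", line)
        mbps = int(bandwidth.group(1)) / 1_000_000 if bandwidth else 0.0
        print(f"{mbps:.1f} Mbps", resolution.group(1) if resolution else "")
```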

 

HTTP-based protocols are stateless, meaning they can be delivered using a regular old web server. That said, they fall on the high end of the latency spectrum.
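To make the “regular old web server” point concrete, here’s a bare-bones sketch that serves an HLS playlist and its segments with Python’s built-in http.server. It’s fine for a local demo, not a stand-in for a CDN.

```python
from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer

# Because HTTP-based streaming is just file downloads, a plain static file
# server can host the playlist and segments. Register the streaming MIME
# types so players recognize them; run this from the directory that holds
# the playlist and segment files.
SimpleHTTPRequestHandler.extensions_map.update({
    ".m3u8": "application/vnd.apple.mpegurl",
    ".ts": "video/mp2t",
    ".m4s": "video/iso.segment",
})

ThreadingHTTPServer(("", 8080), SimpleHTTPRequestHandler).serve_forever()
```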

 

HTTP-based protocols typically introduce 10-45 seconds of latency.
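Where does that range come from? A back-of-the-envelope calculation, using assumed but typical numbers, shows how segment length and player buffering add up.

```python
# Rough, illustrative arithmetic for segmented HTTP delivery (values assumed):
# players usually buffer a few complete segments before starting playback, so
# segment duration times buffer depth dominates glass-to-glass delay.
segment_seconds = 6        # common HLS/DASH segment duration
buffered_segments = 3      # typical player start-up buffer
encode_package_cdn = 5     # encoding, packaging, and CDN propagation (assumed)

latency = segment_seconds * buffered_segments + encode_package_cdn
print(f"~{latency} seconds of latency")  # ~23 seconds, well within the 10-45s range
```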



[Image: Lowest-latency protocols]

Emerging Protocols for Near-Real-Time Delivery

 

With an increasing number of videos being delivered live, industry leaders continue to improve streaming technology. Emerging technologies like WebRTC, SRT, WOWZ, and low-latency CMAF (which is a format rather than a protocol) support near-real-time delivery, even over poor connections.

Protocol | Benefits | Limitations
WebRTC | Real-time interactivity without a plugin. | Broadcast quality? Forget about it. Scalability? Only with help.
SRT | Smooth playback with minimal lag. | Additional capabilities are still being developed.
WOWZ | Sub-three-second global delivery. | Proprietary to Wowza Streaming Cloud.
Low-Latency CMAF | Streamlined workflows and decreased latency. | Adaptive bitrate capabilities are still a work in progress.

 

These new technology stacks promise to reduce latency to 3 seconds or less!

Video Packaging and Protocols for Every Workflow

 

Depending on how you set up your streaming workflow, you’re not limited to one protocol from capture to playback. Many broadcasters use RTMP to get from the encoder to the server and then transcode the stream into an adaptive HTTP-based format. The best protocol for your live stream depends entirely on your use case (a rough rule of thumb is sketched below). Find out how transcoding works in the next section.
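As a recap, here’s a toy helper that condenses the latency figures quoted in this guide and maps one coarse use-case question to a protocol family. It’s a sketch of the rules of thumb above, not a substitute for evaluating your own requirements.

```python
# A toy condensation of the rules of thumb in this guide. The latency figures
# are the ones quoted above; real protocol choice also hinges on player
# support, audience size, and existing infrastructure.
LATENCY_RULES_OF_THUMB = {
    "RTMP/RTSP (stateful)": "about 5 seconds or less",
    "HLS / MPEG-DASH (HTTP-based)": "10-45 seconds",
    "WebRTC / SRT / Low-Latency CMAF": "about 3 seconds or less",
}

def suggest_protocol(needs_realtime_interaction: bool) -> str:
    """Coarse rule of thumb: near-real-time use cases lean on emerging
    protocols; most other workflows ingest over RTMP and play back over
    an HTTP-based format like HLS or MPEG-DASH."""
    if needs_realtime_interaction:
        return "WebRTC / SRT / Low-Latency CMAF"
    return "RTMP ingest + HLS / MPEG-DASH playback"

for family, latency in LATENCY_RULES_OF_THUMB.items():
    print(f"{family}: {latency}")
print("Suggestion for an interactive stream:", suggest_protocol(True))
```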