Streaming Protocols: Everything You Need to Know (Update)



When it comes to online video delivery, RTMP, HLS, MPEG-DASH, and WebRTC refer to the streaming protocols used to get content from point A to B. Because different protocols play a role in different types of broadcasts, it’s not always fair to say that one is better than another.

Differentiating between all the acronyms is challenging for anyone entering the video streaming space. Add in the fact that each protocol varies slightly in terms of compatibility, latency, security — the list goes on — and things get downright puzzling.

In this article, we spell out exactly what a protocol is, the ten most common video streaming protocols in 2022, and considerations when selecting the right technology for your use case.




What Is a Protocol?

A protocol is a set of rules governing how data travels from one communicating system to another. These are layered on top of one another to form a protocol stack. That way, protocols at each layer can focus on a specific function and cooperate with each other. The lowest layer acts as a foundation, and each layer above it adds complexity.

You’ve likely heard of an IP address; the “IP” stands for Internet Protocol, which structures how devices using the internet communicate. The Internet Protocol sits at the network layer. It’s typically overlaid by the Transmission Control Protocol (TCP) at the transport layer, as well as the Hypertext Transfer Protocol (HTTP) at the application layer.

Chart showing the protocols at each layer of the protocol stack as well as their data unit.

The seven layers — physical, data link, network, transport, session, presentation, and application — were defined by the International Organization for Standardization’s (ISO’s) Open Systems Interconnection (OSI) model, as depicted above.
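A compact way to internalize the stack above is to lay it out in code. This sketch pairs each OSI layer with a familiar example protocol and the name of its data unit; the example protocols are illustrative, not exhaustive:

```python
# Illustrative sketch of the OSI protocol stack, top to bottom.
# Each entry: (layer, example protocol, data unit at that layer).
OSI_STACK = [
    ("application",  "HTTP",     "data"),
    ("presentation", "-",        "data"),
    ("session",      "-",        "data"),
    ("transport",    "TCP",      "segment"),
    ("network",      "IP",       "packet"),
    ("data link",    "Ethernet", "frame"),
    ("physical",     "-",        "bits"),
]

for layer, protocol, unit in OSI_STACK:
    print(f"{layer:<12} {protocol:<8} {unit}")
```

Streaming protocols like HLS ride on the upper layers of this stack, while TCP and UDP (covered next) live at the transport layer.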


What Is a Streaming Protocol?

Each time you watch a live stream or video on demand, video streaming protocols are used to deliver data over the internet. These can sit in the application, presentation, and session layers.

Online video delivery uses both streaming protocols and HTTP-based protocols. Streaming protocols like Real-Time Messaging Protocol (RTMP) transport video using dedicated streaming servers, whereas HTTP-based protocols rely on regular web servers to optimize the viewing experience and quickly scale. Finally, emerging HTTP-based technologies like Apple’s Low-Latency HLS seek to deliver the best of both options by supporting low-latency streaming at scale.



UDP vs. TCP: A Quick Background

User Datagram Protocol (UDP) and Transmission Control Protocol (TCP) are both core components of the internet protocol suite, residing in the transport layer. The protocols used for streaming sit on top of these. UDP and TCP differ in terms of quality and speed, so it’s worth taking a closer look.

Chart comparing benefits of UDP vs TCP

The primary difference between UDP and TCP hinges on the fact that TCP requires a three-way handshake before transporting data. The initiator (client) asks the accepter (server) to start a connection, the accepter responds, and the initiator acknowledges the response, after which a session is maintained between both ends. For this reason, TCP is quite reliable and can correct for packet loss and out-of-order delivery. UDP, on the other hand, starts without any handshake. It transports data regardless of bandwidth constraints, making it speedier but riskier. Because UDP doesn’t support retransmissions, packet ordering, or error recovery, a network glitch can corrupt the data en route.
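To make the difference concrete, here’s a minimal Python sketch: the TCP pair must complete a connection before any bytes move, while the UDP pair simply fires a datagram at an address. The loopback addresses and payloads are arbitrary:

```python
import socket

# TCP: a connection (three-way handshake) must exist before data flows.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(("127.0.0.1", port))   # the handshake happens here
conn, _ = server.accept()
client.sendall(b"reliable")
tcp_msg = conn.recv(1024)             # TCP guarantees delivery and ordering

# UDP: no handshake -- datagrams are fired off immediately,
# with no delivery guarantee on a real network.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))
udp_port = receiver.getsockname()[1]

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"fast", ("127.0.0.1", udp_port))   # no connection setup at all
udp_msg, _ = receiver.recvfrom(1024)

print(tcp_msg, udp_msg)
for s in (client, conn, server, sender, receiver):
    s.close()
```

On loopback both messages arrive, but only the TCP path would detect and retransmit a lost packet on a real network.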

Protocols like Secure Reliable Transport (SRT) often use UDP, whereas protocols like HTTP Live Streaming (HLS) use TCP.


What Are the Most Common Protocols for Video Streaming?


Video Streaming Protocols Comparison in 2022


Traditional Video Streaming Protocols

Traditional streaming protocols, such as RTSP and RTMP, support low-latency streaming. But they aren’t natively supported on most endpoints (e.g., browsers, mobile devices, computers, and televisions). Today, these streaming formats work best for transporting video between an IP camera or encoder and a dedicated media server.

The Streaming Latency and Interactivity Continuum

As shown above, RTMP delivers video at roughly the same pace as a cable broadcast — in just over five seconds. RTSP/RTP is even quicker at around two seconds. Both formats achieve such speed by transmitting the data using a firehose approach rather than requiring local download or caching. But because very few players support RTMP and RTSP, they aren’t optimized for great viewing experiences at scale. Many broadcasters choose to transport live streams to the media server using a stateful protocol like RTMP. From there, they can transcode it into an HTTP-based technology for multi-device delivery.
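That RTMP-in, HTTP-out handoff is commonly scripted with a tool like FFmpeg. The sketch below only assembles the command line for an RTMP-to-HLS repackage; the ingest URL and output filename are hypothetical placeholders, and `-c copy` assumes the incoming codecs are already HLS-compatible:

```python
# Sketch: build an FFmpeg command that pulls an RTMP stream and repackages
# it as HLS. The URL and output path are hypothetical placeholders.
ingest_url = "rtmp://example.com/live/streamkey"

cmd = [
    "ffmpeg",
    "-i", ingest_url,        # read the incoming RTMP stream
    "-c", "copy",            # repackage without re-encoding (codecs permitting)
    "-f", "hls",             # output HTTP Live Streaming
    "-hls_time", "6",        # target segment duration in seconds
    "-hls_list_size", "5",   # keep a rolling window of 5 segments
    "playlist.m3u8",
]
print(" ".join(cmd))
# Pass `cmd` to subprocess.run(...) to actually launch the repackage.
```

In production this step usually runs on a media server or cloud platform rather than a hand-rolled script, but the flow is the same: stateful ingest in, HTTP segments out.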


Adobe RTMP

Adobe designed the RTMP specification at the dawn of streaming. The protocol could transport audio and video data between a dedicated streaming server and the Adobe Flash Player. Reliable and efficient, this worked great for live streaming. But open standards and adaptive bitrate streaming eventually edged RTMP out. The writing on the wall came when Adobe announced the death of Flash — which officially ended in 2020.

While Flash’s end-of-life date was overdue, the same cannot be said for using RTMP for video contribution. RTMP encoders are still a go-to for many content producers, even though the proprietary protocol has fallen out of favor for last-mile delivery.

In fact, in our 2021 Video Streaming Latency Report, more than 76% of content distributors indicated they use RTMP for ingest.

Which streaming formats are you currently using for ingest?

Graph comparing popularity of different ingest streaming protocols, including RTMP, RTSP, and WebRTC.
  • Video Codecs: H.264, VP8, VP6, Sorenson Spark®, Screen Video v1 & v2
  • Audio Codecs: AAC, AAC-LC, HE-AAC+ v1 & v2, MP3, Speex, Opus, Vorbis
  • Playback Compatibility: Not widely supported (Flash Player, Adobe AIR, RTMP-compatible players)
  • Benefits: Low-latency and requires no buffering
  • Drawbacks: Not optimized for quality of experience or scalability
  • Latency: 5 seconds
  • Variant Formats: RTMPT (tunneled through HTTP), RTMPE (encrypted), RTMPTE (tunneled and encrypted), RTMPS (encrypted over SSL), RTMFP (travels over UDP instead of TCP)



RTSP/RTP

Like RTMP, RTSP/RTP describes an old-school technology used for video contribution. RTSP and RTP are often used interchangeably. But to be clear: RTSP is an application-level control protocol that lets end users command media servers via pause and play capabilities, whereas RTP is the transport protocol used to move the data itself.

Android and iOS devices don’t have RTSP-compatible players out of the box, making this another protocol that’s rarely used for playback. That said, RTSP remains standard in many surveillance and closed-circuit television (CCTV) architectures. Why? The reason is simple. RTSP support is still ubiquitous in IP cameras.

A surveillance operator watching live footage from an IP camera.
  • Video Codecs: H.265 (preview), H.264, VP9, VP8
  • Audio Codecs: AAC, AAC-LC, HE-AAC+ v1 & v2, MP3, Speex, Opus, Vorbis
  • Playback Compatibility: Not widely supported (QuickTime Player and other RTSP/RTP-compliant players, VideoLAN VLC media player, 3GPP-compatible mobile devices)
  • Benefits: Low-latency and supported by most IP cameras
  • Drawbacks: No longer used for video delivery to end users
  • Latency: 2 seconds
  • Variant Formats: The entire stack of RTP, RTCP (Real-Time Control Protocol), and RTSP is often referred to as RTSP

Adaptive HTTP-Based Streaming Protocols

Streams deployed over HTTP are not technically “streams.” Rather, they’re progressive downloads sent via regular web servers. Using adaptive bitrate streaming, HTTP-based protocols deliver the best video quality and viewer experience possible — no matter the connection, software, or device. Some of the most common HTTP-based protocols include MPEG-DASH and Apple’s HLS.


Apple HLS

Since Apple is a major player in the world of internet-connected devices, it follows that Apple’s HLS protocol rules the digital video landscape. For one, the protocol supports adaptive bitrate streaming, which is key to viewer experience. More importantly, a stream delivered via HLS will play back on the majority of devices — thereby ensuring accessibility to a large audience.

HLS support was initially limited to iOS devices such as iPhones and iPads, but native support has since been added to a wide range of platforms. All Google Chrome browsers, as well as Android, Linux, Microsoft, and MacOS devices, can play streams delivered using HLS.



  • Video Codecs: H.265, H.264
  • Audio Codecs: AAC-LC, HE-AAC+ v1 & v2, xHE-AAC, Apple Lossless, FLAC
  • Playback Compatibility: Great (All Google Chrome browsers; Android, Linux, Microsoft, and MacOS devices; several set-top boxes, smart TVs, and other players)
  • Benefits: Adaptive bitrate and widely supported
  • Drawbacks: Quality of experience is prioritized over low latency
  • Latency: 6-30 seconds (lower latency only possible when tuned)
  • Variant Formats: Low-Latency HLS (see below), PHLS (Protected HTTP Live Streaming)
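Adaptive bitrate delivery in HLS is driven by the master playlist, which advertises each rendition and its bandwidth. Here’s a minimal sketch of how a player might parse one and pick a rendition; the playlist contents and URIs are made up for illustration:

```python
# A made-up master playlist with three renditions for illustration.
MASTER_PLAYLIST = """\
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
mid/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
high/index.m3u8
"""

def parse_variants(playlist: str):
    """Return (bandwidth, uri) pairs from a master playlist."""
    variants, bandwidth = [], None
    for line in playlist.splitlines():
        if line.startswith("#EXT-X-STREAM-INF:"):
            for attr in line.split(":", 1)[1].split(","):
                if attr.startswith("BANDWIDTH="):
                    bandwidth = int(attr.split("=")[1])
        elif line and not line.startswith("#"):
            variants.append((bandwidth, line))
    return variants

def pick_variant(variants, throughput_bps):
    """Choose the highest-bandwidth rendition the connection can sustain."""
    fitting = [v for v in variants if v[0] <= throughput_bps]
    return max(fitting) if fitting else min(variants)

variants = parse_variants(MASTER_PLAYLIST)
print(pick_variant(variants, 3_000_000))  # -> (2500000, 'mid/index.m3u8')
```

A real player re-measures throughput continuously and switches renditions segment by segment, which is what makes the playback adaptive.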

Low-Latency HLS

Low-Latency HLS (LL-HLS) is the latest and greatest technology when it comes to low-latency streaming. The proprietary protocol promises to deliver sub-three-second streams globally. It also offers backward compatibility to existing clients.

In other words, it’s designed to deliver the same simplicity, scalability, and quality as HLS — while significantly shrinking the latency. At Wowza, we call this combination the streaming trifecta.

Even so, successful deployments of Low-Latency HLS require integration from vendors across the video delivery ecosystem. Support is still lacking, and large-scale deployments of Low-Latency HLS are few and far between.

  • Playback Compatibility: Any players that aren’t optimized for Low-Latency HLS can fall back to standard (higher-latency) HLS behavior
    • HLS-compatible devices include MacOS, Microsoft, Android, and Linux devices; all Google Chrome browsers; several set-top boxes, smart TVs, and other players
  • Benefits: Low latency, scalability, and high quality… Oh, and did we mention backward compatibility?
  • Drawbacks: As an emerging spec, vendors are still implementing support
  • Latency: 2 seconds or less



MPEG-DASH

MPEG-DASH is a vendor-independent alternative to HLS. Basically, with DASH you get a non-proprietary option that delivers comparable scalability and quality. But because Apple tends to prioritize its own tech stack, DASH support plays second fiddle on the slew of Apple devices out there.

A video of a car appearing on a tablet and smart phone.
  • Video Codecs: Codec-agnostic
  • Audio Codecs: Codec-agnostic
  • Playback Compatibility: Good (All Android devices; most post-2012 Samsung, Philips, Panasonic, and Sony TVs; Chrome, Safari, and Firefox browsers)
  • Benefits: Vendor-independent, international standard for adaptive bitrate
  • Drawbacks: Not supported by iOS or Apple TV
  • Latency: 6-30 seconds (lower latency only possible when tuned)
  • Variant Formats: MPEG-DASH CENC (Common Encryption)
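DASH describes its renditions in an XML manifest (the MPD) rather than an M3U8 playlist. Here’s a minimal sketch of reading one with Python’s standard library; the manifest below is a stripped-down, made-up example:

```python
import xml.etree.ElementTree as ET

# A stripped-down, made-up MPD manifest for illustration.
MPD = """\
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="static">
  <Period>
    <AdaptationSet mimeType="video/mp4">
      <Representation id="360p"  bandwidth="800000"/>
      <Representation id="720p"  bandwidth="2500000"/>
      <Representation id="1080p" bandwidth="5000000"/>
    </AdaptationSet>
  </Period>
</MPD>
"""

NS = {"dash": "urn:mpeg:dash:schema:mpd:2011"}
root = ET.fromstring(MPD)

# Collect every representation's id and declared bandwidth.
representations = [
    (rep.get("id"), int(rep.get("bandwidth")))
    for rep in root.findall(".//dash:Representation", NS)
]
print(representations)
```

The selection logic from there mirrors HLS: the player measures throughput and requests segments from whichever representation fits.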


Low-Latency CMAF for DASH

Low-latency CMAF for DASH is another emerging technology for speeding up HTTP-based video delivery. Although it’s still in its infancy, the technology shows promise in delivering superfast video at scale by using shorter data segments. That said, many vendors have prioritized support for Low-Latency HLS over that of low-latency CMAF for DASH.

  • Playback Compatibility: Any players that aren’t optimized for low-latency CMAF for DASH can fall back to standard (higher-latency) DASH behavior
  • Benefits: Low latency meets HTTP-based streaming
  • Drawbacks: As an emerging spec, vendors are still implementing support
  • Latency: 3 seconds or less


Microsoft Smooth Streaming

Microsoft developed Microsoft Smooth Streaming in 2008 for use with Silverlight player applications. It enables adaptive delivery to all Microsoft devices. The protocol can’t compete with other HTTP-based formats and is falling out of use. In fact, in our 2021 Video Streaming Latency Report, only 5 percent of respondents were using Smooth Streaming.

Which streaming formats are you currently using?

A graph comparing the use of different streaming protocols for last-mile delivery and playback, with HLS leading the way, followed by MPEG-DASH and WebRTC.
  • Video Codecs: H.264, VC-1
  • Audio Codecs: AAC, MP3, WMA
  • Playback Compatibility: Good (Microsoft and iOS devices, Xbox, many smart TVs, Silverlight player-enabled browsers)
  • Benefits: Adaptive bitrate and supported by iOS
  • Drawbacks: Proprietary technology, doesn’t compete with HLS and DASH
  • Latency: 6-30 seconds (lower latency only possible when tuned)


Adobe HDS

HDS was developed for use with Flash Player as one of the first adaptive bitrate protocols. With Flash gone, it’s slowly dying as well. Don’t believe us? Just take a look at the graph above.

  • Video Codecs: H.264, VP6
  • Audio Codecs: AAC, MP3
  • Playback Compatibility: Not widely supported (Flash Player, Adobe AIR)
  • Benefits: Adaptive bitrate technology for Flash
  • Drawbacks: Proprietary technology with lacking support
  • Latency: 6-30 seconds (lower latency only possible when tuned)

New Technologies

Last but not least, new technologies like WebRTC and SRT promise to change the landscape. Similar to low-latency CMAF for DASH and Apple Low-Latency HLS, these protocols were designed with latency in mind.



SRT

This open-source protocol is recognized as a proven alternative to proprietary transport technologies, helping to deliver reliable streams regardless of network quality. It competes directly with RTMP and RTSP as a first-mile solution, but adoption is still underway as encoders, decoders, and players add support.

From recovering lost packets to preserving timing behavior, SRT was designed to solve the challenges of video contribution and distribution across the public internet. And it’s quickly taking the industry by storm. One interactive use case for which SRT proved instrumental was the 2020 virtual NFL draft, where the league used the technology to connect 600 live feeds for its first entirely virtual event.

  • Video Codecs: Codec-agnostic
  • Audio Codecs: Codec-agnostic
  • Playback Compatibility: Limited (VLC Media Player, FFPlay, Haivision Play Pro, Haivision Play, Larix Player, Brightcove)
  • Benefits: High-quality, low-latency video over suboptimal networks
  • Drawbacks: Not widely supported for video playback
  • Latency: 3 seconds or less, tunable based on how much latency you want to trade for packet loss
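SRT’s reliability rests on retransmitting lost UDP packets: the receiver tracks sequence numbers and asks the sender to resend any gaps. The toy simulation below illustrates that idea only; it is not actual SRT protocol logic:

```python
import random

random.seed(7)  # deterministic for the example

def lossy_send(packets, loss_rate=0.3):
    """Simulate a lossy network: drop each packet with the given probability."""
    return {seq: data for seq, data in packets if random.random() > loss_rate}

# Sender numbers each packet so the receiver can detect gaps.
outgoing = [(seq, f"frame-{seq}") for seq in range(10)]
received = lossy_send(outgoing)

# Receiver NAKs the missing sequence numbers; sender retransmits just those.
missing = [seq for seq in range(10) if seq not in received]
for seq, data in outgoing:
    if seq in missing:
        received[seq] = data  # retransmission (assumed to arrive this time)

# After recovery, the receiver restores the original packet order.
restored = [received[seq] for seq in sorted(received)]
print(missing, restored == [data for _, data in outgoing])
```

The real protocol bounds this recovery with a configurable latency window, which is why SRT lets you trade a little delay for resilience to packet loss.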



WebRTC

As the speediest technology available, WebRTC delivers near-instantaneous voice and video streaming to and from any major browser. It can also be used end to end, which means it competes with both ingest and delivery protocols. The framework was designed for chat-based applications, but it’s now finding its way into more diverse use cases.

Scalability remains a challenge with WebRTC, though, so you’ll need to use a solution like Wowza’s Real-Time Streaming at Scale feature to overcome this. The solution deploys WebRTC across a custom CDN to provide near-limitless scale. This allows broadcasters to reach a million viewers with sub-500 ms delivery — a once impossible feat.

Workflow: Real-Time Streaming at Scale for Wowza Video

A workflow showing how Wowza's Real-Time Streaming at Scale Feature ensures real-time delivery to 1,000,000 viewers.
  • Video Codecs: H.264, VP8, VP9
  • Audio Codecs: Opus, iSAC, iLBC
  • Playback Compatibility: Chrome, Firefox, and Safari support WebRTC without any plugin
  • Benefits: Super fast and browser-based
  • Drawbacks: Designed for video conferencing and not scale
  • Latency: Sub-500-millisecond delivery

Considerations When Choosing a Streaming Protocol

Selecting the right media streaming protocol starts with defining what you’re trying to achieve. Latency, playback compatibility, and viewing experience all hinge on this choice. What’s more, content distributors don’t always stick with the same protocol from capture to playback. Many broadcasters use RTMP to get from the encoder to the server and then transcode the stream into an adaptive HTTP-based format.

Typical RTMP to HLS workflow where a live stream is encoded into RTMP and repackaged into HLS for playback across a range of devices.

Media streaming protocols differ in the following areas:

  • First-mile contribution vs. last-mile delivery
  • Playback support
  • Encoder support
  • Scalability
  • Latency
  • Quality of experience (adaptive bitrate enabled, etc.)
  • Security

By prioritizing the above considerations, it’s easy to narrow down what’s best for you.


Contribution vs. Delivery

RTMP and SRT are great bets for first-mile contribution, while both DASH and HLS lead the way when it comes to playback. On the flip side, RTMP has fallen entirely out of favor for delivery, and HLS isn’t an ideal ingest format. That’s why most content distributors rely on a media server or cloud-based video platform to transcode their content from one protocol to another.


Playback Support

What’s the point of distributing a stream if viewers can’t access it? Lacking playback support is the reason RTMP no longer plays a role in delivery to end users. And ubiquitous playback support is the reason why HLS is the most popular protocol today.


Encoder Support

The inverse of playback support is encoder support. RTMP maintains a stronghold despite its many flaws due to the prevalence of RTMP encoders already out there. Similarly, RTSP has stayed relevant in the surveillance industry because it’s the protocol of choice for IP cameras.

WebRTC is unique in that it can be used for browser-based publishing and playback without any additional technologies, enabling simple streaming for use cases that don’t require production-quality encoders and cameras.



Scalability

HLS is synonymous with scalability. The widely supported HTTP-based protocol leverages web servers to reach any device worth reaching today. But what it delivers in scalability, it lacks in latency; the two have traditionally been at odds with one another. New technologies like Real-Time Streaming at Scale, however, resolve this polarity.



Latency

Low-Latency HLS, low-latency CMAF for DASH, and WebRTC were all designed with speedy delivery in mind. Anyone deploying interactive video environments should limit themselves to one of these three delivery protocols.



Quality of Experience

While real-time delivery is crucial to some niche video experiences, high-quality delivery has become table stakes. Viewers no longer marvel at smooth, high-resolution streams; they simply expect them. Your best bet for ensuring a top-notch user experience is a protocol that supports adaptive bitrate streaming, a core technology deployed by both HLS and DASH.



Security

Last but not least, you’ll want to consider content protection. Encryption and digital rights management (DRM) come standard with HLS and DASH. WebRTC, on the other hand, is secured by the browser, but it lacks DRM capabilities and standard out-of-the-box security measures for broadcasting workflows. So, you’ll want to keep this in mind when architecting your workflow.



What’s the best protocol for video streaming in 2022? That’s up to you.

The right video technologies vary depending on each business’s needs. Instead of looking for the best protocols, you should look for the video platform that provides the most flexibility.

Having a wide range of protocols to choose from is essential for anyone looking to reduce latency, increase playback compatibility, and even address the challenges of remote video contribution. What’s more, many video engineers today are building hybrid workflows that employ a mish-mash of protocols at ingest and delivery.

Video now powers businesses in every industry, which is why prescriptive streaming workflows and rigid technologies no longer do the trick. But you’re in luck. Wowza’s flexible video platform supports a wide range of protocols and can be customized to your needs.

Simply contact us today to talk to an expert about the best way to architect your streaming workflow or start a free trial.





About Traci Ruether

Traci Ruether is a Colorado-based B2B tech writer with a background in streaming and network infrastructure. Aside from writing, Traci enjoys cooking, gardening, and spending quality time with her kith and kin.