Video Streaming Protocol Comparison
October 6, 2020
RTSP, HLS, WebRTC: the list goes on. It's challenging even for video engineers to differentiate between the many acronyms that get thrown around when discussing live streaming protocols. In this video, we compare the most common protocols used for live streaming, as well as the pros and cons of each.
Full Video Transcript:
A protocol is a set of rules that determines how live video and audio travel through a network. Generally, a stream starts from an encoder, gets ingested by a media server, and is then egressed to players around the world. Because the encoder and player have different needs, the same protocol usually isn't used for the whole process, so a media server is needed to transmux the stream into a different protocol and sometimes even transcode the data to use different codecs.
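To make the transmux/transcode distinction concrete, here's a minimal toy sketch in Python (the function and field names are invented for illustration and aren't part of any real media server API): transmuxing repackages the same compressed bits under a new protocol or container, while transcoding re-encodes them with a different codec.

```python
# Toy model of a media server's two jobs (all names hypothetical).

def transmux(stream: dict, protocol: str) -> dict:
    """Repackage the stream: same compressed bits, new protocol/container."""
    return {**stream, "protocol": protocol}

def transcode(stream: dict, codec: str) -> dict:
    """Re-encode the stream: decompress, then recompress with a new codec."""
    return {**stream, "codec": codec}

ingest = {"protocol": "RTMP", "codec": "H.264"}

egress = transmux(ingest, "HLS")     # cheap: the compressed video is untouched
print(egress)                        # {'protocol': 'HLS', 'codec': 'H.264'}

egress = transcode(egress, "H.265")  # expensive: a full decode and re-encode
print(egress)                        # {'protocol': 'HLS', 'codec': 'H.265'}
```

The point of the toy is the asymmetry: transmuxing only rewrites packaging, so it's fast, while transcoding touches every frame.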
On the encoder side, we have RTMP and RTSP, old-school protocols that have been around longer than most. On the playback side, we have HLS, Low-Latency HLS, and DASH. These are HTTP-oriented, made for the internet. WebRTC and SRT are newer protocols that are becoming more popular and can be used on both sides.
One other important thing to mention is the transport layer underneath these protocols: TCP and UDP. It's good to know which one is being used, as TCP works like a conversation between two people, with an introduction and a lot of back-and-forth interaction, whereas UDP is like a presenter speaking to an audience, sending data without waiting for a reply.
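A quick sketch (in Python, not part of the original video) shows the difference on a loopback connection: UDP just fires a datagram at an address, while TCP must complete a connection handshake before any payload moves.

```python
import socket
import threading

# UDP: no handshake -- a datagram is simply sent toward an address.
udp_rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
udp_rx.bind(("127.0.0.1", 0))          # let the OS pick a free port
udp_port = udp_rx.getsockname()[1]

udp_tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
udp_tx.sendto(b"frame-1", ("127.0.0.1", udp_port))  # fire and forget
data, _ = udp_rx.recvfrom(1024)
print("UDP got:", data)                # UDP got: b'frame-1'

# TCP: connect() performs a handshake before any data flows.
tcp_srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
tcp_srv.bind(("127.0.0.1", 0))
tcp_srv.listen(1)
tcp_port = tcp_srv.getsockname()[1]

def serve():
    conn, _ = tcp_srv.accept()         # handshake completes here
    conn.sendall(conn.recv(1024))      # echo back (the "conversation")
    conn.close()

t = threading.Thread(target=serve)
t.start()
tcp_cli = socket.create_connection(("127.0.0.1", tcp_port))
tcp_cli.sendall(b"frame-1")
echoed = tcp_cli.recv(1024)
print("TCP got:", echoed)              # TCP got: b'frame-1'
tcp_cli.close()
t.join()
```

This is why UDP suits latency-sensitive delivery (a lost packet is simply gone) while TCP suits reliable delivery (every byte is acknowledged, at the cost of waiting).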
Now that we have a little background, let's compare. Adobe RTMP is the most well-known live streaming protocol. Adobe used it between its dedicated streaming server and Flash Player, and it was heavily used for years. More recently, with the death of Flash, it's mainly used for ingest by media servers, where it has very wide support. The protocol uses TCP but requires very little buffering, keeping latency fairly low. On the downside, it's not optimized for quality of experience or scalability.
RTSP is mainly used by IP cameras today, especially for closed-circuit television. The protocol was designed to command media servers on a private network. It typically uses UDP, so it's very low latency, but it can also use TCP, and it's quite customizable. On the downside, it doesn't scale well, isn't reliable on poor networks, and isn't supported by HTML5. It excels on a private network, used between IP cameras, switchers, and media servers.
Apple HLS is the most-used protocol today when it comes to playback and media server egress. It's supported by most browsers, mobile devices, set-top boxes, and even smart TVs. It uses TCP and can adapt to available bandwidth. Supported by HTML5, HLS has become mainstream with the death of Flash, and it's now widely accepted. On the downside, with 10-second chunks and 3-chunk buffering, HLS carries 30 to 45 seconds of latency.
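As a rough illustration (a back-of-the-envelope sketch, not from the video), the latency floor falls straight out of the chunk math: if the player buffers three full segments before starting, each segment adds its whole duration to how far the viewer sits behind live, before any encoding or network delay is counted.

```python
# Back-of-the-envelope HLS latency: players commonly buffer several full
# chunks before starting playback, so each chunk adds its duration.
CHUNK_SECONDS = 10      # classic HLS default segment length
BUFFERED_CHUNKS = 3     # typical player buffer depth

min_latency = CHUNK_SECONDS * BUFFERED_CHUNKS
print(f"Minimum delay behind live: {min_latency}s")   # 30s

# Shorter chunks (the idea behind Low-Latency HLS) shrink that floor:
for chunk in (10, 6, 2):
    print(f"{chunk}s chunks -> at least {chunk * BUFFERED_CHUNKS}s behind live")
```

That 30-second floor, plus encoding, delivery, and playlist-refresh overhead, is where the 30-to-45-second figure comes from.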
Low-Latency HLS is on the horizon, designed to reduce the latency of HLS by using push-style HTTP delivery combined with shorter media chunks. However, this requires support from vendors of media servers, content delivery networks, and players. While it's still in development, we hope to see availability by the end of 2020.
DASH is often called MPEG-DASH, as it was developed by the Moving Picture Experts Group. It also uses TCP and is HTML5-compatible. It's a codec-agnostic, open standard supported almost everywhere except on Apple devices. Because it can't be used in Safari or on an iOS device, it creates a challenge for those looking to stream with it. With Low-Latency HLS soon to arrive and iOS representing over 25% of the mobile operating systems in use globally, it's doubtful DASH will be adopted by Apple any time soon.
WebRTC is open-source and was designed as a way for modern browsers to communicate with each other directly. It's peer-to-peer, requires a secure connection, and can use UDP or TCP. This protocol is ultra low latency, using a browser as the encoder for near real-time playback in a receiving browser. However, it doesn't scale well to a mass audience, and on a computer it can't be used outside of a browser for either encoding or playback. That said, there are libraries for native Android and iOS clients that offer the same functionality within an app that's usually done through the browser.
SRT is also open-source and is made for the first mile of streaming. It was developed by a number of leaders in live video streaming, including Haivision and Wowza. SRT was built to control the amount of latency of a stream, eliminate issues like jitter from packet loss over poor networks, and offer 256-bit AES encryption. SRT employs its own packet-loss recovery method on top of UDP, which can be tuned if necessary, and it's been slowly gaining acceptance within the industry. It isn't supported by HTML5 players yet, but it may become a more standard option across the workflow over time.
That's it for live video streaming protocols. While RTMP in to HLS out is still the most widely used workflow, other available options are gaining traction. If you need help determining what might work best for your needs, please contact us at Wowza.