Live streaming has become an important part of Internet entertainment and social life. If you are a developer working on live streaming products, here are five things you need to know about live streaming protocols.
1. Main live streaming protocols and their properties
RTMP stands for Real-Time Messaging Protocol. It is a TCP-based streaming protocol. RTMP streams can be produced by an RTMP encoder such as OBS (Open Broadcaster Software) and published to various destinations, such as a streaming cloud server, an online live streaming platform, or a CDN. RTMP latency is about 2 seconds.
ZEGOCLOUD provides an interactive live streaming solution that uses the RTMP protocol for CDN broadcasting yet can achieve a latency of about 1 second. It combines acceleration over ZEGOCLOUD’s proprietary data transmission network with CDN distribution using RTMP. Please refer to the documentation article “the difference between ZEGOCLOUD live streaming and common CDN plus RTMP live streaming solution” to learn more.
HLS stands for HTTP Live Streaming. It is an HTTP-based adaptive bitrate streaming protocol. OBS can be used to ingest HLS streams as well, and you can likewise publish HLS streams to a streaming cloud server, an online live streaming platform, or a CDN. Audiences typically watch HLS streams in a web browser. HLS latency is usually more than 10 seconds.
SRT stands for Secure Reliable Transport. It is a UDP-based open-source streaming protocol. SRT streams can be ingested with an SRT-capable encoder such as OBS. The destination of an SRT stream must be a server or cloud service that supports SRT, and some CDN operators have announced support for it. Playback of SRT streams must be done in a native player: SRT is UDP-based, and web browsers currently offer no native SRT support. SRT latency can be less than 1 second.
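To put the three protocols side by side, here is a minimal Python sketch of the properties described above. The latency figures are the approximate values quoted in this article, not guarantees, and the structure is purely illustrative:

```python
# Approximate properties of the three protocols, as described in this article.
PROTOCOLS = {
    "RTMP": {"transport": "TCP",  "typical_latency_s": 2,  "browser_playback": False},
    "HLS":  {"transport": "HTTP", "typical_latency_s": 10, "browser_playback": True},
    "SRT":  {"transport": "UDP",  "typical_latency_s": 1,  "browser_playback": False},
}

def latency_ranking():
    """Return protocol names ordered from lowest to highest typical latency."""
    return sorted(PROTOCOLS, key=lambda p: PROTOCOLS[p]["typical_latency_s"])

print(latency_ranking())  # ['SRT', 'RTMP', 'HLS']
```

As the ranking shows, SRT is the low-latency option, while HLS trades latency for its universal browser playback.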
A live stream’s journey includes publishing, acceleration, and broadcasting. The live streaming protocols discussed here fit into the high-concurrency broadcasting stage, with the support of a CDN. To learn how an interactive live streaming system works, you can refer to this blog article: “How Does a Truly Interactive Live Streaming System Work“.
2. Background and trends of the live streaming protocols
RTMP was developed as a proprietary protocol for streaming media data over the Internet. It was created by Macromedia, which was later acquired by Adobe. Although the Adobe Flash Player has become obsolete, RTMP still plays an important role in live streaming. Today, RTMP is one of the most common streaming protocols because it is supported by low-cost encoders such as OBS as well as by most CDNs. However, support for RTMP on the playback side is far less common.
HLS was developed by Apple. Having learned its lesson from the QuickTime service, Apple designed HLS as an open, evolving specification that is regularly upgraded with new features. It is one of the most common protocols for live streaming, given that it is an open standard widely supported by mainstream browsers such as Google Chrome and Apple Safari.
SRT is a relatively new streaming protocol and is expected by many to be the live streaming protocol of the future. It was developed by Haivision, a leading video streaming and encoding solution provider. SRT is open source, secure, and reliable, and it can achieve low latency. That said, it still lacks broad support for playback and, to some extent, ingestion.
Generally speaking, RTMP remains widely used in live streaming because its latency is relatively low and it is well supported by low-cost encoders. HLS is widely used too and may keep trending up, since it is strongly backed by Apple and fully supported by mainstream browsers. SRT could become the dominant live streaming protocol if support for its encoding and playback keeps growing and it is widely adopted across the technology ecosystem.
3. The pros and cons of the live streaming protocols
RTMP was chosen by the market as one of the most commonly used streaming protocols for several reasons. First, it can achieve a relatively low latency of 1 to 2 seconds, much better than HLS. Second, it is supported by a variety of low-cost encoders, including OBS and FFmpeg, which are free, open-source software. Furthermore, it is widely supported by mainstream CDNs. That said, RTMP’s future is limited: the Flash Player is obsolete, and RTMP lacks playback support. At the end of 2020, Adobe ended support for the Flash Player, marking the end of the legendary protocol’s growth. In addition, RTMP cannot achieve ultra-low latency, say less than 300 milliseconds, which is possible for UDP-based streaming protocols. Finally, RTMP is not supported by HTML5 and cannot be used in web browser scenarios.
HLS is the most widely used streaming protocol on the Internet. Why is it so popular? First, it has nearly universal playback support. Although it was created by Apple for iOS devices, it is now supported by all mainstream web browsers, including Apple Safari, Google Chrome, and Firefox, and by a wide range of devices: iOS and Android devices, smart TVs, set-top boxes, and more. Second, since it is an HTTP-based protocol, you don’t have to worry about firewall traversal. Furthermore, HLS is an adaptive bitrate streaming protocol: it dynamically adjusts the quality of the video stream according to the available bandwidth and device performance to deliver the best possible streaming experience. The downside of HLS is its relatively high latency, typically more than 10 seconds, which makes it unfit for interactive live streaming.
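Adaptive bitrate works through a master playlist that lists several renditions of the same stream; the player picks the variant that fits its current bandwidth. Below is a minimal illustrative master playlist (the rendition paths, bandwidths, and resolutions are made-up examples, not values from any particular service):

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
mid/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
high/index.m3u8
```

Each referenced variant playlist then lists the actual media segments for that quality level.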
Now, let’s take a look at SRT. SRT is said to be a live streaming protocol of the future for several reasons. First, it is an open-source protocol, and Haivision has created the SRT Alliance, a group of technology companies and telecommunication operators dedicated to promoting SRT adoption. Second, as a UDP-based streaming protocol it can achieve ultra-low latency of less than 1 second, and it delivers high reliability by using ARQ (Automatic Repeat reQuest) and FEC (Forward Error Correction) to detect and correct errors in data transmission. However, SRT is still a relatively new protocol and is not widely supported yet.
4. Why are the protocols different in latency performance?
From what we’ve discussed previously, you may have already noticed that the three protocols offer different levels of latency. So, let’s go a little further to see why they have such a difference in latency performance.
Let’s start with HLS, which has the highest latency of the three. Why? First and most importantly, HLS works by segmenting the stream into a sequence of small HTTP-based file downloads (referred to as segments or chunks), and Apple recommends six-second segments with at least 3 segments buffered before playback can start, which alone results in roughly a 20-second delay. Second, CDN delivery introduces another 15 to 30 seconds of latency. Overall, HLS streaming latency typically falls in the range of 10 to 45 seconds.
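The segment-buffering part of that delay is simple arithmetic, sketched below using the recommended values quoted above:

```python
# Rough HLS startup-delay estimate based on the recommendations quoted above:
# six-second segments, at least 3 segments buffered before playback starts.
SEGMENT_DURATION_S = 6
MIN_BUFFERED_SEGMENTS = 3

# The player waits for the buffered segments before playback can begin,
# so segmentation alone imposes this floor on latency.
startup_delay = SEGMENT_DURATION_S * MIN_BUFFERED_SEGMENTS

print(f"Minimum startup delay: ~{startup_delay} s")  # ~18 s, i.e. about 20 seconds
```

Shrinking the segment size (as Low-Latency HLS does) lowers this floor, at the cost of more requests and less efficient delivery.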
RTMP’s latency is much lower than that of HLS but higher than that of SRT. First, RTMP is a TCP-based protocol, so its media transmission is more efficient than an HTTP-based protocol like HLS, though it still cannot match a UDP-based protocol like SRT or a proprietary protocol. Second, RTMP packets are not cut into segments but transmitted as a continuous stream, so unlike HLS there is no intrinsic latency caused by segment duration. However, RTMP still relies on a CDN for high-concurrency delivery, which accounts for a large part of its latency. With RTMP, streaming latency falls in the range of 1 to 3 seconds, still not ideal for interactive live streaming.
SRT delivers ultra-low latency of less than 1 second, primarily because it is a UDP-based protocol. UDP is a connectionless protocol that requires no handshakes: data is processed in order of arrival without sequence verification, and a simple checksum verifies data integrity. UDP-based protocols therefore have lower latency. UDP itself is unreliable, though: packets may arrive out of order or be lost entirely. SRT adds a loss-recovery layer on top, using techniques such as ARQ and FEC to ensure reliability. SRT’s ultra-low latency makes it suitable for interactive live streaming.
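To illustrate the ARQ idea (this is a toy sketch of the concept, not SRT’s actual wire format or API), a receiver can track packet sequence numbers and request retransmission of only the gaps it detects:

```python
def find_missing(received_seqs, expected_count):
    """Return the sequence numbers to request again (NAK), ARQ-style.

    received_seqs: sequence numbers of packets that actually arrived.
    expected_count: how many packets the sender has emitted so far.
    """
    return sorted(set(range(expected_count)) - set(received_seqs))

# Packets 2 and 5 were lost in transit; the receiver asks for just those,
# rather than stalling the whole stream the way TCP retransmission can.
arrived = [0, 1, 3, 4, 6, 7]
print(find_missing(arrived, 8))  # [2, 5]
```

Selective retransmission like this, combined with FEC for recovering small losses without any round trip, is what lets SRT stay reliable while keeping latency low.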
5. How to choose the most suitable live streaming protocol?
Now that we have some idea of the differences among the three protocols (HLS, RTMP, and SRT), how do you choose the most suitable one for your scenario?
There are two important factors to consider: first, does your application require live interaction? And second, do your users access your service through web browsers or native apps?
If you need to build interactive live streaming, meaning your app lets users interact in real time via audio and video during a live streaming session, you should consider SRT for its ultra-low latency. Since SRT lacks playback support in web browsers, you will have to build your own native client app.
If you don’t need live interactions and your users mostly access your service from web browsers, then HLS is the natural choice.
If you don’t need interactions and your users are mainly on your native client app, then RTMP might be the right choice for its relatively low latency and the low infrastructure cost of using a CDN for stream distribution.
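The decision logic above can be condensed into a small helper. This mirrors the two questions in this article; real deployments often mix protocols (for example, SRT ingest with HLS delivery), so treat it as a starting point rather than a rule:

```python
def choose_protocol(interactive: bool, web_browser: bool) -> str:
    """Pick a streaming protocol from the two factors discussed above."""
    if interactive:
        return "SRT"   # ultra-low latency; requires a native client app
    if web_browser:
        return "HLS"   # universal browser playback
    return "RTMP"      # low latency, low CDN cost, native-app playback

print(choose_protocol(interactive=False, web_browser=True))  # HLS
```

Note that interactivity dominates the choice: if you need real-time interaction, latency requirements rule out HLS regardless of where your users are.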
Finally, let me mention that ZEGOCLOUD offers a premium live streaming solution that delivers outstanding live audio and video quality with ultra-low latency of less than 300 milliseconds. Check out the live streaming product page for more information. If you have any questions or need further information, you are more than welcome to contact us.