Observations: The Power of SRT + Latency in Bandwidth Constrained Situations

Update (10/11/21): Added a note discussing the max supported latency of SRT.

So to start this out, I need to establish some context: over the past three or four days, I've been having internet issues at home, particularly with the upload connection.

With that in mind, early Sunday evening I was messing around and testing some off-site streaming decoders. I needed to send out a stream to test with, but quickly realized that wasn't a very feasible option with my internet in the state that it was.

However, out of curiosity, turning to my recent experience with the SRT protocol, I decided to try something just to see if it would work, and the results shocked me.

So what was that 'something'? We'll get to that.

The bulk of the rest of this story will be told through text and video, to better illustrate things.

I would recommend that you check out the videos in full-screen for clarity.

Also, please mind that the player controls don't block anything on screen, as there are a few points in the videos where information is displayed at the very bottom of the screen.

So to start out, let's take a look at my upload speed prior to this little experiment, taken just minutes before on the same PC, over the same connection.


There's one additional thing to note: I'll be using a lot of latency with SRT. SRT allows you to define latency, and I'm taking advantage of that here. I'm not an expert, but as I understand it, the latency setting defines a receive buffer window that gives SRT time to retransmit packets that get lost along the way.

You can adjust the latency as shown in the video; note that in OBS's stream URL the latency value is given in microseconds (many other SRT tools take milliseconds instead).

Update: Someone in a Facebook group where I shared this article mentioned that the max supported latency of SRT, per the specification, is 8 seconds. Based on some quick Googling, this seems right: 20 ms minimum, 8,000 ms maximum. I haven't verified it in any technical sheets, but it's what every manufacturer and developer has listed on their site, so I'm going with it. Keep things at 8 seconds or less. I used 10 seconds in this test, and 12 in others, but those were just tests; better safe than sorry in production.
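For concreteness, here's how a latency value translates into the SRT URL. This is just a sketch with a placeholder host, assuming OBS's FFmpeg-based SRT output, where the `latency` URL parameter is expressed in microseconds:

```python
def srt_url(host: str, port: int, latency_seconds: float) -> str:
    """Build an SRT output URL. In FFmpeg-based tools (including OBS),
    the latency query parameter is expressed in microseconds."""
    latency_us = int(latency_seconds * 1_000_000)  # seconds -> microseconds
    return f"srt://{host}:{port}?latency={latency_us}"

# 8 seconds -> 8,000,000 microseconds (the spec's stated maximum):
print(srt_url("decoder.example.com", 9999, 8))
# srt://decoder.example.com:9999?latency=8000000
```

The hostname and port above are placeholders; swap in your decoder's address.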


And of course, we'll want to discuss the encoding settings. They'll be the same for both the SRT and RTMP tests: H.264, Superfast, 2.3 Mbps, 1280x720p, 96 kbps audio. Pretty standard stuff. Part of the reason I used OBS for this test is that it allows more precise tuning of encoding for SRT.

The encoding settings look like this:


So with that out of the way, let's get to the experiment: streaming on this connection, reliably, with two different protocols, RTMP and SRT.

And now on to the show. First up is RTMP; let's see how it performs.

RTMP Streaming Test


It's...not good. In fact, it's unusable: over 90% of the frames sent to the CDN were dropped. That's expected; we're trying to push 2.3 Mbps down a connection that's fluctuating between 400 and 900 Kbps, and God only knows what the packet loss and jitter are like.
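A quick back-of-the-envelope check makes that failure unsurprising (rough numbers from the figures above; since RTMP runs over TCP, the encoder drops frames when the socket can't keep up):

```python
# Rough arithmetic: how much of a ~2.4 Mbps stream (2.3 Mbps video
# + 96 kbps audio) fits through a pipe at the observed extremes?
stream_kbps = 2300 + 96
for pipe_kbps in (400, 900):
    print(f"{pipe_kbps} Kbps pipe: ~{pipe_kbps / stream_kbps:.0%} of the stream fits")
# 400 Kbps pipe: ~17% of the stream fits
# 900 Kbps pipe: ~38% of the stream fits
```

With only 17-38% of the bytes fitting through, a 90%+ frame-drop rate is easy to believe once you account for frames depending on earlier frames.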

But, just for laughs, we're going to try the test again with SRT, and as discussed previously, we're going to give it a bunch of latency, 10 seconds in this instance.

SRT Streaming Test


It's kind of night and day isn't it? One works and one...doesn't. But more than that, the SRT stream is actually pretty flawless. There's very little quality degradation and, aside from the increased latency, it looks just like a regular RTMP stream would in this instance.

I can't fully explain how this works: how I was able to effectively push 2.3 Mbps through a pipe that was being throttled down to 400 Kbps at times. I'd guess it's a combination of the latency, the buffer, and SRT's error correction.
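For what it's worth, SRT's core recovery mechanism is ARQ: the receiver asks for retransmission of lost packets, and the latency window is the time budget for those retries. Here's a toy sketch of the idea (this is an illustration, not SRT's actual implementation, and the loss and round-trip numbers are made up):

```python
import random

def simulate(loss_rate: float, latency_window_ms: int, rtt_ms: int, n_packets: int) -> float:
    """Toy ARQ model: a lost packet can be retransmitted as long as
    another round trip still fits inside the latency window."""
    random.seed(42)  # deterministic for repeatability
    delivered = 0
    for _ in range(n_packets):
        time_left = latency_window_ms
        while time_left >= rtt_ms:       # room for (another) attempt?
            if random.random() > loss_rate:
                delivered += 1           # packet made it through
                break
            time_left -= rtt_ms          # a retry costs one round trip
    return delivered / n_packets

# 30% packet loss, 300 ms round trip:
print(simulate(0.30, latency_window_ms=8000, rtt_ms=300, n_packets=10000))  # ~1.0
print(simulate(0.30, latency_window_ms=300,  rtt_ms=300, n_packets=10000))  # ~0.7
```

The point: with an 8-second window and a 300 ms round trip, a lost packet gets dozens of chances to be resent, so even heavy loss gets repaired before playback needs the data. With no extra latency, every loss is permanent.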

I likewise cannot guarantee this will be the case for you, or be replicable over time. But I've had similar experiences with SRT previously, and as someone who deals with a lot of customers broadcasting from rural areas, this is very encouraging. I'll continue experimenting with what can be accomplished with this protocol in bandwidth-challenged situations.

Michael Wilson