Low-Latency Streaming: What Is It & Do You Need It?
July 12, 2016 by Candace Cunningham
Here's a dirty secret: when it comes to media, live rarely means live. Say you're at home watching a live-streamed concert and you see an overexcited audience member jump onstage; the audience at the concert venue saw that happen at least 30 seconds before you did. That's because it takes time to pass chunks of data from one place to another. That delay between when a camera captures video and when the video is displayed is latency.
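A concrete way to think about it: latency is simply the difference between a frame's capture time and its display time. Here's a minimal Python sketch of that idea; the function name and timestamps are illustrative, not from any real streaming API.

```python
from datetime import datetime, timedelta

def glass_to_glass_latency(capture_time: datetime, display_time: datetime) -> timedelta:
    """'Glass-to-glass' latency: the gap between the camera capturing
    a frame and the viewer's screen displaying it."""
    return display_time - capture_time

# Illustrative timestamps for the concert example above
capture = datetime(2016, 7, 12, 20, 0, 0)   # fan jumps onstage; camera captures it
display = datetime(2016, 7, 12, 20, 0, 34)  # home viewer sees it 34 seconds later
print(glass_to_glass_latency(capture, display).total_seconds())  # 34.0
```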
What Is Low Latency?
So, if several seconds of latency is normal, what is low latency? It's a subjective term. By default, latency with the super-popular Apple HLS streaming protocol is 30–45 seconds (more on this below), so when people talk about low latency, they are often speaking of whittling that down to single-digit seconds. However, the term low latency also covers what's often termed real-time streaming, in which case we're talking about milliseconds.
When Is Low Latency Important?
No one wants notably high latency, of course, but in what contexts does low latency truly matter? For most streaming scenarios the regular 30–45-second delay isn't problematic. Returning to our concert example, it's irrelevant that the lead guitarist broke a string 36 seconds ago and you're just now finding out. But for some streaming use cases, latency is a business-critical consideration.
Let's take a look at a few streaming use cases where low latency is undeniably important.
Second-screen experiences—If you're watching an event on TV as well as on a second-screen app, you can tell quickly if there's a latency issue, and it won't make you happy. Imagine that your alma mater offers a second-screen-experience app to let you see alternate camera angles and exchange comments with other fans. The game-winning score happens on TV, but it doesn't stream to your app until nearly a minute later. The moment for exchanging comments about the winning play in the app has passed. However, the sweet spot for latency here is not the ultra-low, "real-time" latency we'll discuss next. That's because there's latency for the television broadcast, too—if you’re watching on digital cable, as most households do now, the broadcast latency can be as much as six seconds. Your second-screen app just needs to match that level of latency to deliver a fantastic experience that's in sync with the televised content.
Video chat—This is where ultra-low-latency "real-time" streaming comes into play. We've all seen televised interviews where the reporter is speaking to someone at a remote location, and the latency in their exchange results in long pauses and the two parties talking over each other. That's because the latency goes both ways—maybe it takes a full second for the reporter's question to make it to the interviewee, but then it takes another second for the interviewee's reply to get back to the reporter. That conversation can quickly turn painful. When true immediacy matters, about 150 milliseconds (roughly one-seventh of a second) of latency in each direction is the upper limit. That's short enough to allow for smooth conversation without awkward pauses.
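The arithmetic behind that limit is simple: the delay applies in each direction, so a conversational exchange pays roughly double the one-way figure. A quick sketch (the 150 ms figure comes from the text above; the function itself is illustrative):

```python
ONE_WAY_LIMIT_MS = 150.0  # upper limit for smooth conversation, per the text above

def round_trip_ms(one_way_ms: float) -> float:
    """A question-and-answer exchange pays the latency cost in both
    directions: once for the question, once for the reply."""
    return 2 * one_way_ms

# Even at the limit, each exchange already carries 300 ms of dead air
print(round_trip_ms(ONE_WAY_LIMIT_MS))  # 300.0
```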
Betting and bidding—Activities like auctions and racetrack betting are exciting because of their fast pace. And that speed calls for real-time streaming. For instance, horse-racing tracks have traditionally piped in satellite feeds from other tracks around the world and allowed their patrons to bet on them online. Satellite delays and costs can both be high. Ultra-low-latency streaming eliminates problematic delays and cuts costs. Similarly, online auctions are big business, and any delay can mean bids aren't recorded properly. Fractions of a second make all the difference.
Video gaming—Anyone who has yelled, "this game cheats!" (or more colorful invectives) at a screen knows that timing is critical for gamers. Sub-100-millisecond latency is a must. No one wants to play a game via a streaming service and discover that they're firing at enemies that are no longer there.
How Does Low-Latency Streaming Work?
Now that you know what low latency is and when it's important, you're probably wondering how you can deliver it. As with most things in life, low-latency streaming involves tradeoffs. You'll have to balance three factors to find the mix that's right for you:
Encoding protocol and device/player compatibility
Audience size and geographic distribution
Video resolution and complexity
The streaming protocol you choose makes a big difference, so let's dig into that:
Apple HLS is among the most widely used streaming protocols due to its reliability, but it's not suited to true low-latency streaming. That's because HLS is an HTTP-based protocol, which means it streams chunks of data. Because each video chunk is generated and viewed in real time, the chunk size plays a big part in latency. The default Apple HLS chunk size is 10 seconds, leading to latency of up to 45 seconds, as mentioned above. Customization can cut that significantly, but it will never be sufficient for an ultra-low-latency scenario. Exacerbating the problem, your viewers will experience more buffering the smaller you make those chunks.
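A back-of-the-envelope model shows why chunk size dominates HLS latency: players commonly buffer several full segments before starting playback, on top of encoding and delivery time. The buffer count and delivery overhead below are illustrative assumptions for the sake of the estimate, not Apple-published constants.

```python
def estimated_hls_latency(segment_seconds: float,
                          buffered_segments: int = 3,
                          encode_deliver_seconds: float = 5.0) -> float:
    """Rough glass-to-glass latency for chunked HTTP streaming:
    the segments the player buffers before playback begins,
    plus assumed encoding and CDN delivery overhead."""
    return segment_seconds * buffered_segments + encode_deliver_seconds

print(estimated_hls_latency(10.0))  # 35.0 -> default 10-second segments
print(estimated_hls_latency(2.0))   # 11.0 -> shorter segments cut latency,
                                    #         but raise the risk of buffering
```

Under these assumptions, shrinking segments to a couple of seconds gets you into the 10-second range, which is why customization helps substantially but never reaches the millisecond territory of real-time protocols.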
RTMP (Real-Time Messaging Protocol) and WebRTC (Web Real-Time Communications) are the standards for low-latency streaming.
RTMP delivers good low-latency streaming, but requires a Flash-based player, meaning it is not supported on iOS devices (and soon won't work on many web browsers).
WebRTC is the emerging standard already deployed across many platforms, and all signs point to its likely native adoption by Apple for iOS. WebRTC allows for low-latency delivery in an HTML5-based, Flash-free environment.
Another major consideration is your streaming server or cloud service. You'll want streaming technology that provides you with fine-grained control over latency and video quality, and provides you with the greatest flexibility.
Wowza technology was designed for multiple types of streaming: high scale as well as low latency. Why is this mix important? It gives you the best of both worlds: low-latency streaming with the ability to scale delivery to audiences of any size.
Watch part one of our four-part low latency whiteboard series now to learn about quality of service, audience size, geographic scope, and more. Then delve into the remaining whiteboard sessions for more on how low latency works, and the Wowza approach to low-latency streaming.
See also Verizon's published network latency statistics: http://www.verizonenterprise.com/about/network/latency/