News, sports, online gaming, user-generated content (UGC), web conferencing: name the market, and live-streaming video is now ubiquitous in it. Content providers and end users across industries all want a piece of the real-time action. However, one complaint is still common from video streamers and viewers, despite nearly two decades of refining the technology: latency is too high.
Latency describes the delay between the time a live stream is captured on camera and the time it appears on a viewer’s screen. A significant amount of delay may be involved when live-streaming video, especially as compared to traditional TV broadcasts. Even though “live” cable feeds have five to 10 seconds of latency, viewers perceive the broadcast signal as being instantly available, in real time, as soon as they turn on their TV, and they expect live-streaming video to perform the same way.
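Glass-to-glass latency is the sum of delays introduced at every stage of the pipeline: capture and encoding, upload, packaging and delivery, player buffering, and decoding. The sketch below tallies a hypothetical latency budget for an HTTP-based live stream; the stage names and millisecond figures are illustrative assumptions, not measurements, but they show why player buffering typically dominates the total.

```python
# Illustrative only: a rough glass-to-glass latency budget for a typical
# HTTP-based live stream. All figures are hypothetical examples.
pipeline_ms = {
    "capture_and_encode": 500,
    "first_mile_upload": 300,
    "packaging_and_cdn": 700,
    "player_buffer": 6000,  # e.g. roughly three 2-second segments buffered
    "decode_and_render": 200,
}

total_ms = sum(pipeline_ms.values())
print(f"end-to-end latency: about {total_ms / 1000:.1f} s")
```

Shrinking any single stage helps, but the buffer term explains why segment-based streaming rarely matches the five-to-10-second latency of cable feeds without protocol-level changes.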
For many use cases, streaming in or near real time is crucial to the user experience. For example, game-streaming services and two-way conferencing platforms rely on low latency to deliver real-time interactions. For live-streaming news and sports platforms, latency must keep pace with cable and satellite TV broadcasts to deliver current coverage and prevent spoilers for viewers.
But this presents a challenge for the streaming industry. A balance must be struck between three conflicting constraints, only two of which can be met effectively: low latency, high resolution, and reliable playback under all network conditions. To complicate things further, universal browser support for Flash Player, long the leader in delivering low-latency audio and video streams, is fading fast.
So, what are the alternatives? Luckily, there are several options, depending on your use case. These include:
- Traditional streaming protocols (RTSP, RTMP)
- HTTP-based adaptive streaming protocols (HDS, HLS, Smooth Streaming, MPEG-DASH)
- Emerging technologies (WebRTC, WebSocket)
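The HTTP-based protocols above split the stream into short media segments listed in a playlist, and segment length is a major driver of latency, since players commonly buffer several segments before playback. A minimal sketch, assuming a simplified HLS media playlist (real players use a full M3U8 parser), extracts segment durations from `#EXTINF` tags to make that relationship concrete:

```python
# A minimal sketch of reading segment durations from an HLS media playlist.
# The sample playlist and the three-segment buffering rule of thumb are
# illustrative assumptions, not a full implementation of the HLS spec.
SAMPLE_PLAYLIST = """#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXTINF:6.0,
seg1.ts
#EXTINF:6.0,
seg2.ts
#EXTINF:6.0,
seg3.ts
"""

def segment_durations(playlist_text):
    """Return the duration in seconds of each segment listed in the playlist."""
    durations = []
    for line in playlist_text.splitlines():
        if line.startswith("#EXTINF:"):
            # "#EXTINF:6.0," -> 6.0 (ignore the optional title after the comma)
            value = line[len("#EXTINF:"):].rstrip(",").split(",")[0]
            durations.append(float(value))
    return durations

durs = segment_durations(SAMPLE_PLAYLIST)
# Many players buffer about three segments before starting playback,
# so a rough floor on latency is three times the segment duration.
print(f"approximate buffering latency: {3 * durs[0]:.0f} s")
```

With six-second segments, that buffering floor alone is well beyond broadcast latency, which is why low-latency variants of these protocols shorten segments or deliver them in smaller chunks.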
Written by Wowza Media Systems