As more people ditch traditional television in favor of mobile apps, smart TVs, and other devices, the way we stream live sports and other events is changing. What was once a strictly one-sided viewing experience now has the opportunity to become something entirely new. That’s why the Agora team is excited to announce our newest feature, which allows companies to insert external video streams into live broadcasts.
This service allows companies to create interactive experiences with their audiences. For example, a show’s host could invite the audience to share their thoughts and opinions during a live sports stream. Think of it as a video version of a Twitter conversation. A host could also pull external video streams (TV or game shows, video footage from drones, and of course, sports events) and insert them into the broadcast stream.
However, companies can face technical issues when inserting external streams into live broadcasts. Here are a few complications developers might encounter along the way and how they can address them:
Challenges of Inserting an External Video Stream into a Live Broadcast
Challenge 1: Transcoding the External Video Stream
External video streams with different formats, resolutions, frame rates, and bit rates have to be transcoded to accommodate different clients and network conditions. Most video streaming websites capture video in MP4, with resolutions ranging from 360p to 1080p. The transcoding process generates either a single video stream for CDN publication or multiple video streams at different resolutions, and the server decides the distribution strategy.
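To make the idea of generating multiple renditions concrete, here is a minimal sketch that builds ffmpeg-style command lines for transcoding one MP4 source into several H.264 renditions. This is purely illustrative: the rendition ladder, file names, and the `build_transcode_command` helper are assumptions for this example, not part of Agora's pipeline.

```python
# Illustrative sketch (not Agora's implementation): build ffmpeg commands
# that transcode one MP4 source into multiple H.264 renditions.

RENDITIONS = [
    # (name, output height, video bitrate in kbps, frame rate)
    ("360p", 360, 800, 30),
    ("720p", 720, 1500, 30),
    ("1080p", 1080, 3000, 30),
]

def build_transcode_command(source, name, height, bitrate_kbps, fps):
    """Return an ffmpeg argument list producing one rendition."""
    return [
        "ffmpeg", "-i", source,
        "-vf", f"scale=-2:{height}",   # keep aspect ratio, even width
        "-r", str(fps),                # normalize the frame rate
        "-c:v", "libx264",             # H.264 video
        "-b:v", f"{bitrate_kbps}k",    # target video bitrate
        "-c:a", "aac",                 # AAC audio
        f"{name}.flv",                 # FLV container for RTMP delivery
    ]

commands = [build_transcode_command("source.mp4", *r) for r in RENDITIONS]
```

The server can then pick which renditions to produce (one stream for direct CDN publication, or the whole ladder) based on its distribution strategy.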
Transcoding external video streams also removes inconsistencies in the frame rate, bit rate, and resolution between the host’s video stream and the external video streams and creates a composite layout of both video streams.
Ineffective transcoding can limit the supported external video stream formats (for example, to AVC/H.264), making it impossible to modify the video profiles. When combining the external video streams with the host’s video stream, developers must also consider the composite layout logistics and audio effect processing.
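The composite-layout logistics amount to computing screen regions for each stream. The sketch below shows one common arrangement, with the host full-frame and the external stream as a corner inset; the function name, inset ratio, and margin are hypothetical choices for illustration, not values from the Agora SDK.

```python
# Hypothetical composite-layout math: the host's video fills the canvas,
# and the external stream is placed as an inset in the bottom-right corner.

def composite_layout(canvas_w, canvas_h, inset_ratio=0.3, margin=16):
    """Return (x, y, w, h) regions for the host and the external stream."""
    host = (0, 0, canvas_w, canvas_h)          # host fills the canvas
    inset_w = int(canvas_w * inset_ratio)
    inset_h = int(canvas_h * inset_ratio)
    external = (
        canvas_w - inset_w - margin,           # x: flush to the right edge
        canvas_h - inset_h - margin,           # y: flush to the bottom edge
        inset_w,
        inset_h,
    )
    return host, external

host_region, external_region = composite_layout(1280, 720)
```

In a real service these regions would be handed to whatever mixer composes the final frame, and audio from both streams would be mixed in parallel.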
Challenge 2: Synchronizing the External and Local Video Streams
Both the external and local video streams are sent to the server to create a composite layout.
In addition to the composite layout, the synchronization between the commentary and the external video streams during transcoding must also be addressed.
The following figure depicts what the video stream synchronization looks like when using the Agora SD-RTN™ (Software Defined Real-time Network) as the server:
● The host or commentator creates a local stream (yellow) while simultaneously commentating on the external video streams.
● An external stream (blue) is transcoded and transmitted to the Agora SD-RTN.
● The local stream and the external streams are combined (red) to create a composite layout before being published to the CDN.
● The local and external streams are synchronized using their latencies and timestamps.
Challenge 3: Signaling Reliability
Requesting and transcoding the external video streams and combining the local and external video streams depend on a reliable signaling system. To ensure each user is assigned the nearest edge server, developers should deploy a network covering most backbone networks to improve reliability.
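Assigning each user the nearest edge server typically reduces to picking the server with the best measured network path. The snippet below is a toy illustration of that selection (choosing by lowest round-trip time); it is not Agora's routing logic, and the server names and RTT values are invented.

```python
# Toy illustration (not Agora's routing): assign a user the edge server
# with the lowest measured round-trip time (RTT).

def nearest_edge(rtt_ms_by_server):
    """Pick the edge server name with the lowest measured RTT."""
    return min(rtt_ms_by_server, key=rtt_ms_by_server.get)

# Hypothetical probe results for one user:
edge = nearest_edge({"us-west": 42.0, "eu-central": 110.0, "ap-south": 95.0})
```

A production system would also weigh server load and link stability, and keep re-probing, since the "nearest" server can change as network conditions shift.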
How to Use Agora’s SDK
So we’ve identified the challenges associated with embedding an external video stream into a live broadcast, but all isn’t lost. The “External Stream into a Live Broadcast” feature introduced in Agora SDK v2.1.1 removes these obstacles, allowing companies and developers to easily incorporate it into their existing services. Here’s how it works:
Processing the external video stream
- The host sends a signal requesting the transcoding server to pull video streams from an external source.
- The external video stream is transcoded and pushed to the Agora SD-RTN.
- The local video stream is then pushed to the Agora SD-RTN.
- The external and local video streams are combined by the Agora SD-RTN.
- The combined video stream is transcoded to the Real-Time Messaging Protocol (RTMP) and distributed to the CDN.
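The five steps above can be modeled as a simple pipeline trace. Everything in this sketch is hypothetical: the function, the log messages, and the URLs are stand-ins to show the flow, not Agora SDK calls (for the real API, see the Agora Developer Center).

```python
# Hedged sketch modeling the five steps above as a pipeline trace.
# All names and URLs are hypothetical, not Agora SDK APIs.

def run_injection_pipeline(external_url, rtmp_cdn_url):
    """Trace the signaling and media steps for injecting an external stream."""
    log = []
    log.append(f"signal: host requests pull of {external_url}")   # step 1
    log.append("transcode external stream -> push to SD-RTN")     # step 2
    log.append("push local (host) stream -> SD-RTN")              # step 3
    log.append("SD-RTN combines local + external streams")        # step 4
    log.append(f"transcode combined stream to RTMP -> {rtmp_cdn_url}")  # step 5
    return log

trace = run_injection_pipeline(
    "https://example.com/live/playlist.m3u8",   # e.g. an HLS source
    "rtmp://cdn.example.com/live/stream",       # CDN ingest endpoint
)
```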
This solution can also support a variety of scenarios. By inserting one or more HTTP Live Streaming (HLS) or RTMP streams into the broadcast, hosts and audiences can interact with each other while watching the same movie, game, or show. Video footage from drones or remote cameras can also be incorporated into broadcasts. In other words, the opportunities are virtually limitless. For the detailed API reference and implementation, see the Agora Developer Center.