
What is Video Bandwidth? Streaming Bandwidth Explained

By Team Agora

There's plenty of bandwidth on the internet, yet worldwide internet use has seen an unprecedented uptick as organizations shift to remote work. This spike has many companies relying on real-time voice, video chat, and streaming, which drives up operational costs. Since bandwidth is one of the most critical factors in a live streaming solution, knowing how much video bandwidth you need is crucial.

What is Video Bandwidth?

Bandwidth measures a network's capacity to send and receive data. On the internet, bandwidth refers to the maximum amount of data that can be transferred in a given period, and it is typically measured in megabits per second (Mbps).

The amount of upload bandwidth you need increases when you send video to the web. The quality of your outgoing broadcasts depends on your upload bandwidth, which you can find by checking your network settings.

As a developer, you need to understand how bandwidth relates to the end-user experience, especially if video calls, voice calls, live streaming, or chat are part of your service. Doing so will allow you to optimize video bandwidth and deliver an elevated quality of experience (QoE).

With that in mind, let's take a look at the parameters behind streaming video bandwidth, their impact on the user experience, and how Agora's technology can help developers.


Bitrate

The amount of data a video transmits in a given time is known as its bitrate.

Bitrate is the rate at which your video data transfers from the server to viewers, and it is generally expressed in bps, Kbps, or Mbps.

The quality of a live stream will be impacted by the bitrate used. Video quality degrades in the form of stuttering, buffering, or complete interruption when the bitrate surpasses available bandwidth (at any point along the channel). It won’t take long for users to close their session if the image quality drops. Finding the optimal compromise between file size (bitrate), video quality (resolution), and delivery latency (ping) has a significant impact on user engagement when it comes to live video streaming.
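As a rough illustration of that tradeoff, a quick check of whether a stream's bitrate fits within the available bandwidth might look like the sketch below. The function name and the 20% headroom figure are assumptions for illustration, not values from the article.

```python
def stream_fits(bitrate_mbps: float, bandwidth_mbps: float, headroom: float = 0.2) -> bool:
    """Return True if the stream's bitrate fits within the available
    bandwidth while leaving safety headroom (default 20%) for
    fluctuations, protocol overhead, and other traffic."""
    return bitrate_mbps * (1 + headroom) <= bandwidth_mbps

# A 5 Mbps 1080p stream on a 10 Mbps link leaves comfortable headroom.
print(stream_fits(5.0, 10.0))   # True
# The same stream on a 5.5 Mbps link risks stuttering or buffering.
print(stream_fits(5.0, 5.5))    # False
```

When the check fails, the usual remedies are lowering the bitrate, dropping the resolution, or reducing the frame rate.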

As a developer, delivering an interactive, real-time experience at full 1080p quality is often impractical, especially if you're striving for synchronicity.

Frame Rate

All video is composed of individual still images, referred to as “frames.” Shown in rapid succession, these frames give the impression of continuous movement. The number of frames per second (fps) delivered to the viewer is known as the frame rate. Videos come in a variety of frame rates, and creators typically match the frame rate to the content. For instance, most movies run at 24 fps, which creates a cinematic feel. Conversely, high-speed action scenarios, like a live sports game, rely on 60 fps to avoid artifacting or blurry video.

Live video streaming at 1080p typically requires at least 30 fps. A lower frame rate may be acceptable for streams with minimal action and movement, but a higher frame rate is needed for streams with a lot of activity, such as sports.

The relationship between frame rate and data transfer needs is roughly linear: the more frames per second a video has, the more bandwidth is needed.
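Treating that relationship as linear, scaling a known bitrate to a different frame rate can be sketched as below. The function name and baseline figures are illustrative assumptions.

```python
def scale_bitrate_for_fps(base_bitrate_mbps: float, base_fps: int, target_fps: int) -> float:
    """Estimate the bitrate needed at target_fps, assuming bandwidth
    scales linearly with frame rate (a simplification; codecs exploit
    inter-frame redundancy, so real-world growth is usually sub-linear)."""
    return base_bitrate_mbps * target_fps / base_fps

# If a 30 fps stream needs 4 Mbps, a 60 fps stream needs about 8 Mbps.
print(scale_bitrate_for_fps(4.0, 30, 60))  # 8.0
```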

Video Resolution

Several factors influence the final video resolution. The capturing device’s capabilities are the first factor. However, the specifics of a stream depend heavily on the parameters chosen for encoders, such as bitrate and frame rate.

It also stands to reason that the full HD video streaming bandwidth would be more taxing than the 720p streaming bandwidth since the file size would be larger to accommodate the higher resolution.

Video Encoding and Compression

Nearly all video streamed over the internet must be compressed, because uncompressed video files would be enormous. Streaming bandwidth optimization is achieved through compression: encoding the video stream so that it uses less data than the source. The core of most compression methods is the elimination of redundant or irrelevant information. If, for instance, the background remains the same from one frame to the next, there is no reason to transmit that data repeatedly.

For this reason, video sequences with little activity can tolerate much higher levels of compression (with little visible damage) than, say, a sporting event. Broadly, the bitrate scales with the resolution and the frame rate, minus whatever the compression saves. Frame rate, resolution, and maximum bitrate can typically be set in the video codec's configuration.
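One common rule of thumb ties those three knobs together by budgeting a certain number of bits per pixel after compression. The 0.1 bits-per-pixel default below is a typical assumption for H.264 at moderate motion, not a figure from the article; busier content needs more, static content less.

```python
def estimate_bitrate_mbps(width: int, height: int, fps: int,
                          bits_per_pixel: float = 0.1) -> float:
    """Rough bitrate estimate: pixels per frame x frames per second x
    bits spent per pixel after compression, converted to Mbps."""
    return width * height * fps * bits_per_pixel / 1_000_000

# 1080p at 30 fps with ~0.1 bpp lands near 6.2 Mbps.
print(round(estimate_bitrate_mbps(1920, 1080, 30), 1))  # 6.2
```

Such estimates are starting points; real encoders adapt the bitrate to the content's actual complexity.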

Bandwidth vs. Internet Speed

It’s questionable whether bandwidth and speed can be compared on equal footing. They’re related to one another, but they’re not the same. You need to look at your download speed and your upload speed to get a full picture of your Internet speed.

Download speed indicates how quickly data from the internet can reach your network. Upload speed is the rate at which information moves from a device on your local network out to the wider internet. Like bandwidth, both are typically measured in kilobits per second (Kbps) or megabits per second (Mbps).

Bandwidth is how much information can be sent in a given length of time. Throughput, by contrast, is how much data is actually delivered in that time once latency, packet loss, jitter, and network speed are factored in.

The efficiency of a network depends on the reliability with which data packets are delivered. Even if a lot of bandwidth is available, service delays will occur if it isn’t used efficiently.

A common assumption is that high-bandwidth networks are quick and have great throughput. In actuality, this is not always the case, especially once latency is factored into the equation. In a network, latency often results from the physical separation between the requesting client and the server processing its response. For example, website visitors in New Jersey can expect a response time of less than ten milliseconds if the site is hosted in a New York City data center. However, queries from users in San Francisco (about 2,900 miles away) will take significantly longer, at around 50 milliseconds round-trip.

Round-trip time (RTT) is the total time between a client sending a request and receiving the server's response. Because information must travel both to and from the destination, RTT is roughly twice the one-way latency.

It’s easy to assume that these milliseconds don’t account for much. But when you consider the amount of time it takes to establish a connection between the client and the server, the size and load time of the page, and the possibility of difficulties with the network equipment the data passes through, even a few extra milliseconds can add up to a significant delay.
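Those distance-based figures can be sanity-checked with a back-of-the-envelope propagation calculation. The sketch below assumes a signal travels through fiber at roughly two-thirds the speed of light (about 200,000 km/s, i.e. 200 km per millisecond); real routes add router hops and detours on top of this lower bound.

```python
SIGNAL_SPEED_KM_PER_MS = 200  # ~2/3 the speed of light, typical for fiber

def min_rtt_ms(distance_km: float) -> float:
    """Lower bound on round-trip time: the signal must cover the
    distance twice (to the server and back)."""
    return 2 * distance_km / SIGNAL_SPEED_KM_PER_MS

# San Francisco to New York is roughly 4,700 km (about 2,900 miles),
# giving a best-case RTT of about 47 ms, in line with the ~50 ms above.
print(round(min_rtt_ms(4700)))  # 47
```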

Distance between systems (as in a wide area network), the number of intermediate nodes, packet sizes, jitter, and network congestion are all contributors to latency.

For instance, Internet Exchange Points (IXPs) are where traffic crosses between networks. There, RTT increases by a few milliseconds because routers must process and route the data packets, sometimes breaking them into smaller packets.

A greater amount of delay is introduced into the system when data packets are dropped and resent due to any of the aforementioned causes. Retransmitting more data over longer distances uses more capacity and slows down the network.

Contending with latency, having a fast enough upload speed, and balancing bandwidth is crucial for live streaming.

Why is Video Bandwidth Important?

When a video freezes, a large download stalls, or a streaming service spins endlessly, bandwidth is often to blame. Consumers’ insatiable appetite for online information is increasing the demand for faster internet speeds and more streaming video bandwidth, but it’s necessary to pause and consider the factors behind this trend.

Since the early days of the Internet, consumers’ demands for bandwidth have steadily increased, with video and gaming being two of the primary drivers.

Bandwidth requirements often increase by roughly 50% annually. In five years, approximately 40 Mbps will likely be necessary to deliver a similar experience to what a light residential internet user can get by with now (where 5 to 10 Mbps may be sufficient). This is projected to reach 300 Mbps in ten years.
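Those projections follow from simple compounding. The sketch below uses the article's own figures (a 5 Mbps starting point and 50% annual growth); the function name is an assumption for illustration.

```python
def projected_bandwidth_mbps(current_mbps: float, years: int,
                             annual_growth: float = 0.5) -> float:
    """Project bandwidth needs forward assuming compound annual growth."""
    return current_mbps * (1 + annual_growth) ** years

# Starting from 5 Mbps at 50% annual growth:
print(round(projected_bandwidth_mbps(5, 5)))   # 38 (~40 Mbps in five years)
print(round(projected_bandwidth_mbps(5, 10)))  # 288 (~300 Mbps in ten years)
```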

Networks dependent on video-on-demand services, live-streaming, or environments with many early adopters are more likely to experience this kind of expansion. This exponential expansion has been documented all the way back to the beginning of the Internet.

The demand for increased bandwidth and faster speeds is ever-present. There is a growing selection of 4K videos available on streaming sites. The games industry shifted to distributing new titles via download, and services like Google Stadia have pushed further toward a streaming model.

Your web service’s capacity to handle the increasing number of simultaneous users will be tested as the number of users expands. Once again, additional bandwidth will be required to handle this influx. The capacity of your network must be sufficient to support the number of visitors to your application.

Real-time video broadcasts can be quite data-intensive. How much exactly will depend on a number of variables discussed above. However, let’s look a little deeper into some of the figures behind 720p, 1080p, and 4k video bandwidth.

Live video streaming at 720p and 30 frames per second requires an upload bitrate of up to about 4.16 Mbps. On paper, then, an upload speed of a little over 4 Mbps would suffice.

However, this is predicated on the concept that the user has a consistent upload speed of up to 4.16 Mbps. As mentioned before, upload speeds are unpredictable. Therefore, it’s best to leave a 35–40% buffer. That’s equivalent to about 5.7 Mbps of upload speed.

The amount of bandwidth necessary grows considerably at higher resolutions. For instance, 1080p video at 30 fps requires upload speeds between 3.8 Mbps and 7.4 Mbps, with 5 Mbps being the recommended bandwidth. This range increases to between 5.6 Mbps and 11 Mbps for streaming at 60 fps. To stream 4K at 30 fps, a user would need an upload speed between 15.8 Mbps and 41 Mbps, and shouldn't go below 25 Mbps.
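Putting the buffer rule together with these figures, a recommended upload speed can be sketched as below. The 40% default reflects the 35-40% buffer discussed earlier; the function name is an assumption, and exact bitrates vary by encoder and content.

```python
def recommended_upload_mbps(stream_bitrate_mbps: float, buffer: float = 0.4) -> float:
    """Recommended upload speed: the stream's peak bitrate plus a
    safety buffer for upload-speed fluctuations."""
    return stream_bitrate_mbps * (1 + buffer)

# 720p at 30 fps, peaking at ~4.16 Mbps:
print(round(recommended_upload_mbps(4.16), 1))  # 5.8
# 4K at 30 fps, at the 25 Mbps floor:
print(round(recommended_upload_mbps(25), 1))    # 35.0
```

At a 35% buffer the 720p figure comes out near 5.6 Mbps, which is where the ~5.7 Mbps estimate above sits.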

Grow with Agora

As the demand for video bandwidth grows, developers will need to take steps to accommodate it.

Our real-time voice, video chat, and live video streaming solutions are built for real-time exchanges. Agora's technology automatically makes adjustments to improve the user experience, including in low-bandwidth scenarios. For example, if a user has a poor internet connection on an older device, Agora can automatically adjust the video quality to keep the stream stable, mitigating the effects of jitter and packet loss in real time.

Contact us at 408-879-5885 or register on our website today to begin implementing our technology into your next web or mobile project.