Whether you’re uploading video to YouTube, streaming live on social media, or running your own streaming platform, video quality matters. Since the internet can’t handle streaming raw video’s extremely large file sizes, the name of the game in streaming is encoding video to maximize efficiency while maintaining quality.
To that end, there are two important settings you’ll have to select whenever you’re encoding video: resolution and bitrate. The question is, which setting makes the most difference when it comes to streaming quality? To understand the answer, let’s first understand what each setting means.
Bitrate determines the compression quality of a video file when it’s encoded. To explain, encoding video for streaming purposes simply means taking the raw video file and compressing it – making it smaller – by throwing out data that (hopefully) isn’t needed.
The more you compress a file (aka the more data you throw out), the more likely the effect will be noticeable to viewers. So, all things being equal, more compressed files have lower quality.
Bitrate is a measure that tells the encoder how much to compress. It’s expressed in kilobits per second (Kbps) or megabits per second (Mbps). For example, encoding at 3Mbps means there will be three megabits of data per second of video. A higher bitrate, say 5Mbps, means more data per second, and therefore less compression.
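Since bitrate is simply data per second, you can estimate how large an encoded stream will be. Here’s a quick illustrative sketch in Python (the helper name is ours, and the math ignores audio and container overhead):

```python
def stream_size_mb(bitrate_mbps: float, duration_s: float) -> float:
    """Rough encoded video size in megabytes: megabits per second
    times seconds, divided by 8 bits per byte."""
    return bitrate_mbps * duration_s / 8

# One minute of video encoded at 3Mbps is roughly 22.5 MB:
print(stream_size_mb(3, 60))  # 22.5
```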
To make things slightly more complicated, there are three ways an encoder can handle bitrate: constant bitrate (CBR), which holds the data rate fixed for the whole video; variable bitrate (VBR), which lets the rate rise and fall with the complexity of each scene; and average bitrate (ABR), which varies the rate while targeting a set average.
Setting a high bitrate will yield high-quality video but will take longer to encode, resulting in a larger file. Variable bitrates can help keep file size more manageable while still achieving good quality. For reference, bitrates around 2-3Mbps are workable for video intended for smartphones only. For content viewed on TVs and other bigger screens, 4-6Mbps would be better.
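To make those rules of thumb concrete, here’s a tiny hypothetical Python helper that just encodes the 2-3Mbps and 4-6Mbps ranges from this article (the labels and numbers are our guidelines, not an industry standard):

```python
def suggested_bitrate_mbps(target_screen: str) -> tuple[float, float]:
    """Workable bitrate range (in Mbps) for a viewing context,
    per the rough guidelines above."""
    ranges = {
        "smartphone": (2.0, 3.0),  # small screens tolerate more compression
        "tv": (4.0, 6.0),          # big screens need more data per second
    }
    return ranges[target_screen]

print(suggested_bitrate_mbps("tv"))  # (4.0, 6.0)
```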
Resolution is simpler to understand. It’s simply a measure of how big your video is. This is measured in pixels and is expressed either as a two-dimensional number like 1920x1080 (width is always first) or in simplified form with just the height, like 1080p. Thus, 1920x1080 resolution is 1920 pixels wide by 1080 pixels high.
For reference, 1080p (or 1920x1080) is “Full HD”, and 4K is 3840x2160.
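Counting pixels shows how quickly resolution scales: doubling both dimensions quadruples the data the encoder has to handle. A quick check in Python:

```python
def pixel_count(width: int, height: int) -> int:
    """Total pixels per frame: width times height."""
    return width * height

full_hd = pixel_count(1920, 1080)  # 2,073,600 pixels
uhd_4k = pixel_count(3840, 2160)   # 8,294,400 pixels

# 4K is exactly four times as many pixels as Full HD:
print(uhd_4k // full_hd)  # 4
```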
The thing to understand about video resolution is that it’s only as good as the screen outputting it. So, if you have a 4K video file and it’s played on a screen that can only handle HD, the end result for the viewer is HD. Therefore, if you know your video will never be viewed in 4K, there’s no reason to encode it at 4K.
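That “only as good as the screen” rule boils down to a minimum: the viewer effectively sees whichever is smaller, the video’s resolution or the display’s. A one-line Python sketch (the function name is illustrative):

```python
def effective_height(video_height: int, screen_height: int) -> int:
    """The resolution a viewer actually experiences is capped by
    both the video file and the screen playing it."""
    return min(video_height, screen_height)

# A 4K (2160p) video on a Full HD (1080p) screen is seen at 1080p:
print(effective_height(2160, 1080))  # 1080
```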
As a rule of thumb, know your audience and release your videos at the highest resolution viewers are likely to be capable of viewing. If a viewer’s screen is only 720p, 1080p video will scale down and look quite crisp. On the other hand, if your video is only 360p and it’s played on an HD screen, it will be stretched up with a poor-quality, grainy appearance.
As you’ve probably realized, neither bitrate nor resolution is more important. Both work together (along with other factors) in determining how well a video performs online. Therefore, consider your audience and your content to determine which settings to use, and be sure to test before you release.
When you’ve got that down and you’re ready to take the next step in creating your own custom streaming empire, contact MAZ to talk about the endless possibilities.