Video metrics explained


Video is a sequence of individual images (frames), which are made of pixels.

Resolution

Resolution (or sometimes definition) is the number of pixels in a single frame. It is usually given as two numbers: the width and height of every video frame in pixels. A pixel is a single point of uniform color, small enough that our eyes do not perceive individual pixels in an image but rather see smooth contours and color variations. A “megapixel” is not a very big pixel but rather 1 million pixels, a common way to measure the amount of data in a video frame.

Some resolutions have common names, mostly for historical reasons. For example, the resolution of 640 by 480 pixels is often referred to as “VGA”, which stands for “Video Graphics Array”, an obsolete kind of computer graphics hardware. “HD” refers to “high definition”.

Picture size (pixels) | Common name | Megapixels
640 × 480             | VGA         | 0.3 Mpix
1280 × 720            | HD          | 0.9 Mpix
1920 × 1080           | FullHD      | 2.1 Mpix
3840 × 2160           | 4K          | 8.3 Mpix
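As a quick sanity check, the megapixel counts in the table can be reproduced by multiplying width by height and dividing by one million. A minimal Python sketch (the resolution list simply mirrors the table above):

```python
# Megapixels = width x height / 1,000,000, for the resolutions listed above.
RESOLUTIONS = {
    "VGA": (640, 480),
    "HD": (1280, 720),
    "FullHD": (1920, 1080),
    "4K": (3840, 2160),
}

for name, (width, height) in RESOLUTIONS.items():
    megapixels = width * height / 1_000_000
    print(f"{name:>7} {width} x {height} = {megapixels:.1f} Mpix")
```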

Frames per second (framerate)

Frames per second (FPS) is simply the number of images captured every second.

Historically, film was shot at 24 frames per second. This rate is loosely tied to the way the human visual system works: 24 images per second are just enough to create the illusion of smooth motion for the human eye. A lower framerate may create a perceivable “slideshow” effect.

Modern digital cameras can capture video at much higher framerates. An average smartphone camera can capture 60 frames per second or even more. It is usually hard to perceive the difference beyond 30 images per second, though; higher framerates are typically used for special effects such as slow motion, which is achieved by playing high-framerate footage back at a slower speed.
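To make the slow-motion arithmetic concrete, here is a small sketch; the framerates and clip length are illustrative, not taken from any particular camera:

```python
# Slow motion: capture at a high framerate, play the same frames back at a standard one.
capture_fps = 120      # framerate during recording (illustrative)
playback_fps = 30      # framerate during playback
clip_seconds = 2       # real-time length of the recorded clip

total_frames = capture_fps * clip_seconds
playback_seconds = total_frames / playback_fps
slowdown = capture_fps / playback_fps

print(f"{total_frames} frames played at {playback_fps} fps last "
      f"{playback_seconds:.0f} s: a {slowdown:.0f}x slow-motion effect")
```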

Bitrate

Bitrate is the amount of data processed or transmitted per unit of time in a digital video stream, measured in bits per second. In real-world scenarios this number can be very large, so it is convenient to use multiples of 1000:

Unit                | Acronym      | Equivalent amount
kilobit per second  | kbps or kb/s | 1,000 bits per second
megabit per second  | Mbps or Mb/s | 1,000,000 bits per second
gigabit per second  | Gbps or Gb/s | 1,000,000,000 bits per second

These units should not be confused with bytes per second (kilobytes, megabytes, and gigabytes per second, respectively). Since 1 byte is 8 bits, a rate expressed in bytes per second corresponds to 8 times as many bits per second: for example, 1 MB/s equals 8 Mb/s.
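Since the bit/byte mix-up is so common, here is a tiny conversion sketch (the example rates are arbitrary):

```python
# Convert bit-based rates (used for bitrate) to byte-based rates (used for file sizes).
BITS_PER_BYTE = 8

def mbps_to_megabytes_per_second(mbps):
    """Megabits per second -> megabytes per second."""
    return mbps / BITS_PER_BYTE

for rate_mbps in (5, 20, 100):
    print(f"{rate_mbps} Mb/s = {mbps_to_megabytes_per_second(rate_mbps):.2f} MB/s")
```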

How are resolution, framerate, and bitrate determined?

When a video is captured, the camera sensor is configured to capture images at a given resolution and number of frames per second. The sensor is arguably the most important part of the camera: it converts light into an electric signal, which then becomes a video we can store, send across the network, and watch. The maximum resolution and framerate of the camera are limited by the sensor capabilities. Within those limits, it is usually up to the camera user to choose the desired resolution and framerate.

Unlike the sensor’s resolution and framerate, which have physical limitations, bitrate is a parameter that can be adjusted based on factors such as available storage space, desired video quality, and transmission bandwidth. For example, when streaming video over the internet, the bitrate must be kept low enough for the viewer’s connection to ensure smooth playback.
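Bitrate also translates directly into storage: multiply by the duration to get bits, then divide by 8 to get bytes. A back-of-the-envelope sketch, with an illustrative one-hour recording at 10 Mb/s:

```python
# Estimate the storage needed for a recording at a given bitrate.
def storage_gigabytes(bitrate_mbps, duration_seconds):
    """Approximate file size in gigabytes (1 GB = 1,000,000,000 bytes)."""
    total_bits = bitrate_mbps * 1_000_000 * duration_seconds
    return total_bits / 8 / 1_000_000_000

# One hour of video at 10 Mb/s:
print(f"{storage_gigabytes(10, 3600):.1f} GB")  # -> 4.5 GB
```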

What is a good resolution, framerate, and bitrate?

To achieve good visual quality, a framerate of at least 25 or 30 frames per second is recommended. HD video at a bitrate of 3-5 Mbps, or FullHD at 10-20 Mbps, usually looks good in many use cases.
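Those ballpark figures can be written down as a simple lookup; the ranges below just restate the recommendations above and are a starting point, not a universal rule:

```python
# Rough starting-point bitrate ranges in Mb/s, restating the recommendations above.
SUGGESTED_BITRATE_MBPS = {
    "HD (1280 x 720)": (3, 5),
    "FullHD (1920 x 1080)": (10, 20),
}

for label, (low, high) in SUGGESTED_BITRATE_MBPS.items():
    print(f"{label}: {low}-{high} Mb/s")
```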

But the actual answer depends on many factors. Choosing the appropriate bitrate and resolution involves striking a balance between video quality and data size. Too low a bitrate can lead to noticeable compression artifacts and loss of detail, especially in scenes with high motion or complexity, or when watching on a big screen. Conversely, an excessively high bitrate results in unnecessarily large files that are impractical to store or stream.

Modern video codecs, such as H.264 and H.265 (also known as MPEG-4 AVC and HEVC, respectively), employ advanced compression algorithms to optimize bitrate efficiency while maintaining quality. These codecs use techniques like predictive coding, motion compensation, and entropy encoding to reduce redundancy in the video data, allowing for lower bitrates without significant loss of quality.
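To get a feel for how much work a codec does, compare the raw (uncompressed) data rate of a video with a typical compressed bitrate. A back-of-the-envelope sketch, assuming 3 bytes per pixel (8-bit RGB), which is a simplification of how cameras actually store pixels:

```python
# Compare the uncompressed data rate of FullHD 30 fps video with a typical compressed bitrate.
width, height, fps = 1920, 1080, 30
bytes_per_pixel = 3                  # simplistic 8-bit RGB assumption

raw_mbps = width * height * bytes_per_pixel * 8 * fps / 1_000_000
compressed_mbps = 15                 # a typical FullHD bitrate (illustrative)

print(f"Uncompressed: {raw_mbps:.0f} Mb/s, compressed: {compressed_mbps} Mb/s, "
      f"roughly {raw_mbps / compressed_mbps:.0f}x reduction")
```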