After Effects tutorial: Understanding video formats in After Effects

What you’ll learn in this After Effects Tutorial:

  • Understanding Video Formats

This tutorial provides you with a foundation for working with Adobe After Effects video formats. It is the first lesson in the Adobe After Effects CS6 Digital Classroom book. For more Adobe After Effects training options, visit AGI’s After Effects Classes.


Some video formats are common for professional video production, while others are suitable only for broadband or small-screen purposes. There are two main standards used for broadcast television, a handful of competing standards for desktop and web video, and a series of device-specific standards used in mobile handheld devices. Technical standards, such as the ones touched upon here, are very complex, and a full description of each one is beyond the scope of this book. In general, regardless of the platform for which you are creating video content, there are three main properties to keep in mind:

Dimensions: This property specifies the pixel dimensions of a video file—the number of pixels horizontally and vertically that make up an image or video frame. This value is usually written as a pair of numbers separated by an x, where the first number is the horizontal value and the second is the vertical value, such as 720×480. The term pixel is a combination of the words picture and element; a pixel is the smallest individual component of a digital image. Whether you are dealing with a still image or working with video frames makes no difference; everything displayed on-screen is made up of pixels. The dimensions of a video or still image file determine its aspect ratio; that is, the proportion of an image’s horizontal units to its vertical ones, usually written as horizontal units:vertical units. The two most common aspect ratios seen in current video displays are 4:3 and 16:9.

Frame rate: This property specifies the number of individual images that make up each second of video. Frame rate is measured in fps, an abbreviation of frames per second.

Pixel aspect ratio: This property specifies the shape of the pixels that make up an image. Pixels are the smallest part of a digital image, and different display devices such as televisions and computer monitors have pixels with different horizontal and vertical proportions.
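As a quick illustration of the three properties above (a sketch, not part of the book's lesson): the aspect ratio implied by a set of pixel dimensions can be reduced with a greatest common divisor, and the displayed aspect ratio follows from multiplying the stored width by the pixel aspect ratio. The PAR values below are hypothetical round numbers chosen for clarity, not the exact values used by any broadcast standard.

```python
from math import gcd

def aspect_ratio(width, height):
    """Reduce pixel dimensions to the simplest horizontal:vertical ratio."""
    d = gcd(width, height)
    return f"{width // d}:{height // d}"

def display_aspect(width, height, par):
    """Displayed aspect ratio = (stored width * pixel aspect ratio) / height."""
    return (width * par) / height

print(aspect_ratio(1280, 720))          # 16:9 -- square-pixel HD
print(aspect_ratio(720, 480))           # 3:2  -- NTSC storage dimensions
# The stored 720x480 frame does not look 3:2 on screen, because its pixels
# are not square. With an illustrative narrow PAR of 0.9 the displayed ratio
# is 720 * 0.9 / 480 = 1.35, close to 4:3 (~1.333); with an illustrative wide
# PAR of 1.2 it is 864 / 480 = 1.8, close to 16:9 (~1.778).
print(display_aspect(720, 480, 0.9))    # 1.35
print(display_aspect(720, 480, 1.2))    # 1.8
```

This is why the same 720×480 frame can appear as either a standard or a widescreen picture: the stored dimensions stay fixed while the pixel aspect ratio changes.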

When producing graphics for broadcast television, you have to conform to a specific set of formats and standards. For example, you need to know whether your graphics will be displayed on high-definition screens (1080i, 1080p, 720p), standard-definition screens, or mobile devices, because this affects the size at which you must create your graphics. Similarly, you need to know whether you’re in a region that broadcasts using the ATSC (often still called NTSC) or PAL standards, as this affects the size at which you can create your graphics as well as the frame rate and pixel aspect ratio you will need to use. If you are producing animation or video for the Web, you’ll need to know the format that the distributing site will be using: Flash, Silverlight, H.264, or another format, since certain video effects don’t work well when exported to certain formats.


In the United States, the ATSC, or Advanced Television Systems Committee, has issued a set of standards for the transmission of digital television. These standards have replaced the older, analog NTSC (National Television Standards Committee) formats. The standards embraced by the ATSC include standard-definition and high-definition display resolutions, aspect ratios, and frame rates. All broadcast video and graphics must conform to one of the ATSC standards. Information on the various ATSC standards is available on the ATSC website.

High-definition television

While high-definition (HD) television technology has existed for decades, it wasn’t until the beginning of the 21st century that it came to the attention of the average American television viewer. The term HD describes video that has a higher resolution than traditional television systems, which are called SD, or standard definition. There are two main high-definition standards for broadcast television—720p and 1080i—while many televisions, gaming consoles (PlayStation 3, Xbox 360, and more), and Blu-ray disc players can support a third, 1080p. The letters p and i refer to whether the format uses a progressive or an interlaced display method. Interlacing divides each frame of video into two separate fields; when combined, these two fields form a single video frame that shows a single image. Progressive display forgoes fields and treats each individual frame as its own unique image. In general, progressive displays are clearer and better defined, while interlaced displays require less broadcast bandwidth to transmit to the viewer. Most modern video cameras allow the user to choose whether to record in a progressive or an interlaced format.
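The field mechanism described above can be sketched in a few lines of Python. This is an illustration only (not from the book), assuming a frame is represented simply as a list of scan lines:

```python
def split_fields(frame):
    """Split one frame (a list of scan lines) into its two interlaced fields.
    The upper field holds the even-numbered lines, the lower field the odd ones."""
    upper = frame[0::2]
    lower = frame[1::2]
    return upper, lower

def weave_fields(upper, lower):
    """Recombine two fields into a single full frame by interleaving their lines."""
    frame = []
    for u, l in zip(upper, lower):
        frame.extend([u, l])
    return frame

frame = [f"line {i}" for i in range(6)]
upper, lower = split_fields(frame)
print(upper)                                  # ['line 0', 'line 2', 'line 4']
print(weave_fields(upper, lower) == frame)    # True -- the woven fields rebuild the frame
```

Each field contains only half the lines of the full frame, which is why interlaced transmission needs less bandwidth than sending every complete frame progressively.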

720p: The 720p format has a resolution of 1280 pixels wide by 720 pixels high and supports a variety of frame rates, from the 24 fps used by film, through the 30 fps that was part of the old NTSC standard, all the way up to 60 fps.

1080p and 1080i: The 1080 formats come in both progressive and interlaced versions and, like other modern digital standards, they support a variety of frame rates between 24 fps and 30 fps.


You will learn more about the differences between progressive display and interlacing later in this lesson.


Standard-definition television

Prior to the advent of high definition, there was only one broadcast standard in the United States, NTSC, which includes settings for both 4:3 and 16:9 aspect ratios. While technically it has been replaced by the ATSC standards, the term NTSC is still used by most video cameras and by editing and graphics applications when referring to standard-definition, broadcast-quality video.

NTSC and NTSC widescreen: Graphics applications designed to produce content for broadcast, such as Adobe After Effects, Adobe Photoshop, and Adobe Illustrator, include pre-built settings, called presets, for creating video projects that correspond to the most commonly used broadcast standards. The NTSC presets include settings for both a standard (4:3) and a widescreen (16:9) aspect ratio. They use the same dimensions, 720 × 480, but different pixel aspect ratios, and this is what accounts for the difference in shape. Devices that comply with the NTSC standard use a frame rate of 29.97 frames per second.
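As an aside (a sketch, not part of the book's lesson), NTSC's 29.97 fps is exactly the fraction 30000/1001; working with the exact fraction rather than the rounded decimal avoids drift when converting long durations to frame counts:

```python
from fractions import Fraction

# NTSC's "29.97 fps" is exactly 30000/1001 frames per second.
NTSC_FPS = Fraction(30000, 1001)

def frames_in(seconds):
    """Number of whole frames in a given duration at the NTSC frame rate."""
    return int(NTSC_FPS * seconds)

print(frames_in(1))      # 29     -- just under 30 frames in one second
print(frames_in(60))     # 1798   -- about 2 frames short of 60 x 30
print(frames_in(3600))   # 107892 -- an hour falls 108 frames short of 108000
```

This small shortfall relative to an even 30 fps is the reason drop-frame timecode exists for NTSC material.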


PAL, or Phase Alternating Line, is the standard for broadcast television used throughout Europe and much of the rest of the world outside of North America. PAL differs from NTSC in several key ways, including dimensions and frame rate. It uses a frame rate of 25 fps, which is closer to the 24 fps used in film and, like NTSC, it has both a standard and widescreen setting.

PAL and PAL widescreen: In applications such as After Effects, the PAL presets include both a standard (4:3) and a widescreen (16:9) aspect ratio. Much like their NTSC equivalents, they use the same pixel dimensions, in this case 720 × 576, but each has a different pixel aspect ratio.

Web and mobile device video: There is no single standard for video on the Web or on mobile devices, though only a handful of audio/video formats compete: QuickTime, Windows Media Video, Flash Video, Silverlight, and H.264 are the main ones. The QuickTime format is controlled by Apple Inc. and for years was the de facto standard for web-delivered video. The freely available QuickTime Player is compatible with both Windows and Mac OS and is used to view QuickTime Movie (.MOV) and other video file formats. QuickTime video is also supported on some mobile devices, most notably Apple’s iPhones, iPods, and iPads.

Windows Media Video, often simply called WMV, is Microsoft’s standard video format. A variation of WMV is used for Silverlight video, which is widely used by many professional media organizations, including NBC Sports for its live Olympics coverage and Netflix for streaming video. Windows Media is also a supported format on some multimedia players and mobile devices, such as Windows phones.

Flash video is the native video format of the Adobe Flash platform, and as such it is used to distribute much of the world’s online video content. While the Flash player is widely installed on the desktop computers of Internet users, it is not as common on mobile devices. In fact, Adobe discontinued development of its mobile Flash Player and recommends HTML5 technology for mobile devices. Even before this, some organizations, companies, and other online content creators had begun to move their rich-media content away from Flash and onto other platforms, such as HTML5. In recent years, the dominance of Flash video has been challenged by the natively browser-supported HTML5 video formats H.264 and Ogg Theora. H.264 is a video compression standard derived from the MPEG-4 standard, with a patent pool administered by MPEG LA, while Ogg Theora is an open-source alternative. Variations of H.264 are supported by mobile devices such as the Apple iPod, Sony PSP, and Microsoft Zune, by many mobile phones, by some HTML5-compliant browsers, and by third-party playback applications such as QuickTime Player, Flash Player, and VLC Media Player.

Continue to the next After Effects Tutorial: Understanding frame rate and resolution in After Effects >