If you view video on your computer or listen to music you downloaded from the Internet, you probably use MPEG technology.
In contrast to data files, audio and video files tend to be very large and intolerant of any delay or latency in delivery. Each packet must be received, decompressed and delivered to the user in precisely the order it was sent and at just the right time. Any dropped packets or mistimed delivery can turn the message into gibberish.
MPEG algorithms compress the data into small packets that can be transmitted easily, then decompressed quickly and accurately for high-fidelity reconstruction. MPEG standards aim for a compression ratio of about 52:1, reducing, for example, 7.7MB of raw video data to less than 150KB.
In the early days of MPEG, having enough power to perform these compressions and decompressions was a problem. A PC needs sufficient processor speed (about 400 MHz), internal memory and hard-disk space. At 30 frames per second (fps), digital video requires 235MB of disk space per minute of play.
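For illustration, the arithmetic behind those figures can be sketched in a few lines of Python. The frame size, color depth and frame rate below are assumptions for the sake of the example (24-bit color at MPEG-1's 352-by-240 frame size, 30 fps), not part of any MPEG specification:

```python
# Rough arithmetic for MPEG-1-era video (assumed: 24-bit color,
# 352 x 240 pixels, 30 frames per second).
WIDTH, HEIGHT = 352, 240
BYTES_PER_PIXEL = 3          # 24-bit color
FPS = 30

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL   # one uncompressed frame
raw_rate = frame_bytes * FPS                     # uncompressed bytes per second
target_rate = 150 * 1024                         # MPEG-1 delivery rate: 150KB/sec.

ratio = raw_rate / target_rate
print(f"uncompressed: {raw_rate / 2**20:.1f}MB/sec")   # about 7.3MB/sec
print(f"needed ratio: {ratio:.0f}:1")                  # about 50:1
```

The result, roughly 50:1, lines up with the approximately 52:1 ratio the standards aim for; the small gap comes from rounding in the assumed figures.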
Previously, PCs needed pricey hardware coprocessors, or coder/decoders (codecs) to handle the heavy processing load of MPEG files. Today's desktop machines use software-only codecs, such as free products RealPlayer from Seattle-based RealNetworks Inc., Windows Media Player from Microsoft Corp. and QuickTime from Apple Computer Inc. to play the files.
The Moving Picture Experts Group (MPEG), a working group established in 1988 under the International Organization for Standardization and the International Electrotechnical Commission, developed MPEG-1 and finalized it as an international standard in the early 1990s. A separate body, the MPEG Licensing Administrator (MPEG LA), made up of nine companies and a university, was later formed to license the patents essential to the standards.
Designed for coding progressive video and developed primarily for video stored on CD-ROM, MPEG-1 delivered near-VHS-quality video at a data rate of 150KB/sec., the transfer rate of a single-speed CD-ROM drive. MPEG-1's video standard was based on the standard image format of 352 by 240 pixels at 30 fps.
MP3 (MPEG-1 Audio Layer 3) codecs can produce CD-quality audio at compression factors of up to 12:1. But typical MP3s have a 25:1 compression ratio and lose a substantial amount of data, says Louis Latham, an analyst at Gartner Inc. in Stamford, Conn.
But even at the lower compression ratio, a five-minute audio file that would normally take 50MB of space on your hard drive typically uses only 5MB of space, with little perceptible loss in sound quality.
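Those audio figures can be checked the same way. CD audio is 16-bit stereo sampled at 44,100 times per second, so five minutes works out to roughly 50MB uncompressed; at an assumed 10:1 ratio, an MP3 of the same track lands near 5MB:

```python
# Uncompressed size of five minutes of CD audio (16-bit stereo, 44.1kHz),
# and its approximate size as an MP3 at an assumed 10:1 ratio.
SAMPLE_RATE = 44_100     # samples per second
CHANNELS = 2             # stereo
BYTES_PER_SAMPLE = 2     # 16-bit audio
SECONDS = 5 * 60

raw_bytes = SAMPLE_RATE * CHANNELS * BYTES_PER_SAMPLE * SECONDS
mp3_bytes = raw_bytes / 10          # assumed 10:1 compression

print(f"uncompressed: {raw_bytes / 2**20:.1f}MB")   # about 50.5MB
print(f"as MP3:       {mp3_bytes / 2**20:.1f}MB")   # about 5.0MB
```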
MPEG-2, developed in 1994 for coding interlaced images, was conceived as a broadcast standard: 720 by 480 pixels at 60 fields per second, at data rates up to 2MB/sec.
In an interlaced image, like one on a conventional TV, every other scan line - one field, or half the picture - is drawn in a single pass, at a rate of 60 passes per second. The remaining lines, the other field, are drawn in the next pass, 1/60 of a second later. The two fields alternate continuously, producing an even data stream and images that the human eye perceives as smooth motion.
Computer displays, in contrast, are noninterlaced, so jagged edges appear where the two fields of an interlaced image meet. To produce smooth video on a computer, both sets of interlaced fields are captured, and an MPEG-2 codec smooths the edges where the two meet. The crisper look of digital TV and DVD is the result of an MPEG-2 codec.
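The field-weaving step described above can be illustrated with plain Python lists. This is a hypothetical sketch, not real codec code: each "scan line" is just a string, and the two fields are interleaved to rebuild a full progressive frame.

```python
# Weave two interlaced fields back into one progressive frame.
# A real codec works on rows of pixels; strings stand in for them here.
def weave(top_field, bottom_field):
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)      # even scan lines (0, 2, 4, ...)
        frame.append(bottom_line)   # odd scan lines  (1, 3, 5, ...)
    return frame

top = ["line 0", "line 2", "line 4"]       # drawn in one pass
bottom = ["line 1", "line 3", "line 5"]    # drawn 1/60 of a second later

print(weave(top, bottom))
# ['line 0', 'line 1', 'line 2', 'line 3', 'line 4', 'line 5']
```

Smoothing the jagged edges where the fields disagree (deinterlacing proper) is the harder part of the codec's job; this sketch shows only the interleaving.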
Approved in 1998, with a second version following in 1999, MPEG-4 is intended for very narrow bandwidths; it applies speech and video synthesis, fractal geometry, computer visualization and artificial intelligence to reconstruct images accurately from minimal data. MPEG-4 (whose common file format is known as MP4) offers more efficient, though still lossy, video and audio compression that lets you store movies in about 15% of the space required by standard DVDs.
The ease with which a 90-minute movie can be copied onto a CD using MPEG-4 prompted moviemakers, fearing a Napster-like furor, to petition Congress for copyright protection - now standard - on DVDs to prevent such copying.
But it's the standard's scalability that's of greater importance, Latham says. The MPEG-4 codec can break video into pieces small enough, and transmit them quickly enough, that video can run over 9,600 bit/sec. mobile networks.
MPEG-7, formally the Multimedia Content Description Interface, is a standard for describing multimedia content data so that it can be searched, filtered and managed.
Built on previous MPEG standards, MPEG-21 is a framework for creating and delivering multimedia content. Work on the standard began in June 2000. Key elements include digital item declaration; identification; content handling; use and representation; intellectual property management and protection; terminals and networks; and event reporting.