HDTV (High-definition television)
by Chris Woodford. Last updated: September 22, 2016.
Gaze at the world around you and (if your eyesight is good) you'll notice how virtually everything you see looks pin-sharp and crystal clear. But if you're looking at a representation of the world, such as a digital photograph or an image on a TV or computer screen, that's seldom the case. Look closely at a standard TV picture from a few inches away and you can see the thousands of colored dots called pixels from which it's made—you can probably see it flickering too. That's why many people are getting behind the latest kind of television, known as high-definition television (HDTV for short). HDTV essentially means the picture is much more detailed, a bit wider, and it doesn't flicker, even when it's shown on really big screens. Let's take a closer look and find out more!
Photo: Some LCD televisions, like this one, are "HD-ready": you won't have to replace them when HDTV becomes more widely available. Please note that the flower images used throughout this article are not generated from actual TV pictures: they're mockups designed to show how one kind of picture compares with another.
Eyes, TVs, and digital cameras
Our eyes contain 130 million light-detecting cells called rods and cones so, in the language of digital cameras, our vision is effectively 130 megapixels. Put that another way and it means the images created on our retinas are about 10 times higher definition or resolution (more detailed) than the images created by the very best, professional quality digital cameras. Televisions and computer monitors aren't anything like as good as the human eye. A typical old-style, analog television (technically known as SDTV for standard definition television) has a picture made up from about 700 pixels across by about 500 pixels down (actually 704 by 480, written "704 x 480" for short), which is a little over a third of a million pixels (0.33 megapixels). If you have a 3 megapixel digital camera, it takes pictures that are 10 times better quality than the ones your old TV can display.
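The pixel arithmetic above is easy to check with a few lines of Python (the figures, such as the 3-megapixel camera, are the article's own examples, not measurements):

```python
# Pixel counts from the article's examples.
sdtv_pixels = 704 * 480              # standard-definition (SDTV) picture
print(sdtv_pixels)                   # 337920 — a little over a third of a megapixel

camera_pixels = 3 * 1_000_000        # a modest 3-megapixel digital camera
print(camera_pixels / sdtv_pixels)   # ≈ 8.9 — roughly 10 times more detail
```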
Photo: The pictures a typical digital camera takes are at least 10 times more detailed than those on a typical, old-style, SDTV. Compare the left picture with the picture on the right, which is 10 times fuzzier, to get an idea how much difference 10 times more detail can make.
Why can't TV be more like a digital camera? It can be! In fact, it already is. The computer screen you're looking at now most probably has a resolution of something like 1024 x 768 (technically known as XGA) or 1280 x 1024 (SXGA), both of which give a level of detail similar to what you'd get on a one megapixel digital camera. (Older computers only managed about 800 x 600 or even 640 x 480, which wasn't much better than a standard analog TV. At those low resolutions, when you're sitting only a couple of feet from the screen, individual pixels stare out at you like bricks in a wall!)
In HDTV, the resolution is typically 1920 x 1080 or 1280 x 720, which is similar to what you get with a computer monitor. Multiply up the numbers and you can see that, compared to a 704 x 480 SDTV screen, you get roughly six times more pixels in a 1920 x 1080 screen and roughly three times more in a 1280 x 720 screen. Either way, you're getting a much more detailed picture.
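"Multiplying up the numbers" as the paragraph suggests can be sketched in Python to confirm the rough six-times and three-times figures:

```python
sdtv = 704 * 480       # SDTV: 337,920 pixels
full_hd = 1920 * 1080  # 1080-line HDTV: 2,073,600 pixels
hd_720 = 1280 * 720    # 720-line HDTV: 921,600 pixels

print(full_hd / sdtv)  # ≈ 6.1 — "roughly six times more pixels"
print(hd_720 / sdtv)   # ≈ 2.7 — "roughly three times more"
```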
Photo: HDTV (left) gives about six times more pixels than SDTV (right). Note that we're not showing the correct HDTV or SDTV aspect ratios (screen shapes and dimensions) here; we'll come to that in a moment.
More pixels means you can show an HDTV picture on a much larger screen without it looking fuzzy. This has always been a problem with SDTV: it looks fine on tiny sets but bigger sets use exactly the same picture signal, so the bigger you make your screen, the more area each pixel in the signal has to cover and the fuzzier it looks. It's a source of great irritation to TV sales people that state-of-the-art LCD TVs in their stores often look as though they have fuzzy pictures, even though they have screens capable of showing pin-sharp images. The reason is simply that the SDTV signal they're usually displaying is far too crude to take advantage of a modern set's capabilities. To make a crude 704 x 480 picture display on a 1920 x 1080 screen, you have to scale it up by a process called interpolation so that each pixel in the signal occupies roughly six pixels on the screen. Or, to put it another way, your TV is showing only about a sixth of the detail that it can. Unlike SDTV, HDTV takes full advantage of big, high-resolution screens.
Photo: If you start with a small, sharp TV image and try to scale it up so it works on a bigger screen, you end up with a much fuzzier image. That's because each pixel in the small image has to be copied several times over ("interpolated") to cover a larger area. In other words, you can't invent detail that isn't there.
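A toy sketch of the simplest kind of interpolation (nearest-neighbour pixel replication, not necessarily what any particular TV uses) makes the point concrete: scaling up only copies pixels, it can't create new detail.

```python
def upscale(image, factor):
    """Nearest-neighbour upscaling: copy each pixel factor x factor times.
    The bigger picture holds no new detail — just bigger 'bricks'."""
    out = []
    for row in image:
        # Stretch the row horizontally by repeating each pixel.
        stretched = [pix for pix in row for _ in range(factor)]
        # Then repeat the stretched row vertically.
        out.extend(list(stretched) for _ in range(factor))
    return out

small = [[1, 2],
         [3, 4]]
print(upscale(small, 2))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

Every value in the big picture is just a copy of one of the four original pixels.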
There's another big difference between SDTV and HDTV. If you look at new televisions in a store, you can see straight away that the screens are much more rectangular than old-style TVs. You can see that in the numbers as well. An old TV with a 704 x 480 picture has a screen about 1.5 times wider than it is tall (just divide 704 by 480). But for a new HDTV with a 1920 x 1080 screen, the ratio works out at 1.78 (or 16:9), which is much more like a movie screen. That's no accident: the 16:9 ratio was chosen specifically so people could watch movies properly on their TVs. (If you try to watch a widescreen movie on an SDTV screen, you either get part of the picture sliced off as it's zoomed in to fill your squarer screen or you have to suffer a smaller picture with black bars at the top and bottom to preserve the wider picture—like watching a movie through a letterbox.) The relationship between the width and the height of a TV picture is called the aspect ratio; in short, HDTV has a wider aspect ratio than SDTV.
Photo: HDTV (left) gives a more rectangular picture than SDTV (right).
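The aspect-ratio division described above is simple enough to verify directly (using the article's pixel dimensions):

```python
print(704 / 480)    # ≈ 1.47 — old SDTV screen shape, "about 1.5"
print(1920 / 1080)  # ≈ 1.78 — HDTV screen shape
print(16 / 9)       # ≈ 1.78 — the same value, confirming 16:9
```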
Movies and TV pictures rely on an optical illusion called the persistence of vision: they show our eyes a series of still images at high speed and our brains blend them into a single moving picture (read more about this in our article on movie cameras). The trouble is, if a TV doesn't change ("refresh") the images quickly enough, you can see one image being replaced by another. If you have an old-style SDTV television, you can sometimes see the picture flicker when you watch it through the corner of your eye or read a book with a TV on in the background. That's because our eyes are most sensitive to changes in movement in the edges of the retina, where the motion-sensitive rods are concentrated (humans evolved that way so we could see threatening predators sneaking up on us).
Old style SDTV is based on a technology called a cathode-ray tube (or "tube" for short), in which the TV picture is made by "scanning" electron beams across a phosphor-coated screen (read our main article on television to find out how it's done). That takes a certain amount of time, even when you're using something as zippy as electrons (tiny charged particles inside atoms). If TVs scanned all the pixels in a picture in turn, you'd almost be able to see the picture appearing in front of you. You'd definitely see it flickering. So, to reduce this problem, old-style TVs use a technique called interlacing: they scan all the odd-numbered lines on the screen and then, a short time later, they go back and scan the even-numbered lines—the lines in between. Each pass takes one sixtieth of a second so it takes twice this long (a thirtieth of a second) to completely refresh a TV picture. One refresh of the picture is called a frame, so you get about 30 frames per second with SDTV. That's just quick enough for our eyes to avoid seeing flicker.
Like computer monitors, HDTVs build up their pictures on LCD screens, in which the pixels can be switched on and off much more quickly under electronic control (see our article on LCD TV for more on how this is done). In an HDTV, it's possible to change an entire picture 60 times per second—twice as fast as with SDTV—even though there are more pixels. Generally, you don't need to use interlacing with HDTV; the pixels can all be refreshed in turn from the top left to the bottom right—which is known as progressive scanning.
Photo: With old-style interlaced scanning (left), the red lines are scanned one after another from the top down. Then the blue lines are scanned in between the red lines. This helps to stop flicker. With progressive scanning (right), all the lines are scanned in order from the top to the bottom. HDTV generally uses progressive scanning, though (like SDTV) it can use interlacing at higher frame rates.
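The two scanning schemes described above can be sketched as line orderings, along with the article's timing arithmetic (this is just an illustration of the scan order, not how a real TV is programmed):

```python
def interlaced_order(num_lines):
    """Interlaced scan: all the odd-numbered lines first, then the evens."""
    odds = list(range(1, num_lines + 1, 2))
    evens = list(range(2, num_lines + 1, 2))
    return odds + evens

def progressive_order(num_lines):
    """Progressive scan: every line in turn, from top to bottom."""
    return list(range(1, num_lines + 1))

print(interlaced_order(6))   # [1, 3, 5, 2, 4, 6]
print(progressive_order(6))  # [1, 2, 3, 4, 5, 6]

# Timing from the article: each interlaced pass takes 1/60 s, so a whole
# frame (two passes) takes 1/30 s — about 30 frames per second.
print(round(1 / (2 * (1 / 60))))  # 30
```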
Is HDTV really HD?
There's one important caveat to all this: even if you have an HDTV, you only get an HD picture if you have a full-resolution, HD-quality signal feeding into your box to begin with. If you're streaming a movie to an HDTV, you will see a much poorer quality picture because the original picture has been highly compressed so it can be stored and transmitted efficiently over the Internet. Cable and satellite images may also be of variable quality. DVDs also use lower-quality pictures than HDTV is capable of, though Blu-ray will typically offer very high definition.
What do you need to hook up HDTV?
You can't get HDTV with an old-style analog set, because analog TVs work in a totally different way (receiving analog signals, using a different aspect ratio, using interlacing, and so on). But nor can you get HDTV with an ordinary digital television, no matter how new it is. You'll either need a digital television with a separate HDTV decoder or an integrated digital TV with a built-in HDTV decoder. Many newer digital televisions are sold ready to take advantage of HDTV as it becomes more widely available. You'll see them marked as "HDTV-ready" or "HD-ready". That's what you need to look out for if you're buying a new digital TV today.