High definition (HD) has become more and more desirable among both TV watchers and computer video fans. While display quality didn't matter much when you were watching amateur YouTube shorts, the advent of commercial streaming video has raised the demand for high-quality displays.
Until recently, the received wisdom about HD was that bigger was always better. What good is having HD unless your display is at least, say, 52 inches?
These days, though, many people can't afford, don't have space for, or just don't want to bother with a display that large. But that doesn't necessarily mean you can't have HD. Currently, you can buy displays no bigger than 21.5 inches that are capable of displaying true HD video at full 1080p resolution.
But are they really as good as their big brothers? Is the HD offered by smaller, cheaper screens anywhere near as good as that provided by bigger ones?
To test whether smaller screens are as spiffy as larger ones, I collected and tested six differently sized HD monitors from a variety of vendors: NEC M46-AV (46 inches), HannsG HG281DPB (27.5 inches), Samsung P2370 (23 inches), Hewlett-Packard w2338h (23 inches), Dell SX2210 (21.5 inches) and Lenovo L215p Wide (21.5 inches).
Defining high definition
When it comes to HD, the resolutions that you'll most often find are 720p, 1080p and 1080i. The numbers are familiar to most of us: It's the measure of the horizontal scan lines that make up the display resolution. But what are those letters about?
The "p" stands for progressive scan, while the "i" stands for interlaced video. Interlaced video presents an image in alternating sets of lines: first the odd-numbered lines are "painted" across the screen, then the even-numbered lines, and the process repeats. One set of lines, whether odd or even, is called a field, and two consecutive fields -- odd plus even -- make up a frame. Progressive video, by contrast, draws every line of each frame in a single pass.
In general, a progressive scan image is better than an interlaced image because it is smoother and displays motion with less blur. Currently, HD television broadcast at 1080 lines is interlaced (1080i). Blu-ray is often touted as the superior technology because it uses progressive scanning.
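The field-and-frame relationship described above can be sketched in a few lines of code. This is a minimal illustration, not real video-processing code: the fields here are hypothetical lists of labeled scan lines standing in for rows of pixels, and the `weave` function name is my own invention.

```python
def weave(odd_field, even_field):
    """Interleave two interlaced fields into one full frame.

    Following the order in the text: the odd-numbered lines are painted
    first, then the even-numbered lines fill in the gaps between them.
    """
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.append(odd_line)   # line 1, 3, 5, ... of the frame
        frame.append(even_line)  # line 2, 4, 6, ... of the frame
    return frame

# Two consecutive 1080i fields of 540 lines each combine into one
# 1080-line frame; a progressive (1080p) source would simply deliver
# all 1,080 lines of every frame in one pass.
odd = [f"odd-{i}" for i in range(1, 1080, 2)]
even = [f"even-{i}" for i in range(2, 1081, 2)]
full_frame = weave(odd, even)
print(len(full_frame))  # 1080
```

The catch with interlacing is visible in this model: the two fields are captured a fraction of a second apart, so anything that moves between them produces the combing artifacts that deinterlacing circuitry in a TV or monitor has to smooth over.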
How we tested
I started with DisplayMate's series of benchmarks specifically designed for LCD monitors, which I ran by connecting each display to a home-built computer via its HDMI port (except for the Samsung display, which didn't have HDMI). These tests cover blooming, color purity, stuck pixels, text readability, backlight bleed and more. DisplayMate has also recently added a motion blur test aimed at weeding out displays that might be unsuited for fast-action gaming or video.
To test the HD aspect of these monitors, I connected a Samsung BD-P1500 Blu-ray player directly to each display via its HDMI port (again, except for the Samsung display) and ran two special-effects-laden films: The Dark Knight and Serenity. These two movies were more than adequate to show any problems that might exist with white and grayscale reproduction, as well as possible motion blur effects.
I also ran a copy of an episode of the television show Chuck broadcast in 1080i and then edited down (without commercials) into 1080, 720 and 480 (standard definition) clips.