Question:
Technically, what defines "HD"?
pandemonium
2009-07-17 00:56:08 UTC
I've done a tremendous amount of research on this myself, and the only answer I can come up with is that it's higher resolution. I've come to understand it as nothing more than a marketing scheme to get people to buy a higher quality display.

I've always thought that computer monitors and graphics cards were "HD" long before the invention of the phrase, since they could go beyond 640x480 (the original display resolution of projection TVs).

Can anyone prove otherwise by supplying factual data?
Five answers:
lare
2009-07-17 14:41:05 UTC
The specification for digital television in the United States is ATSC. It specs out a number of video formats that can be broadcast. Any ATSC format with 720 or 1080 lines is HDTV. The ABC and Fox networks use the 720p format for their HDTV programs, mostly because it is better for fast action sports. NBC and CBS use 1080i. One can watch these HDTV programs on an old-fashioned analog TV set with the proper converter box. It seems that people actually prefer the HDTV programs in HD, so in that sense it is a marketing scheme to sell new TV sets.
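
To put those two formats in perspective, here is a quick pixel-count comparison in Python (my own illustration; the frame sizes are the standard ATSC ones, the rest is simple arithmetic):

# Pixel counts of the two ATSC HD frame sizes (illustrative arithmetic only).
formats = {
    "720p (1280x720, progressive)": (1280, 720),
    "1080i (1920x1080, interlaced)": (1920, 1080),
}
for name, (w, h) in formats.items():
    print(f"{name}: {w * h:,} pixels per frame")
# 720p redraws all 720 lines each pass, while 1080i draws alternating
# 540-line fields, which is why 720p is favored for fast action sports.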



The manufacturers of video displays decided in 1995, with the advent of DTV, that to be advertised as an HD display, a set had to be a minimum of 1 meter diagonal with either 720 or 1080 lines of native resolution and a 16:9 aspect ratio. It is only recently that any computer displays capable of meeting this requirement have become available. Basically, computers waited until they could adapt television display technology, not the other way around. Certainly the VGA, XGA, and SVGA display "standards" are nowhere near HD.
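
That three-part test is easy to state as code. A minimal sketch, assuming the criteria exactly as claimed above (the function name and sample figures are mine, not from any standard):

# Sketch of the HD-display test described above; the criteria are as
# claimed in the answer, and the example values are hypothetical.
def is_hd_display(native_lines, aspect_w, aspect_h, diagonal_m):
    return (native_lines in (720, 1080)
            and (aspect_w, aspect_h) == (16, 9)
            and diagonal_m >= 1.0)

print(is_hd_display(1080, 16, 9, 1.07))  # 42-inch 1080p panel -> True
print(is_hd_display(600, 4, 3, 0.38))    # 15-inch SVGA monitor -> False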



As for comparing analog television standards to a VGA display, VGA falls far short of meeting the test. First, NTSC television has 483 vertical lines of active picture content, not 480. In fact, at one time the FCC was going to fine television stations that used 480-line computer-generated graphics or effects. The standard for digitally generated video for analog broadcast television (D-1) sets the pixels per line at 720, not 640. VGA is neither the digital equivalent of analog television nor its resolution equivalent. It was not until ATSC relaxed the digital specs that VGA (640x480 pixels) became standard definition.
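
Putting those quoted figures side by side makes the mismatch plain (a worked restatement of the numbers above, nothing new added):

# The figures quoted above, side by side.
print("NTSC analog, active picture lines: 483")
print("D-1 digital sampling, pixels per line: 720")
print("VGA computer display: 640 x 480")
# VGA matches neither the 720-pixel line sampling nor the 483 active
# lines, which is the point of the comparison above.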



The original display resolution of projection TVs with analog input was about 600 to 1000 lines. It was not until later, when projecting computer PowerPoint presentations became popular, that video projection became pixel-based. Most of those projectors were SVGA (800x600 pixels), not widescreen, and certainly not 1920x1080 native pixel resolution.



To say that DTV is a recent arrival following the computer nerds' lead is a farce. At least in the United States, television stations have been broadcasting HD content on DTV transmitters for more than 10 years, at a time when Windows 95 was not even Y2K compliant, let alone widescreen.
classicsat
2009-07-17 06:32:07 UTC
In the context of TV, HD is 720p, 1080i, or 1080p.



In short, HD simply means the display or content has a higher resolution than 480i/p (or 576i/p). There is nothing more to it.



Yes, computer monitors (above 640x480) are/were in essence HD.
2009-07-17 01:03:08 UTC
"HD" stands for High Definition.

For example, TVs:

An HD television usually has a bigger screen, better color reproduction (more intense colors), and generally better overall picture quality.



When you have an "HD Ready" TV, that means it can accept and display HD signals (720 lines or more), though it may not include a built-in HD tuner, so it is well suited for use with digital TV receivers and recorders for the best quality.
2009-07-17 00:59:03 UTC
720p or 1080p
malaka
2009-07-17 00:59:18 UTC
There are more pixels on the screen, and they update quickly, at a refresh rate measured in Hz. The more pixels there are, the higher the resolution; the faster they update, the smoother the motion looks.
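
As a rough illustration of how those two numbers combine (my own arithmetic, not part of the original answer):

# Pixels per frame times refresh rate (in Hz) gives pixels drawn per second.
width, height, refresh_hz = 1920, 1080, 60  # a common 1080p display at 60 Hz
pixels_per_frame = width * height             # 2,073,600 pixels
pixels_per_second = pixels_per_frame * refresh_hz
print(f"{pixels_per_second:,} pixels per second")  # 124,416,000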


This content was originally posted on Y! Answers, a Q&A website that shut down in 2021.