Question:
What are HDMI and VGA?
Iceshowers
2008-04-30 10:00:54 UTC
I want to know what HDMI connectors are (or just what HDMI is) and what a video connector (VGA) is, in a normal, non-geeky, no-big-words way. I googled it, but everything described them with a bunch of other tech words I didn't know, and I can't spend a whole day googling each word until I finally understand the one I needed. I just need to know what they are and do, in your own understandable words, please. Thanks

(And I'm curious because of my new laptop, so tie it into that if you can.)
Six answers:
2008-04-30 10:15:48 UTC
The High-Definition Multimedia Interface (HDMI) is a compact audio/video connector interface for transmitting uncompressed digital streams. It represents a digital alternative to consumer analog standards such as Radio Frequency (RF) coaxial cable, composite video, S-Video, SCART, component video, and VGA.



The term Video Graphics Array (VGA) refers specifically to the display hardware first introduced with the IBM PS/2 line of computers in 1987, but through its widespread adoption has also come to mean either an analog computer display standard, the 15-pin D-subminiature VGA connector, or the 640×480 resolution itself.



A VGA connector, as it is commonly known (other names include RGB connector, D-sub 15, mini sub D15, and mini D15), is a three-row, 15-pin DE-15. There are four versions: the original and DDC2 pinouts, the far older and less flexible DE-9 connector, and a Mini-VGA used for laptops.



HDMI supports, on a single cable, any TV or PC video format, including standard, enhanced, and high-definition video, along with up to 8 channels of digital audio.
icuassist
2008-04-30 10:16:13 UTC
I hope this helps!





HDMI (High-Definition Multimedia Interface) is an interface standard used for audiovisual equipment such as high-definition televisions and home theater systems. With 19 wires wrapped in a single cable that resembles a USB wire, HDMI can carry about 5 Gbps (gigabits per second) of bandwidth. This is more than twice the bandwidth needed to transmit multi-channel audio and video, future-proofing HDMI for some time to come. This and several other factors make HDMI much more desirable than its predecessors: component video, S-Video, and composite video.
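To sanity-check that "more than twice the bandwidth" claim, here's a rough back-of-the-envelope calculation (a sketch with illustrative assumptions: uncompressed 1080i video at 24 bits per pixel, 8 channels of 24-bit/96 kHz audio, and HDMI 1.0's roughly 5 Gbps link rate):

```python
# Rough raw bit rate of an uncompressed HDTV stream versus the HDMI link.
# All figures below are illustrative assumptions, not exact spec values.

video_bps = 1920 * 1080 * 30 * 24   # 1080i video, 24 bits/pixel: ~1.49 Gbps
audio_bps = 8 * 24 * 96_000         # 8 channels of 24-bit/96 kHz PCM: ~18 Mbps
total_bps = video_bps + audio_bps

hdmi_bps = 4.95e9                   # HDMI 1.0 link bandwidth, roughly 5 Gbps

print(f"stream needs ~{total_bps / 1e9:.2f} Gbps")
print(f"HDMI carries ~{hdmi_bps / 1e9:.2f} Gbps "
      f"({hdmi_bps / total_bps:.1f}x headroom)")
```

Run it and you get roughly 1.5 Gbps needed against about 5 Gbps available, which is where the headroom comes from.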



HDMI is an uncompressed, all-digital signal, while the aforementioned interfaces are all analog. With an analog interface, a clean digital source is translated into less precise analog, sent to the television, then converted back to a digital signal to display on screen. At each translation, the digital signal loses integrity, resulting in some distortion of picture quality. HDMI preserves the source signal, eliminating analog conversion to deliver the sharpest, richest picture possible.
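Here's a toy simulation of that generational loss (purely illustrative: the noise level is an arbitrary assumption, and real analog links degrade in more complicated ways, but the point is that error grows with each conversion):

```python
import random

# Toy model: digital -> analog -> digital. The "analog" leg picks up a
# little noise, and re-digitizing rounds to the nearest 8-bit level,
# so each extra conversion adds error that can never be undone.
random.seed(1)

def analog_round_trip(samples, noise=1.5):
    out = []
    for s in samples:
        v = s + random.gauss(0, noise)          # noise from the analog leg
        out.append(min(255, max(0, round(v))))  # re-quantize to 8 bits
    return out

signal = list(range(256))        # a simple 8-bit test ramp
once = analog_round_trip(signal)
twice = analog_round_trip(once)  # a second analog hop

avg_err = lambda a, b: sum(abs(x - y) for x, y in zip(a, b)) / len(a)
print("average error after one conversion: ", avg_err(signal, once))
print("average error after two conversions:", avg_err(signal, twice))
```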



The term Video Graphics Array (VGA) refers specifically to the display hardware first introduced with the IBM PS/2 line of computers in 1987, but through its widespread adoption has also come to mean either an analog computer display standard, the 15-pin D-subminiature VGA connector, or the 640×480 resolution itself. While this resolution has been superseded in the computer market, it is becoming a popular resolution on mobile devices.



VGA was the last graphical standard introduced by IBM that the majority of PC clone manufacturers conformed to, making it today (as of 2008) the lowest common denominator that all PC graphics hardware supports before a device-specific driver is loaded into the computer. For example, the Microsoft Windows splash screen appears while the machine is still operating in VGA mode, which is the reason that this screen always appears in reduced resolution and color depth.



VGA was officially superseded by IBM's XGA standard, but in reality it was superseded by numerous slightly different extensions to VGA made by clone manufacturers that came to be known collectively as "Super VGA".
Paul R
2008-04-30 10:12:33 UTC
I don't remember what the letters stand for, but HDMI is a high-def connection specification. The cable must be capable of carrying a high-def signal.



VGA is generally a computer monitor output, so, for instance, to connect most laptops to a TV that has the capability, you would connect from the laptop's monitor out to the TV's VGA in.



Many DVD players now come with HDMI capability because they "upconvert" their signal to a higher quality. It is not true high def, but it is better than 480p and requires an HDMI cable to carry the upconverted signal.



Implicit in this is that you have HDMI and VGA connections on your TV, which is fairly common in the 15-22" range of LCDs.
percival.sweetwater
2008-04-30 10:52:10 UTC
You want to know if you can use the HDMI cable for your TV with your computer as the input. The answer is no.



In simple terms, all color video starts out as RGB (red, green, blue). Each one is an actual video signal all by itself (although in just that one color).



Video is nice. It CAN make a picture. But your monitor or TV doesn't know left from right, or top from bottom. So there also need to be horizontal drive and vertical drive signals. Those are the two signals that synchronize the monitor to the video.
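To make those two sync signals concrete, here's a small sketch using the commonly published timing numbers for the standard 640×480 @ 60 Hz VGA mode (each line and each frame includes invisible "blanking" time around the sync pulses):

```python
# Standard 640x480 @ ~60 Hz VGA timing. The totals are bigger than the
# visible 640x480 because each scanline and frame includes blanking
# intervals around the horizontal and vertical sync pulses.
pixel_clock_hz = 25_175_000   # the classic 25.175 MHz VGA dot clock

h_visible, h_front, h_sync, h_back = 640, 16, 96, 48
v_visible, v_front, v_sync, v_back = 480, 10, 2, 33

h_total = h_visible + h_front + h_sync + h_back   # 800 pixel times per line
v_total = v_visible + v_front + v_sync + v_back   # 525 lines per frame

line_rate = pixel_clock_hz / h_total    # horizontal drive frequency
frame_rate = line_rate / v_total        # vertical drive (refresh) frequency

print(f"horizontal sync: {line_rate / 1e3:.2f} kHz")   # ~31.47 kHz
print(f"vertical sync:   {frame_rate:.2f} Hz")         # ~59.94 Hz
```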



Now that, for the most part, is where VGA stops. Yes, there are SVGA, XGA, UXGA, etc. to describe the various resolutions, but that's unimportant. You're talking about hooking up to your TV, and your TV does NOT do RGB.



I know, you're wondering about those component connectors that are also red, green & blue. We'll get back to that.



Now, those drive signals for sync are combined into one composite sync signal. The video is boiled down to a basic brightness signal called luminance (which is just the brightness & contrast of black & white TV; technically it's a weighted mix of all three colors, leaning heavily on green). The red and blue video are encoded into their own signal, called chrominance (essentially the color portion), which is just the color itself, with no longer any actual video in it. Now, the color portion is combined with the brightness portion, which is then combined with the sync portion, so that they can all travel on one wire, or one channel.
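For the curious, here's a minimal sketch of that encoding using the standard-definition (BT.601) weights:

```python
# Encode an RGB pixel into luminance (Y) plus two chrominance
# (color-difference) signals, using the BT.601 weights.
# R, G, B values are assumed to be in the 0.0-1.0 range.

def rgb_to_ycc(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b  # luminance: mostly green
    pb = 0.564 * (b - y)                   # "blue minus brightness"
    pr = 0.713 * (r - y)                   # "red minus brightness"
    return y, pb, pr

# A pure green pixel: almost all of it lands in the luminance channel.
print(rgb_to_ycc(0.0, 1.0, 0.0))   # -> (0.587, -0.331, -0.419)
```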



In your TV, the sync is separated from the rest and sent to the synchronization circuits. The color portion is separated and sent to the color circuits. Here it gets interesting: red and blue are decoded and sent to their own circuits. Anything remaining HAS to be green, so that's sent to its own circuit.



Meanwhile, the luminance by itself is just B/W, and the red, blue, and green COLOR is applied to the appropriate areas of that B/W video. (The color is color ONLY, in a code; it has no video itself.) So, while the TV station takes each tiny part of a TV signal and combines it into one complex signal, your TV deconstructs it back into its individual parts and displays them on your screen.



Here's the problem. Red video, as it was originally created, is no longer red video. It's just red color in the proper area of a black & white picture. Same for blue. And green, too, really. So no matter how much we try to preserve quality, all that encoding and decoding degrades the image.



Now come VCRs. When recording a VHS tape, the machine never combines the luminance and chrominance; they each go on their own tracks. So why combine them on the way out of the VCR, just so your TV can UNcombine them again? And so S-Video was created. It takes the separate color and B/W information and sends it directly to the proper circuits in the TV without combining and separating it. Color is just a little bit better now. Not a lot, but a little bit.



Then along come DVDs. Lots more room on those critters. So you don't even need to squash red and blue together on one channel. There is a separate red color signal (remember, it's been coded so that the original video is no longer there, just the color coding), and there is a separate blue color signal. Whatever is left, again, HAS to be green, so the third wire simply carries the luminance, and green is worked out from it. And you have component, which uses red, green & blue connectors to tell the three wires apart. The color is now better quality than S-Video, but still not the original full VIDEO colors.
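Here's the decoding side of the same sketch (inverting the BT.601 formulas from the earlier snippet): the set recovers red and blue from their color-difference signals, and whatever is left of the luminance must be green.

```python
# Decode component (Y, Pb, Pr) back to RGB: red and blue come straight
# from their color-difference signals, then the luminance equation is
# solved for the only channel left, green.

def ycc_to_rgb(y, pb, pr):
    b = y + pb / 0.564
    r = y + pr / 0.713
    g = (y - 0.299 * r - 0.114 * b) / 0.587  # "whatever is left is green"
    return r, g, b

# Round trip of the pure green pixel from the encoding sketch:
print(ycc_to_rgb(0.587, -0.331, -0.419))   # -> roughly (0.0, 1.0, 0.0)
```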



And that's what goes into your HDMI cables. So what makes them better, or more expensive, than component cables? The use of a SINGLE cable instead of five for the connection between your DVD player and your TV. It makes less of a spaghetti mess behind your entertainment center or wall mount, and it also reduces the potential for picking up cross-talk, or interaction between these cables and other unrelated cables back there.



But unless your TV is specially designed to be used as an RGB monitor AND a TV, it will never be able to accept VGA, no matter what cables you use. You'll need a VGA-to-HDTV converter box, which contains the electronics to do the conversion.
2008-04-30 10:13:07 UTC
HDMI (High-Definition Multimedia Interface) is basically a format used for high-definition resolution on computer screens and televisions.



VGA (Video Graphics Array) is basically the old format used for a 640 x 480 resolution; resolutions now are higher. Ever visit a website that seems like it has more to it, but there's no scrollbar for you to use? That's this old, small resolution at work. It's old news.



In caveman terms.... HDMI is to VGA as digital television is to basic cable.
?
2016-11-09 15:03:50 UTC
HDMI/DVI produce comparably high quality to most people's eyes. I use a DVI-to-HDMI cable to get access to my television's higher resolutions (1080p). DVI/HDMI will produce a clearer image than VGA, so yes, they look different.


This content was originally posted on Y! Answers, a Q&A website that shut down in 2021.