HDMI trumps DVI?

Self build / N/A
April 14, 2011 at 19:41:31
Specs: Vista Ultimate, SP1, 3.0 GHz, 2 GHz
I have a computer with a video card that has HDMI output. I have an HDMI cable connecting the computer to the monitor. I can see analog AND digital signals just fine. What I mean by analog is the POST messages and the multiboot screen. On another computer I have a video card with VGA and DVI-I outputs and a monitor that has VGA and DVI-D inputs. Connecting a VGA cable works fine and shows the POST/multiboot screens. Connecting a DVI-D cable shows no analog signals at all; all it shows is Windows once it starts loading.

I understand the extra four pins are not engaged when connecting a DVI-D cable to a DVI-I output, so the analog signals will not show. I don't understand why the HDMI connection, which is supposed to be digital, shows the analog signals while booting; you'd think it would not. This is baffling. Is there an adapter you can use to see the analog signals when connecting a DVI-D cable to a DVI-I output and a DVI-D input on a monitor? Or is DVI just out of the question? The industry seems to have gone haywire with DVI cables! There are so many different kinds it makes your head spin! I've heard that you can set the video adapter up to output the analog signals, but if the monitor accepts only DVI-D input, that seems like it wouldn't work either, am I right? Thank you.

P.S. If I really want to hook up a DVI-D cable to the computer instead of a VGA, do I need to buy a new video card? Seems typical!





#1
April 14, 2011 at 20:27:40
"What I mean by analog is the POST messages and multiboot screen."

Just because the POST/boot screens are low resolution images doesn't mean that they're analog. Anything that gets piped out of an HDMI jack is digital--whether it's a low resolution POST screen or a computer game running at 2560x1600.

Is your monitor plugged into the first/primary DVI port? On certain video cards, the secondary DVI port won't "engage" until after Windows loads the video driver.

Super PIII | Unlocked ES Tualatin @ 1.8GHz (150x12, 1.65v, 512K L2)
2GB PC2700 | 500GB | Radeon x1950Pro | Apollo Pro 266T | Win 7 Pro



#2
April 15, 2011 at 08:03:03
HDMI & DVI send the same digital signal & have the same video quality. Basically, HDMI is nothing more than DVI plus audio in a single interface.


#3
April 15, 2011 at 08:43:04
Digital Visual Interface (DVI)
http://en.wikipedia.org/wiki/Digita...

Excerpts.....
(DVI)
It is partially compatible with the High-Definition Multimedia Interface (HDMI) standard in digital mode (DVI-D), and VGA in analog mode (DVI-A).


DVI and HDMI compatibility

HDMI is a newer digital audio/video interface developed and promoted by the consumer electronics industry. Both DVI and HDMI share the same electrical specifications for the TMDS and VESA/DDC links. However, HDMI and DVI differ in several key ways.

First, HDMI lacks analog VGA compatibility, as these signals are absent in the HDMI connector.

Second, DVI is limited to the RGB color space, whereas HDMI supports both RGB and YCbCr.

Finally, HDMI supports the transport of digital audio, in addition to digital video. An HDMI source differentiates between a legacy DVI display and an HDMI-capable display by reading the display's EDID block.


Connector

The DVI connector on a device is therefore given one of three names, depending on which signals it implements:

* DVI-D (digital only, both single-link and dual-link)
* DVI-A (analog only)
* DVI-I (integrated - digital and analog)
..........
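
For anyone curious, the EDID check mentioned in the excerpt above is something you can poke at yourself. Below is a rough Python sketch, assuming a Linux machine where the kernel exposes each connector's EDID under /sys/class/drm/ (the paths and the parsing are only a sketch, not something from the article); it looks for the HDMI Vendor-Specific Data Block (IEEE OUI 00-0C-03) in the CEA-861 extension, which is how a source tells an HDMI-capable sink from a plain DVI one.

import glob

HDMI_OUI = bytes([0x03, 0x0C, 0x00])  # IEEE OUI 00-0C-03, stored little-endian in the EDID

def edid_reports_hdmi(edid: bytes) -> bool:
    """Return True if any CEA-861 extension block carries the HDMI vendor block."""
    # Extension blocks follow the 128-byte base EDID block.
    for offset in range(128, len(edid), 128):
        block = edid[offset:offset + 128]
        if len(block) < 128 or block[0] != 0x02:        # 0x02 = CEA-861 extension tag
            continue
        dtd_start = block[2]                            # where the detailed timings begin
        i = 4
        while i < dtd_start:                            # walk the data block collection
            tag, length = block[i] >> 5, block[i] & 0x1F
            if tag == 0x03 and block[i + 1:i + 4] == HDMI_OUI:   # vendor-specific block
                return True
            i += 1 + length
    return False

for path in glob.glob("/sys/class/drm/card*-*/edid"):
    with open(path, "rb") as f:
        edid = f.read()
    if edid:                                            # empty file means nothing connected
        kind = "HDMI" if edid_reports_hdmi(edid) else "DVI (or other digital-only)"
        print(path, "-> sink identifies as", kind)

It only answers "HDMI or not"; it tells you nothing about the analog side of the connector.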

Virtually all actual video CARDS, that you install in a mboard slot and can remove, that have a DVI port, and MOST mboards that have a DVI port for onboard video, have a DVI-I port - a standard DVI to VGA gender adapter, or a standard DVI-I to VGA cable, WILL work with the DVI-I port to provide a VGA (analog) video signal to a monitor.

SOME mboards that have a DVI port for onboard video have a DVI-D port - a standard DVI to VGA gender adapter, or a standard DVI-I to VGA cable, WILL NOT work with the DVI-D port to provide a VGA (analog) video signal to a monitor.

If you use a DVI-D cable connection, or if the monitor has a DVI-D port, you cannot get VGA (analog) video from the video adapter.
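
In other words, the analog (VGA) signal only gets through if every link in the chain (the port on the video adapter, the cable, and the port on the monitor) carries the analog pins. Just as an illustration, that rule in Python (the labels are made up for the sketch, not output from any tool):

# The analog (VGA) signal survives only if the source port, the cable,
# AND the sink port all have the analog pins. This says nothing about
# digital video: POST screens can also be sent digitally.
CARRIES_ANALOG = {"VGA", "DVI-A", "DVI-I"}   # connector/cable types with analog pins
# DVI-D and HDMI have no analog pins at all.

def analog_video_possible(source_port: str, cable: str, sink_port: str) -> bool:
    """True only if every link in the chain can carry the analog signal."""
    return all(link in CARRIES_ANALOG for link in (source_port, cable, sink_port))

# The setups discussed in this thread:
print(analog_video_possible("DVI-I", "VGA", "VGA"))      # True:  DVI to VGA adapter + VGA cable
print(analog_video_possible("DVI-I", "DVI-D", "DVI-D"))  # False: the DVI-D cable/monitor drop the four analog pins
print(analog_video_possible("DVI-I", "DVI-I", "DVI-D"))  # False: the monitor's port is still digital-only

Keep in mind this is only about the analog signal; the POST/boot screens can also be delivered digitally, which is presumably why the HDMI-to-HDMI setup still shows them.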

.........

"I have a computer with a video card that has HDMI output. I have a HDMI cable connecting the computer to the monitor. I can see analog AND digital signals just fine."
"I don't understand why the HDMI, which is supposed to be digital, shows the analog signals while booting; you think it would not."

I don't understand that either.

Are there HDMI ports on both ends of the connection?
Or are you using a gender adapter on the monitor end to adapt it to a DVI port?
Or what?

Does the first monitor have both VGA and HDMI ports, or both VGA and DVI-I ports, and you had both connected ?
Usually, if not always, the monitor will accept only one type of video input at a time. If you had both connected, you were probably getting VGA video.






#4
April 15, 2011 at 19:56:48
To all:

Thanks for responding.

To tubesandwires:

The computer with the HDMI cable has HDMI ports on BOTH ends, on the video card and on the monitor. I am not using any adapter. The POST messages are clearly seen.

The other computer has a DVI-I port on the video card and a DVI-D port on back of the monitor. Connecting a DVI-D cable does not show the POST messages; it only shows Windows once it starts loading. Connecting a VGA cable shows everything.

What if you need to get into the BIOS? Or in my case, select an OS from the multiboot screen? I'm sure there are cases when a user would want to do this. However, with the DVI-D cable connected I can't do either. I must resort to a VGA cable.

The part I don't understand is, how is the HDMI cable showing POST messages?

I'm resigned to the fact that I have to use a VGA cable on the one computer. No big deal, since DVI is not that much different than VGA.

But, out of curiosity, I was wondering how the HDMI cable could let the POST messages through. By the way, the other computer also has a DVI-I output in addition to HDMI. I bet that if you hook up a DVI-D cable to it, you'll get the same thing: no POST messages! Whoever came out with DVI-D did not see the whole picture. Apparently this cable is for those who never want to access their BIOS, see POST messages, or see a multiboot screen. No, I am not using a boot manager. My observation is that this sort of inconsistency seems to be the norm in the computer industry. We have to grin and bear it!



#5
April 16, 2011 at 08:27:40
I don't know why you get the video before Windows loads when you use the HDMI cable between HDMI ports on both ends.

It doesn't make sense to me that any computer monitor, or monitor port on a TV, would have a DVI-D port rather than a DVI-I one. Are you SURE it's a DVI-D port?
If it's a DVI-I port, a DVI-I cable to a DVI-I port on the video adapter WILL produce video before Windows loads.
.....

There was a guy who started a Topic here not long ago who had a new Samsung monitor with an HDMI port and a VGA port. It came with a VGA cable, an HDMI cable, and a DVI to HDMI cable. He was getting no video before Windows loaded when he used the DVI to HDMI cable connected to, I assume, a DVI-I port on the onboard video adapter or video card. The VGA cable produced the video before Windows loads.
He was multi-booting two operating systems, so in addition to needing to be able to access the bios while booting, he needed to be able to choose an operating system other than the default.
Apparently the same Samsung monitor model is available with several possible configurations of pairs of input ports, another one being a DVI port, I assume a DVI-I one, and a VGA port.
.....

I have one mboard that has a VGA port and an HDMI port for its onboard video, which came with an HDMI to DVI adapter, and one video card that has a DVI-I port and an HDMI port, but all my monitors have a VGA port only, and our TV has no HDMI or monitor ports.
...........

NOTE what I said previously...

"SOME mboards that have a DVI port for onboard video have a DVI-D port - a standard DVI to VGA gender adapter, or a standard DVI-I to VGA cable, WILL NOT work with the DVI-D port to provide a VGA (analog) video signal to a monitor."

The term CARD is often mis-used.

If the video adapter is built into the mboard, IT'S NOT A CARD!
....

"I'm resigned to the fact that I have to use a VGA cable on the one computer. No big deal, since DVI is not that much different than VGA."

I certainly agree with that. I think DVI's another one of those things that is better in theory, but in the real world I haven't seen that there's much difference.
Also, you usually have more resolution choices to choose from when the monitor is in VGA mode in Windows.



#6
April 16, 2011 at 08:51:15
Tubesandwires,

Guess what? That guy was me! Yes, I am sure the input on the monitor is DVI-D. It has 18 female pins plus a female tongue-like slot next to the pins. The video card, however, has 18 pins, the tongue-like slot, and four more pins, 2 above and 2 underneath the tongue-like slot. Those four extra pins are what carry the analog signal. If you're not engaging those pins, you won't get the POST messages. A DVI-D cable ignores those pins; therefore, there are no POST messages. VGA is analog, so you can see everything.

That's why I said the industry messed up when they decided to make DVI cables with just digital signals. Say goodbye to the analog signals, or whatever they're supposed to be. All the monitors I've seen have DVI-D inputs, in keeping with the digital craze. And I think it must be hard to find a DVI-I cable anymore! As far as I'm concerned that's ludicrous! The poor people who want to get into the BIOS, etc. are left in the dust! That's why I've come to the conclusion that the best option for me is to go with a monitor that has both VGA and HDMI inputs. DVI-D inputs are of no use to me at all!
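
Just to pin down what you're describing, here's the usual pin breakdown of the DVI connector variants as a small Python lookup. These are the standard connector layouts (the flat "tongue" is the analog-ground blade; only DVI-I and DVI-A have the four analog contacts around it):

# Standard DVI connector variants: main pin grid, plus the flat blade,
# plus (on DVI-I/DVI-A only) the four analog contacts around the blade.
DVI_CONNECTORS = {
    "DVI-D single link": {"main_pins": 18, "analog_pins": 0},
    "DVI-D dual link":   {"main_pins": 24, "analog_pins": 0},
    "DVI-I single link": {"main_pins": 18, "analog_pins": 4},
    "DVI-I dual link":   {"main_pins": 24, "analog_pins": 4},
    "DVI-A":             {"main_pins": 12, "analog_pins": 4},
}

for name, pins in DVI_CONNECTORS.items():
    analog = "can carry VGA/analog" if pins["analog_pins"] else "digital only"
    print(f"{name}: {pins['main_pins']} pins + blade + {pins['analog_pins']} analog contacts ({analog})")

So the monitor's 18-pins-plus-blade port is DVI-D single link, and the 18-plus-4-plus-blade port on the card is DVI-I single link, which matches what you're seeing.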



#7
April 16, 2011 at 09:10:32
"Guess what? That guy was me! "

Oops.
I should have clicked on your name to find that previous Topic you made:
http://www.computing.net/answers/ha...

"All the monitors I've seen have DVI-D inputs, in keeping with the digital craze"

I haven't looked into that, but if that's the case, then there must be something else causing your problem here, though I have no idea what it is.
Obviously most people have no problem getting video before Windows loads when they use a DVI connection to a monitor, otherwise we'd be hearing about that problem a lot.



#8
April 16, 2011 at 10:06:38
I just tried searching the web with Yahoo using:
DVI connection no video before Windows loads

Apparently the problem DOES happen sometimes.

I may have already referred you to this in your other Topic about your Samsung monitor:

737-370: Troubleshooting Common DVI Flat Panel Display Issues
http://support.amd.com/us/kbarticle...

Other possibilities.

http://forums.nvidia.com/index.php?...
.....

If the monitor is connected to the Secondary output port of the video adapter, NOT the Primary one, you may get no video until Windows loads.
In that case, if the video adapter has two outputs and you connect two monitors, the one connected to the Primary port of the video adapter WILL produce video before Windows loads.
I have seen that problem with older video cards, from before onboard video had more than one monitor port; I haven't seen it with newer ones, but it's possible with newer ones too.

That's not the same thing as whether Windows sees the monitor as the Primary display. If there is only one monitor connected, it's always seen as the Primary display in Windows.
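
If you want to double-check which adapter output Windows itself treats as active and primary, here's a rough Python sketch using ctypes and the Win32 EnumDisplayDevices call (Windows only, purely illustrative). Keep in mind it only shows what Windows sees once it's running; it can't tell you which port the video BIOS uses for the POST/boot screens.

# List the display adapters/outputs Windows knows about and flag which one
# is active and which is primary, via the Win32 EnumDisplayDevicesW call.
# This reflects the state after Windows loads its video driver; it says
# nothing about what happens before Windows loads.
import ctypes
from ctypes import wintypes

DISPLAY_DEVICE_ACTIVE         = 0x00000001
DISPLAY_DEVICE_PRIMARY_DEVICE = 0x00000004

class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb",           wintypes.DWORD),
        ("DeviceName",   wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags",   wintypes.DWORD),
        ("DeviceID",     wintypes.WCHAR * 128),
        ("DeviceKey",    wintypes.WCHAR * 128),
    ]

user32 = ctypes.windll.user32
i = 0
while True:
    dev = DISPLAY_DEVICEW()
    dev.cb = ctypes.sizeof(dev)
    if not user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
        break
    flags = []
    if dev.StateFlags & DISPLAY_DEVICE_ACTIVE:
        flags.append("active")
    if dev.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE:
        flags.append("PRIMARY")
    print(dev.DeviceName, dev.DeviceString, "[" + (", ".join(flags) or "inactive") + "]")
    i += 1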



#9
April 16, 2011 at 22:24:13
"If there is only one monitor connected, it's always seen as the Primary display in Windows."

That's my case. But, as I said, with a DVI-D cable connected to a DVI-I output, the extra four pins are not engaged. Therefore I do not see POST messages. My only option in this case is to use a VGA cable. As far as I know there is no way around this, unless there's some kind of adapter you can use. The adapter would have to go on the back of the monitor as a DVI-D to DVI-I connection. But even if you could do that, the analog signals would still be lost, since the monitor's DVI-D input can't accept them anyway.

Someone (from a cable store online) mentioned that I could look into the card software and see what signals each port is putting out. But why bother? The extra four pins are not being used, period. That leaves VGA as the only option. Unless there's something else I'm not aware of. But I don't think so. I have an old video card that uses a DVI-I connection. The industry has since moved on to DVI-D and HDMI. These are the problems you sometimes face in ByteLand.



#10
April 17, 2011 at 07:44:34
There is no disputing that you can't get VGA video from a DVI-D connection, but something else must be involved in whether you can see video before Windows loads.

Obviously most people have no problem getting video before Windows loads when they use a DVI connection to a monitor, otherwise we'd be hearing about that problem a lot.

Also, you're getting the video before Windows loads when you connect the two HDMI ports, despite the fact that there is no VGA signal available on an HDMI port.



#11
April 19, 2011 at 08:57:51
You're right! I'm stumped on this one too! Maybe it is a function of the card! As someone said a little further up, just because the signals are low resolution does not mean they are analog. If that's true, then maybe there is a setting in the card firmware which allows these signals to come through. Or maybe an updated driver would help. In the meantime, I'm OK with the VGA cable.


#12
April 19, 2011 at 11:24:14
"As someone said a little further up, just because the signals are low level does not mean they are analog.:

That's probably the case.

"....maybe there is a setting in the card firmware which allows these signals to come through. "

Extremely unlikely, unless the release notes for a newer firmware version mention fixing this specific problem.
The same goes for the firmware for the TV or monitor, if any is available.

"Or maybe an updated driver would help."

The specific video drivers loaded in Windows have absolutely nothing to do with the video before Windows loads. All video chipsets and mboards support basic video before the operating system loads.



