Using DVI Output

June 5, 2008 at 10:12:45
Specs: XP Pro, 3.00/2.00
This is quite strange. I bought a new LCD monitor and I'm finally using the DVI output on my Radeon 9800, but for some reason, every time I reboot or turn on my computer, the video card only gives output on the VGA port. I then have to unplug the DVI cable and plug it back in; Windows makes the sound that it detected new hardware, and I finally get a picture again. Is there any way I can resolve this problem? Thanks so much for the help, guys.

Incarn





#1
June 5, 2008 at 14:19:14
See if your video card has a management tool [software] that controls this.

Imagine the power of knowing how to internet search
http://www.lib.berkeley.edu/Teachin...



#2
June 5, 2008 at 16:40:44
Never unplug the monitor while Windows is running! You can easily damage the monitor, the video chipset, or the mboard that way!

You could try tweaking video settings, but it may not help.

A lot of video cards with two monitor ports won't work properly unless a single monitor is connected to the primary port, not the secondary port.
If you still have the VGA-connected monitor and you connect both monitors, both will probably work fine (though you may have to tweak some video settings). If the card has two DVI ports and you were using an adapter in one for a VGA connection, remove the adapter if it's still installed and use the other DVI port.
If one port is a VGA port, standard adapters can convert a DVI port to VGA use, but the opposite won't work - a DVI port requires more than the 15 connections a VGA port provides.



#3
June 6, 2008 at 05:58:05
I suspect you may have both of your display adapters enabled, with the primary one set to the VGA port. Go into the advanced settings and, if both displays are enabled, disable the VGA one.



#4
June 6, 2008 at 13:24:28
I'm trying to disable my VGA port, but I can't find that option under advanced controls. I know that this solution would definitely fix my problem though.

I looked for an option to set the DVI output as the primary port, but couldn't find that either. Is there something that I'm missing? This problem is almost like a curse or something :(

Incarn



#5
June 6, 2008 at 15:56:50
You don't disable the port; you disable the monitor (display). If there aren't two listed, then that may not be your problem. The computer is probably trying to boot using the primary display. If you were actually using two monitors, you could choose which one is the primary. If only one monitor is enabled, it should be used by default.
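If you want to double-check what Windows itself reports, a short script like the one below will list every display device the card exposes and whether Windows flags it as the primary and as attached to the desktop. It's only a rough sketch - it assumes you have Python installed, which XP doesn't come with - and it just calls the standard EnumDisplayDevices Windows API, the same information the display settings are built on:

import ctypes
from ctypes import wintypes

# Flags Windows sets on each display device (values from the Windows SDK).
DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x1
DISPLAY_DEVICE_PRIMARY_DEVICE = 0x4

class DISPLAY_DEVICEW(ctypes.Structure):
    # Matches the DISPLAY_DEVICEW structure that EnumDisplayDevicesW fills in.
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

user32 = ctypes.windll.user32
index = 0
while True:
    dev = DISPLAY_DEVICEW()
    dev.cb = ctypes.sizeof(dev)
    # Passing None enumerates the video adapters themselves (one entry per output).
    if not user32.EnumDisplayDevicesW(None, index, ctypes.byref(dev), 0):
        break  # no more display devices
    primary = bool(dev.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE)
    attached = bool(dev.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
    print(dev.DeviceName, "-", dev.DeviceString)
    print("    primary:", primary, " attached to desktop:", attached)
    index += 1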


#6
June 7, 2008 at 11:33:38
There are two listed, but my DVI monitor is still the primary. It always has been, but I noticed that my secondary display, which is my VGA port, is set as default. I don't know how to change that though.

So, to sum it up, I have my DVI port set to primary, but my secondary port says that it's default. Does this make any sense? I'm going crazy!

Incarn



#7
June 7, 2008 at 13:06:45
Your primary display in Windows is the DVI monitor (since it's the only one connected, it has to be), but that doesn't necessarily mean it's connected to the card's primary port.

On the ATI chipset cards I have, or have had, including a Sapphire Radeon 9800XT, if the card has two monitor ports, VGA and DVI, the VGA one is the primary port.
That particular card model had two DVI ports (a similar model, same chipset, was available with one of each kind). I used a DVI to VGA adapter in the primary port since I have no DVI monitors. As I recall, I installed the adapter on the secondary port by mistake once and the monitor wouldn't work properly.

Several people have posted here recently with a problem similar to yours, but it seems they had two DVI ports - plugging the DVI monitor into the other port cured the problem.

As I said above....
If you still have the VGA-connected monitor and you connect both monitors, both will probably work fine...

There is only a tiny advantage of DVI over VGA (it's actually enhanced VGA) in any case.

If you have both DVI and VGA ports on the monitor, connect the VGA ports together.

If you have only a DVI port on the monitor...

DVI (male) to VGA (female) adapters are common for use on a card.
If you can find a DVI (female, holes) to VGA (male, pins) adapter to install on the end of your monitor's cord (which has a DVI male plug) - if they even exist - you could connect to the card's VGA port.

Otherwise, you may need to get a video card that has a DVI primary port.



#8
June 10, 2008 at 08:46:59
I know there is a solution to this other than buying a converter. Before I reformatted my computer, this DVI monitor worked fine. I never had to unplug the cable and plug it back in when I turned my computer on. For some reason it just doesn't work now. This is weird.

Incarn



#9
June 10, 2008 at 10:34:56
"Before I reformatted my computer this DVI monitor worked fine."

You should have mentioned that in your first post. That's a whole different situation, assuming you didn't still have the VGA-connected monitor plugged in when the DVI monitor worked before. (If you did still have the VGA-connected monitor plugged in at that time, of course the DVI monitor would have worked fine.)

Did you load the main chipset drivers?

Whenever you load Windows from a regular Windows CD (or DVD) from scratch, after Setup is finished you must load the drivers for the mboard, particularly the main chipset drivers, so that Windows has the proper drivers for and information about your mboard hardware, including its AGP or PCI-E, ACPI, and hard drive controller support. If you have a generic system and have the CD that came with the mboard, all the necessary drivers are on it. If you load drivers from the web, brand name system builders and mboard makers often DO NOT list the main chipset drivers in the downloads for your model - in that case you must go to the web site of the maker of the main chipset, get the drivers, and load them.

One of the things included in the main chipset drivers is the drivers and/or information Windows needs to properly recognize the mboard chipset's enhanced video (specifically, GART) support.
Your video card will work fine in plain VGA mode, but once you load specific drivers for it - in this case AGP drivers - the card will not work properly if Windows doesn't have those proper drivers and/or that proper info.
...

The other thing you must do is load the proper drivers and associated applications for the video card, the right way.

XP has built-in support and drivers for 9800 series cards, but the drivers on the CD are newer, and the ones you get from ATI are the newest; both install more stuff and allow you to set more things.
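If you're not sure which display driver you actually ended up with after the reformat, something like the short script below will print the name, version, and date of the driver Windows is currently using. It's only a sketch - it assumes Python is installed - and it just asks WMI (through the wmic tool that ships with XP Pro) for the Win32_VideoController details; you can see the same info in Device Manager:

import subprocess

# Ask WMI (via the wmic tool included with XP Pro) for the video controller's
# driver details; this is the driver Windows is actually loading for the card.
output = subprocess.check_output(
    ["wmic", "path", "win32_VideoController",
     "get", "Name,DriverVersion,DriverDate", "/format:list"]
)
print(output.decode(errors="replace"))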

If you install drivers from the CD or the ATI web site, you must first un-install anything that Windows or previous driver installs have installed for the video - check Add/Remove Programs, and/or un-install the listed Display Adapters in Device Manager - before you install the new drivers.

I know from previous experience that the drivers and applications for ATI chipset cards must be installed in a certain order, and they are not always listed on the ATI site in the order you must install them. If you install them using the Setup or install program on the CD that came with the card, everything will install properly, but if you get drivers from the ATI site they must be installed in the right order - there is a link to Uninstall / Install directions where you get the ATI drivers.

Installing updated display drivers found by Microsoft Update may result in the card not working properly. If you want to install updated drivers, go to the ATI web site instead to get them, and follow the directions - installing just the updated display drivers by themselves may not work properly. You may need to (I always do) un-install all the old stuff first, then install the new stuff.
You also need a .Net Framework version for the Catalyst Control Center if you get the drivers and apps from the ATI site - see the readme or release notes for the Catalyst Control Center version if that isn't obvious.
The CD may install a Control Panel or the Catalyst Control Center - if the latter, it will install the .Net Framework version needed if it hasn't already been installed on your computer.
(.Net Framework versions are standalone - installing a higher version doesn't eliminate the need to install a lower one.)
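If you want to check which .Net Framework versions are already on the machine, each installed version shows up as a subkey under the NET Framework Setup\NDP key in the registry. The few lines below are just a sketch (again assuming Python is installed); you can also just browse to that key in regedit:

import winreg

# Each installed .Net Framework version registers a subkey under this key
# (for example v1.1.4322 or v2.0.50727).
NDP_KEY = r"SOFTWARE\Microsoft\NET Framework Setup\NDP"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, NDP_KEY) as ndp:
    index = 0
    while True:
        try:
            print(winreg.EnumKey(ndp, index))
        except OSError:  # no more subkeys
            break
        index += 1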

.....

"I'm trying to disable my VGA port, but I can't find that option under advanced controls"
"I looked for an option to set the DVI output as the primary port, but couldn't find that either."
" noticed that my secondary display, which is my VGA port, is set as default.
"So, to sum it up, I have my DVI port set to primary, but my secondary port says that it's default. Does this make any sense?"

I've never seen a video setting that can disable a card's port.
Windows normally treats the card's primary port as the default port that a single display is supposed to be connected to.
Cards with two or more monitor ports always have a display adapter listed in Device Manager for EACH port. You could try RIGHT-clicking on the primary adapter there and disabling it, but you may get no display at all in Windows itself after you reboot with the DVI monitor connected to the secondary monitor port.
If that happens to you, press F8 repeatedly while booting, select Enable VGA mode, go into Device Manager, and Enable the primary display adapter.

Rarely, another thing that can happen right after you install drivers for a card is that you get no display in Windows, or no display at all, because the video drivers are not detecting your monitor properly. In that case, press F8 repeatedly while booting, select Enable VGA mode, go into Display - Settings - Advanced - Monitor, and set your monitor driver to Plug and Play Monitor; or, if you have the CD for the monitor, use Have Disk and point to the specific drivers for the monitor on the CD (it's looking for *.inf files). Then save the settings and reboot normally.

You should always install the specific drivers for an LCD monitor. The Plug and Play Monitor selection was not designed to support LCD displays other than those on older laptops, and it has not changed since XP was first released. The default settings won't hurt the monitor, but while you are using Plug and Play Monitor you can choose settings that will damage an LCD monitor.
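To see which monitor driver Windows is actually using at the moment - the generic Plug and Play Monitor or the specific one for your LCD - a quick check like the one below works. It's only a sketch (it assumes Python is installed) and simply reads the Win32_DesktopMonitor entries from WMI, the same thing Device Manager shows under Monitors:

import subprocess

# Win32_DesktopMonitor lists the monitor entries Windows has set up, including
# whether the generic "Plug and Play Monitor" driver is in use.
output = subprocess.check_output(
    ["wmic", "path", "Win32_DesktopMonitor",
     "get", "Name,MonitorType,PNPDeviceID", "/format:list"]
)
print(output.decode(errors="replace"))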

As far as I've seen, you can't change which port the card itself sees as primary - it's "hard wired".
As I said previously, if you have only one monitor connected, it is always the primary display in Windows, but that doesn't necessarily mean it's connected to the card's primary port.
.....

I no longer have that Sapphire 9800XT card. I do have an ATI 9800 AIW on my other computer but it has only one (DVI) port.



#10
June 10, 2008 at 13:47:13
Incarnadine

When you look at Display Properties > Advanced, do you have two monitors showing? If so, you can set either one to be the primary using the ATI software.



#11
June 11, 2008 at 22:08:25
Yes, I see that, and I set my DVI monitor as the primary monitor, but for some reason the problem still persists. I'm really starting to hate this damn problem.

Incarn



#12
June 12, 2008 at 04:41:03
Disable the second monitor.


