Trying to Run Dual Monitors

Custom / CUSTOM
July 25, 2009 at 23:01:47
Specs: MS Win. XP Home Edition SP3, 2.2 GHz / 991 MB
I have a CTX 17" VGA monitor and a Dell Trinitron 17" monitor, both of which work well when attached to the VGA port of the integrated graphics on my ASUS M3N78-VM motherboard with 1 GB of RAM. The trouble starts when I hook one monitor (it doesn't seem to matter which) to the DVI out port (with a DVI-to-VGA adapter) and the other to the VGA port: I can't get a picture on the DVI-attached monitor. The computer recognizes both monitors and calls the one on the DVI port by name (DELL or CTX), while the other shows up as the "Plug and Play" or "Default" monitor.

The CTX monitor uses a host of NVIDIA drivers (the ones that came with the NVIDIA GeForce 8200 chipset), while the Dell uses and requires no drivers, according to the properties under Control Panel -> System -> Hardware -> Device Manager -> Display adapters -> "NVIDIA GeForce 8200" and the entry for the "DELL 1250" monitor. I can't find out much about the Dell monitor, such as drivers or manuals, because it dates from 1999. It is almost as good as my other monitor, a CTX-PL9 from 2002; both support resolutions of up to (I've forgotten, but it was something like) 1024 by 1100 or 1000-and-something.
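For reference, the Win32 EnumDisplayDevices call shows the same detection information Device Manager does, head by head. This is only a minimal untested sketch, assuming plain C on XP built against user32.lib; the \\.\DISPLAYn names are whatever the driver reports, not something you choose:

/* Sketch: list the display adapter heads and attached monitors that
 * Windows reports. Untested; build e.g. with: cl enumdisp.c user32.lib */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DISPLAY_DEVICE adapter = { sizeof(DISPLAY_DEVICE) };
    DWORD i;

    for (i = 0; EnumDisplayDevices(NULL, i, &adapter, 0); i++) {
        DISPLAY_DEVICE monitor = { sizeof(DISPLAY_DEVICE) };

        printf("%s (%s)%s%s\n",
               adapter.DeviceName,    /* e.g. \\.\DISPLAY1 */
               adapter.DeviceString,  /* e.g. NVIDIA GeForce 8200 */
               (adapter.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
                   ? " [attached to desktop]" : "",
               (adapter.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE)
                   ? " [primary]" : "");

        /* Second-level call: the monitor plugged into this head */
        if (EnumDisplayDevices(adapter.DeviceName, 0, &monitor, 0))
            printf("    monitor: %s\n", monitor.DeviceString);
    }
    return 0;
}

A head that is detected but not "[attached to desktop]" is exactly the "computer sees it, monitor stays dark" situation described above.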

I have tried installing updated drivers for both monitors, and in both cases I was told that the drivers I have are the best available for the job. I have switched the monitors a couple of times to try them on the different out ports. I've also ordered a VGA signal splitter to try plugging both monitors into the same (VGA) port, but the booklet that came with the motherboard says:

"DVI Port- This port is for any DVI-D compatable device. DVI-D can't be converted to out put RGB Signal to CRT and isn't compatable with DVI-I."

"This motherboard comes with dual-VGA output. If you connect 2 monitors to both VGA and DVI-D /
HDMI out ports, each controller can drive same or different display contents to different resolutions and refresh rates.

Due to chipset limitation, simultateous output for DVI and HDMI is not supported.

I have not yet tried connecting the monitors to the VGA and HDMI ports, and I'm afraid that a second graphics card would require me to disable the integrated one.
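Since the manual says each controller can drive its own content, it may also be worth trying to attach the second head programmatically rather than through the Display Properties dialog. A minimal sketch, assuming Win32 C on XP, that the second head enumerates as \\.\DISPLAY2, and that the primary desktop is 1920 pixels wide; all of those are assumptions to adjust:

/* Sketch: attach the assumed second head (\\.\DISPLAY2) and extend
 * the desktop onto it at 1024x768, placed right of a 1920-wide
 * primary. This is the same operation Display Properties performs. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;
    LONG rc;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize       = sizeof(dm);
    dm.dmPelsWidth  = 1024;   /* a mode both CRTs should accept */
    dm.dmPelsHeight = 768;
    dm.dmPosition.x = 1920;   /* to the right of the primary desktop */
    dm.dmPosition.y = 0;
    dm.dmFields     = DM_PELSWIDTH | DM_PELSHEIGHT | DM_POSITION;

    /* Stage the change in the registry, then apply it all at once */
    rc = ChangeDisplaySettingsEx("\\\\.\\DISPLAY2", &dm, NULL,
                                 CDS_UPDATEREGISTRY | CDS_NORESET, NULL);
    if (rc == DISP_CHANGE_SUCCESSFUL)
        rc = ChangeDisplaySettingsEx(NULL, NULL, NULL, 0, NULL);

    printf("result: %ld (0 = DISP_CHANGE_SUCCESSFUL)\n", rc);
    return 0;
}

If this returns success and the DVI-attached monitor still stays dark, that points at the port or adapter rather than the Windows configuration.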

Thanks in advance if anyone has any ideas,
Slocode




#1
July 26, 2009 at 11:11:14
This is out of my area of experience, but installing drivers for monitors is not normally required, so I think you can eliminate drivers as a source of the problem.

I wonder if you have a defective or incompatible DVI > VGA adapter.



#2
July 26, 2009 at 15:37:58
Thanks for the thoughts. I haven't spent a lot of time looking for drivers because I agree with you about that.

Has anyone here run a VGA monitor through an HDMI port? I've read that it's a viable alternative if I can find the right adapter.

I've been trying to get these monitors up and running for three or four weeks now, and I should be getting some idea of what the problem is. It's just frustrating to know that the computer can see the secondary monitor but not the other way around. I want to use these two monitors to write HTML with a lot of tables, frames, and images, and it would be so nice not to have to keep [Alt+Tab]ing every 35 seconds to cut or paste another element or tag.

BTW, my resolution is 1920 x 1200. Thanks,
slocode



#3
July 26, 2009 at 16:10:57
"both monitors support resolutions of up to-(I've forgotten but it was something like) 1024 by 1100 or 1000 and something"
and
"My resolution is 1920 X 1200"

I run dual monitors on a VGA/DVI video card, but I run at 1024x768, which is a much lower resolution than you are running.

Try connecting VGA and DVI again, but drop to 1024x768 and see if both monitors go live.
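Before picking a mode, it can also help to list every mode each head actually reports and choose one that appears on both. A minimal untested sketch using the Win32 EnumDisplaySettings call; the \\.\DISPLAY2 name is an assumption (pass whatever EnumDisplayDevices reports for the DVI head):

/* Sketch: dump every display mode a given head reports, so a
 * resolution/refresh combination common to both CRTs can be chosen.
 * Pass NULL for the primary head. */
#include <windows.h>
#include <stdio.h>

static void list_modes(const char *device)
{
    DEVMODE dm;
    DWORD i;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    printf("modes for %s:\n", device ? device : "(primary)");
    for (i = 0; EnumDisplaySettings(device, i, &dm); i++)
        printf("  %lux%lu, %lu-bit, %lu Hz\n",
               dm.dmPelsWidth, dm.dmPelsHeight,
               dm.dmBitsPerPel, dm.dmDisplayFrequency);
}

int main(void)
{
    list_modes(NULL);               /* primary head */
    list_modes("\\\\.\\DISPLAY2");  /* assumed name of the DVI head */
    return 0;
}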

BTW, a VGA splitter will just put the same view on each monitor. Not the path you want to follow.



#4
July 27, 2009 at 20:13:25
No luck. I decreased the resolution on both monitors step by step, down to 960 by 600, and dropped the color depth from 32 to 16 and even down to 8 bits where I could. I also tried some monkeying around with my hardware acceleration. I even took a big gamble and swapped the primary and secondary monitors and back again, (luckily) with no disasters ... but still no picture either.

I downloaded a program called UltraMon (it seemed like a very good program), but it wouldn't light up the monitor that is connected to the DVI port of my video card.

UltraMon made it possible for my graphics programs (Corel PSP, Photoshop & PhotoImpact) to recognize both monitors. They started asking me which monitor I wanted to open an image in. It seems as if every device on my computer knows about this second monitor except the monitor itself.
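That pattern (applications see a second monitor, but it never lights) matches a head that Windows has attached to the desktop while the port itself sends no signal. A quick untested sketch that counts the monitors actually active on the desktop, using the Win32 EnumDisplayMonitors call:

/* Sketch: count and describe the monitors that are part of the
 * active desktop, as opposed to merely detected hardware. */
#include <windows.h>
#include <stdio.h>

static BOOL CALLBACK on_monitor(HMONITOR hmon, HDC hdc,
                                LPRECT rect, LPARAM count)
{
    MONITORINFOEX mi;
    mi.cbSize = sizeof(mi);

    if (GetMonitorInfo(hmon, (MONITORINFO *)&mi))
        printf("%s: %ld,%ld - %ld,%ld%s\n",
               mi.szDevice,
               mi.rcMonitor.left, mi.rcMonitor.top,
               mi.rcMonitor.right, mi.rcMonitor.bottom,
               (mi.dwFlags & MONITORINFOF_PRIMARY) ? " [primary]" : "");

    ++*(int *)count;
    return TRUE;  /* keep enumerating */
}

int main(void)
{
    int count = 0;
    EnumDisplayMonitors(NULL, NULL, on_monitor, (LPARAM)&count);
    printf("%d monitor(s) active on the desktop\n", count);
    return 0;
}

If this reports two monitors while the second screen stays dark, the desktop side is fine and the fault is downstream: the DVI port, the DVI-to-VGA adapter, or the cable.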

I really have been racking my brain, but I can't think of any other reason for this problem than a bad port. The only other two things I can think of to try are to find an HDMI-to-VGA adapter to see if that port works, or to borrow a DVI-equipped monitor from someone and check whether the DVI port works for what it was meant for, before I take the motherboard back for warranty replacement service.

Does anyone have any other ideas that I should try?

I thank you for your indulgence and ideas. This is my first experience with tech forums. Slocode

