Solved: Screen off-center, in-game, DirectX 10 only

July 15, 2012 at 00:07:54
Specs: Windows 7, Intel E8400/4GB
It's only started happening recently (within the past week or so). I started up Saints Row the Third and the image was shifted about two inches to the right. I fixed it by dropping the game's resolution from 1680x1050 (my desktop's default resolution) to 1440x900. I've also noticed this with Two Worlds II; dropping the resolution from the default to 1440x900 re-centered the screen. However, these issues only happen when I run the games in DirectX 10 mode. In DirectX 9 mode everything displays properly.

I attempted to take a screenshot, but that obviously did not pan out. The whole image is being displayed; it's just pushed to the right, off the edge of the screen.


#1
July 15, 2012 at 00:35:28
✔ Best Answer
If you're using an analog/VGA cable, the monitor has to be calibrated for that particular resolution and refresh rate. Most monitors have an auto-calibrate button which will do this for you. If the automatic calibration fails, you should be able to tweak the picture manually using your monitor's OSD.

Better yet, use a DVI, HDMI, or DisplayPort cable. With a digital connection, you'll never have to adjust your monitor again, since it won't have to "guess" where the pixels are supposed to go.

HTPC | Pentium M @ 2.82GHz (2MB L2) | 4GB | 1.0TB | Radeon HD5750 | Blu-Ray
Win 7 Pro | Modified PowerMac G4 QuickSilver case


#2
July 15, 2012 at 04:35:45
Interesting. I couldn't figure out whether my screen has an auto-calibrate option, but it does have a reset setting that corrected the display while I was in-game. However, once I exited the game, my desktop was shifted a couple of inches to the left and stayed that way. I'm hoping I won't have to reset my settings every time I run my games at a given resolution.

I can't remember if my video card has an HDMI port, but I don't think my monitor does (the monitor's about 5 years old; the card is a GTX 260). I wonder whether I have a DVI cable or adapter...

EDIT: I do have a DVI adapter that I am currently using, but seemingly to little or no effect. Also, my monitor only has VGA and DVI connectors, so the extra HDMI cable I've had lying around still isn't going to see any use.

Another edit: I figured out how to get my monitor to auto-adjust, but the same thing happens as with a reset: it corrects the display in-game but throws off my desktop once I exit the game. Maybe I'll just have to settle for playing at 1440x900, or forgo the DirectX 10 version altogether.


#3
July 15, 2012 at 09:19:21
Alright, I just figured out what was wrong. I went into my graphics card settings, and apparently the refresh rate was set to 60Hz instead of 59Hz. Switching it back to 59Hz fixed my screen issues.
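
In case it helps anyone else, the same refresh-rate change can also be scripted instead of clicked through in the driver control panel. Below is a rough Win32 C sketch, not something I've actually run, and it assumes the primary monitor and that the driver accepts 59Hz at the current resolution; it reads the current mode and then requests 59Hz:

#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* Read the current mode (assumes the primary display). */
    if (!EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &dm)) {
        fprintf(stderr, "EnumDisplaySettings failed\n");
        return 1;
    }
    printf("Current mode: %lux%lu @ %lu Hz\n",
           dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency);

    /* Keep the resolution, but ask for 59 Hz instead of 60 Hz. */
    dm.dmDisplayFrequency = 59;
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

    /* CDS_TEST checks whether the mode is supported without applying it. */
    LONG result = ChangeDisplaySettings(&dm, CDS_TEST);
    if (result == DISP_CHANGE_SUCCESSFUL)
        result = ChangeDisplaySettings(&dm, 0);   /* apply for this session only */

    if (result == DISP_CHANGE_SUCCESSFUL)
        printf("Switched to 59 Hz\n");
    else
        printf("Mode change failed (code %ld)\n", result);

    return 0;
}

From what I've read, the 59Hz entry in Windows is just its label for the 59.94Hz TV-style timing, which seems to be what my monitor syncs to cleanly.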

I appreciate the help, jackbomb!


#4
July 15, 2012 at 20:26:05
No problem!

Does your adapter convert the graphics card's DVI port into a 15-pin VGA? If it does, then you're still getting an analog signal. If you connect the monitor directly to the DVI port, you'll get a digital signal.

HTPC | Pentium M @ 2.82GHz (2MB L2) | 4GB | 1.0TB | Radeon HD5750 | Blu-Ray
Win 7 Pro | Modified PowerMac G4 QuickSilver case

