Does an AMD 5750 run well with a 300W power supply?

March 8, 2011 at 03:56:11
Specs: Windows 7 64-bit, 2.6 GHz Intel dual core / 2 GB RAM
I would like to upgrade the graphics in my HP/Compaq 500B. It has a 300W power supply and a dual-core Intel CPU running at 2.6 GHz, and at the moment it draws at most 100W from the mains.
I do not understand why the card would not work with this system, since it consumes 86W at peak.
Any thoughts on why the product specifications are so high? Or do the manufacturers just make sure there is plenty of headroom for additional (power-consuming) hard drives, DVD drives, etc.?
No one seems to address this in a sensible way.



March 8, 2011 at 04:09:39

The recommendation for that card is 30 amps on the 12-volt rail and a 450-watt PSU. It's not the wattage of the PSU that's the key, it's the amperage on the 12-volt rail. PSUs with multiple 12-volt rails distribute the amps among the different components and may not deliver sufficient amperage to the video card. If you want to upgrade the PSU, look at the Corsair line or any good PSU with a single 12-volt rail.
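For a rough sanity check, the usable wattage on the 12-volt rail is just volts times amps. A minimal Python sketch (the 18 A figure for a stock 300 W OEM unit is an assumption; read the label on your own supply):

# Usable wattage on the 12 V rail is volts x amps.
def rail_watts(amps, volts=12.0):
    return amps * volts

print(rail_watts(30))  # 30 A recommended for the 5750 -> 360 W available on the 12 V rail
print(rail_watts(18))  # assumed 18 A on a typical stock 300 W OEM unit -> only 216 W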


March 8, 2011 at 06:13:19
You are correct that video card manufacturers tend to "cover their butt" by over-recommending the PSU requirement. But PSU quality varies so widely, it makes sense for them to err on the side of caution. You may very well be able to get away with a 300W PSU, provided it's a damn good one, but you should want a bit of a safety factor built in for future upgrades & for peace of mind.

Think of it this way - if you weigh 200lbs and need to get from point A to point B by swinging across a deep ravine, would you prefer the rope be rated for 200lbs or 450lbs?

If you want to learn more about PSUs or check reviews, try


March 9, 2011 at 13:32:51
I just cannot get it. OK, the amperage on the 12V rail could be a factor, but to me (being an electrical engineer) that would mean the total power consumption of a system running at full load (i.e. gameplay) should be near 400W from the AC side!
This is definitely NOT the case! My Compaq 500B runs at less than 60W with the built-in Intel graphics while running the 3DMark benchmark. Measured personally.
Either something stinks about PSU specs, or everyone is missing my point about real power consumption.
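To put rough numbers on that argument, here is a minimal Python sketch; the 80% efficiency figure is an assumption, not a measurement:

# Estimate the DC load from a wall-meter reading, assuming roughly 80% PSU efficiency.
def dc_load_from_wall(wall_watts, efficiency=0.80):
    return wall_watts * efficiency

base_load = dc_load_from_wall(60)  # ~48 W of DC load when the meter reads 60 W
with_card = base_load + 86         # add the card's quoted 86 W peak -> ~134 W of DC load
print(base_load, with_card)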



March 9, 2011 at 13:48:23
450 - 300 = 150 watts is too much of a difference.

That's output watts by the way. Computer power supplies are never 100% efficient - the best of them are no more than about 85% efficient.
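To illustrate the difference between output watts and wall watts, a quick Python sketch using the 85% figure above:

# Wall draw needed to deliver a given DC output, at an assumed efficiency.
def wall_watts(dc_watts, efficiency=0.85):
    return dc_watts / efficiency

print(wall_watts(300))  # a fully loaded 300 W supply pulls roughly 353 W from the mains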

The chances are that if you don't upgrade the power supply, the 300-watt unit will be overloaded whenever the 5750 is under load. If the computer starts up at all with the card installed, the power supply will eventually malfunction and, after a longer time, fail completely.
When a power supply malfunctions, and especially when it fails completely, it can fry the motherboard or anything connected to it.

If you want to take that chance instead of upgrading the power supply, which isn't expensive, go ahead.

In most if not all cases, the max capacity rating of the PSU is an intermittent rating (unless the label says continuous). It's recommended that you do not load your PSU to more than 80% of that rating if you are going to put a constant load on it, such as playing a recent game for hours on end. In that case, multiply the minimum capacity stated for a system with the card's particular video chipset by 1.25 to find the minimum capacity of the PSU you should get.
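A worked example of that rule of thumb in Python, using the 450 W recommendation quoted earlier in the thread:

# 80% continuous-load rule and the 1.25 multiplier from the post above.
def min_psu_capacity(recommended_watts, factor=1.25):
    return recommended_watts * factor

def safe_continuous_load(label_watts, fraction=0.80):
    return label_watts * fraction

print(min_psu_capacity(450))      # 562.5 -> shop for roughly a 550-600 W unit
print(safe_continuous_load(300))  # 240 W is the most you'd want to load the stock 300 W PSU continuously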

If you do upgrade the power supply...
Don't buy an el-cheapo (in quality) PS.
See response 3 in this:

