So I have had this dilemma for over 2 years now; it seems not a single soul in this world has a clue of the answer.
I want to run a car audio amp in my house. Seems easy, right? Just get a 12 volt power supply with "any old amp rating" and you're good to go! Problem is that "any old amp rating" is going to starve your amp. Did someone bring marshmallows?
Logic says the fuse rating on the amplifier is the maximum current draw of the amplifier. My amp has a 125A fuse, so by my thinking I need a 12 volt power supply rated at 125A. This makes sense because the amp is rated at 1500W RMS @ 2 ohms stable, mono. (12 V × 125 A = 1500 W)
Let me back up a bit... The amp is a Kicker ZX1500.1: 1500 meaning 1500W (@ 2 ohms stable), the .1 meaning mono.
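To make my fuse-rating logic concrete, here's a rough sketch of the sizing math. The 80% efficiency figure is my assumption (Class D car amps are often quoted around there); check your amp's spec sheet, since the real number changes the answer.

```python
def required_supply_amps(rms_watts, supply_volts=12.0, efficiency=1.0):
    """Continuous current the supply must deliver for a given RMS output.

    efficiency=1.0 reproduces the simple fuse logic; a real amp wastes
    some input power as heat, so its actual draw is higher.
    """
    input_watts = rms_watts / efficiency
    return input_watts / supply_volts

# Ignoring efficiency, the numbers line up with the 125A fuse:
print(required_supply_amps(1500))  # 125.0

# With an assumed 80% efficiency, the draw climbs past the fuse rating:
print(round(required_supply_amps(1500, efficiency=0.8), 2))  # 156.25
```

If the efficiency assumption is anywhere near right, the fuse rating is roughly the worst-case draw at a nominal 12 V, which is why sizing the supply at the fuse value is at least in the right ballpark.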
What about a car battery, though? What is its continuous amp rating while sitting at 99% charge? Extremely few people seem to know! I recently came across a formula:
Continuous amps = Amp Hour rating / Charge-Discharge Time (hours)
This is supposed to be the continuous current of a battery... but wait... I don't know if the battery I have even supplies 125A continuously! (insert dilemma of in-line fuses)
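The formula above is a one-liner, so here's a sketch with made-up example numbers (the 60 Ah rating and 1-hour discharge are my assumptions, not specs from any real battery). Note that real batteries deliver less capacity at high discharge rates, so this is optimistic.

```python
def battery_continuous_amps(amp_hours, discharge_hours):
    """Continuous amps = amp-hour rating / charge-discharge time (hours)."""
    return amp_hours / discharge_hours

# A hypothetical 60 Ah battery drained over 1 hour:
print(battery_continuous_amps(60, 1))  # 60.0

# The same battery drained gently over 20 hours:
print(battery_continuous_amps(60, 20))  # 3.0
```

By this math, a typical car battery rated well under 125 Ah can't hold 125 A for a full hour, which is exactly the dilemma.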
I've seen people on Youtube who use battery chargers rated at 5 amps to power high-end amplifiers in their home. What sense does this make? Charger: "Here you go, Mr. Amp! Here is (12 V × 5 A) 60 W of power!"
Amp: "Yay! Wait... where are the other 1440 watts? I'm starving!"
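Just to spell out the arithmetic in that exchange:

```python
supply_watts = 12 * 5   # a 5 A battery charger at a nominal 12 V
demand_watts = 1500     # the amp's rated RMS output

print(supply_watts)                  # 60
print(demand_watts - supply_watts)   # 1440  <- the "missing" watts
```

A 5 A charger can only float the battery between bass hits; the battery itself has to supply the peaks.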
So back to my original question: What is the best way to determine how many continuous amps you need from a power supply to power an amp at home with given parameters?
Is my logic right on the dot, or am I completely lost in thought? And don't tell me to just use a home theater amp; there is not a single one in this world that will push 1500W RMS+ at 2 ohms with a low-pass filter.