Power Supply

What you need to know about the most important part of your computer.

An explanation of why the power supply is so important, and why wattage matters more than you thought.

Episode #4-14 released on December 15, 2013


When it comes to our computers, we may consider the processor, RAM, graphics card, or even the hard drives to be the most important part. The fact is, the power supply is the single most important piece of hardware in your computer; without it, there is no computer, just a thousand-dollar paperweight.

Today, you will learn how a power supply works, how to save money by buying an efficient power supply, how to determine how many watts you should actually purchase for your computer, and what happens when the power supply doesn't deliver enough power to your computer.

Power supplies work by converting the alternating current from your wall outlet into direct current that can be used by your computer's components. A power supply converts the AC power into 3.3, 5, and 12 volts of DC power. The 3.3 and 5 volt lines are for your standard electronic components, and the 12 volt lines allow your fans, hard drives, and optical drives to function correctly.
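The rail layout described above can be summarized in a minimal sketch. This is just an illustrative grouping of the voltages mentioned in the text, not a power supply specification:

```python
# Illustrative grouping of the DC rails described above: 3.3 V and 5 V
# feed standard electronic components, while the 12 V line feeds fans,
# hard drives, and optical drives.
DC_RAILS = {
    3.3: ["standard electronic components"],
    5.0: ["standard electronic components"],
    12.0: ["fans", "hard drives", "optical drives"],
}

def rail_for(device):
    """Return the rail voltages whose load list mentions the device."""
    return [volts for volts, loads in DC_RAILS.items() if device in loads]

print(rail_for("fans"))  # fans run off the 12 volt line: [12.0]
```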

Now, our power supply does a lot for our computers, but where it shines is in how it handles the power loads required by our equipment. Our graphics cards, sound cards, hard drives, optical drives, processors, fans, USB devices, etc., all need power. A properly built machine can always deliver enough power to make the computer function flawlessly. To properly power the computer, you need a power supply that can deliver enough wattage: the more wattage a computer needs, the more wattage the power supply must be able to produce in order to allow for proper power flow and functioning of each of the components within the computer.

If a power supply reaches the upper limit of the number of watts it can produce, it can make the computer extremely unstable and cause many undesirable side effects: spontaneous reboots, devices failing to function while others are in use, fans failing to spin, and so on. In some extreme, and usually rare, cases, devices that require a lot of power to run, like graphics cards, some sound cards, even the motherboard, can be damaged beyond repair. Because of this possibility, it is extremely important to read the user manual of each device to determine how much wattage your power supply has to be able to deliver in order to both power and protect your computer.

The first place to read about power requirements is the motherboard manual; different manufacturers use different technologies that require more or less wattage to power the board correctly. Your motherboard manual may also contain suggested power supply wattages for single or multiple graphics card operation, as well as detailed steps on auxiliary power plugs you may have to use to provide sufficient power to both the motherboard and the graphics cards. Failing to follow your motherboard's instructions may result in shorting out the motherboard.

In some cases, you may have a power-hungry graphics card, which may require a power supply with significantly more wattage. If your motherboard manual assumes a 600 watt power supply, of which 300 watts would go to the graphics card, but your graphics card actually needs 450 watts, you should be looking at a power supply of 750 watts or more to power both the graphics card and the motherboard.
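That arithmetic can be spelled out step by step. The numbers below are the article's worked example, not measurements from real hardware:

```python
# Worked example from the text: the motherboard manual assumes a 600 watt
# supply with 300 watts of that budgeted for the graphics card, but the
# card you actually own needs 450 watts.
assumed_supply = 600      # watts the motherboard manual assumes
assumed_gpu_budget = 300  # watts of that budget meant for the graphics card
actual_gpu_need = 450     # watts the graphics card really needs

# The rest of the system still needs everything outside the GPU budget.
rest_of_system = assumed_supply - assumed_gpu_budget   # 300 watts
required_supply = rest_of_system + actual_gpu_need     # 300 + 450

print(required_supply)  # 750 -> shop for a 750 watt or larger supply
```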

Now, should you be buying a larger power supply for your computer?

If we ignore the fact that an overly capable power supply will simply dissipate the unused capacity as heat, and the fact that the computer only draws the power it requires from the power supply, should you buy a power supply that is far more powerful than your computer needs? No. If you have a standard computer that would work fine with a 500 or 600 watt power supply, don't buy a 1000 watt power supply. It won't be efficient because you will never put it under load, higher wattage power supplies cost more, and if you don't need such a powerful supply, you can easily put those extra dollars into better hard drives, SSDs, or even RAM.

The only conceivable reasons you would need a bigger power supply are that you intend to overclock your computer, run multiple graphics cards, build large RAID arrays, etc. If you have a normal graphics card, a single hard drive, and nothing fancy, the investment is not worth the heat dissipation. Normal practice is to calculate the amount of required power and add 20% to allow for extra power usage by devices such as your graphics card. If your computer needs a total of 500 watts to function, 20% of that is 100 watts, so you should use a 600 watt power supply to cover your possible needs.
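The 20% rule of thumb above is easy to express as a small helper. This is a sketch of the article's rule, with the headroom fraction as an adjustable assumption:

```python
def recommended_wattage(total_draw_watts, headroom=0.20):
    """Apply the rule of thumb from the text: required draw plus ~20%
    headroom for devices, such as the graphics card, that may pull more."""
    return total_draw_watts * (1 + headroom)

# The article's example: a 500 watt system plus 20% headroom.
print(recommended_wattage(500))  # 500 + 100 = 600.0 watts
```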

Now, a fun historical fact many of you may not remember, or may not have been around for, is that we used to shut our computers down manually. Older computers had to wait for the user to press the power switch in order to turn off completely. In addition to the power button used to turn the computer on, the operating system can now talk to the power supply and request that it drop into a lower power state, a soft-off mode that we normally consider "off," even though the computer is still technically receiving power.

Host : Steve Smith | Music : Jonny Lee Hart | Editor : Steve Smith | Producer : Zed Axis Productions
