A 630 W power supply? Too weak if you want to add any power-hungry video card.
Former user
wrote on 10/31/2011, 5:58 PM
Red Prince is right...minimum for the newer video cards is a 750w power supply. The ATI 6990 and NVidia GTX 590 both use about 385 watts of power - that's JUST the video cards.
Aside from that, 1 TB isn't much drive space (drop in at least another TB, or two more and set up a RAID 0 array).
4GB of memory is minimal. Memory is cheap these days...at LEAST double it. My "small" system has 12GB...my big one 16GB.
A higher-end video card will add another $400-500 to the system price.
Your specs do not include a video card. Also, hard drive prices are at an all-time high right now due to the flooding in Thailand, so if you can reuse existing hard drive(s), I suggest the following for a budget Vegas system:
$320 Sandy Bridge I7 2600K
$120 P67 motherboard
$ 60 8GB memory
$110 Nvidia GTX 550 Fermi
$100 Corsair 750W
$ 60 Antec 300 Case
------ Use existing Hard Drives
------ Assume using existing OS & Vegas license
TOTAL: $775 - $800 (with video card, without hard drive or OS...)
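For what it's worth, the line items above can be sanity-checked with a quick sum (prices as quoted in the post; the quoted $775-$800 total presumably allows for tax/shipping):

```python
# Sum the budget build's parts list (prices from the post above).
parts = {
    "i7-2600K": 320,
    "P67 motherboard": 120,
    "8GB memory": 60,
    "GTX 550 Fermi": 110,
    "Corsair 750W PSU": 100,
    "Antec 300 case": 60,
}
total = sum(parts.values())
print(total)  # 770
```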
IMO, this will give you rendering speeds very close to those of us who plopped down $1000 for the first 6-core i7 last year... Even without a fast video card the 2600K will give you great previews when working with HD video... Lots of others are talking about the 2600K.
Where do you get this 750 watt continuous power consumption? My 980X six-core machine, when overclocked to 4 GHz for rendering with all cores at 100%, plus 6 hard drives, two 26" monitors and two Nvidia GTS 450 cards, only draws a measured 480 watts. At idle with stock clock speeds, the system draws 363 watts.
A standard-size microwave draws well over 1000 watts. A quick look at 12,000 BTU "high efficiency" air conditioners shows their power consumption averaging around 1,100 watts.
I have just built a system with a 700 W supply; this is sufficient for most. You would only need more if you were using dual-GPU GTX 590 cards in SLI, or equivalent ATI cards, although those tend to draw less current.
There is no point in dual-GPU cards, SLI, or CrossFire in an editing situation, as neither Sony nor Adobe utilises more than a single GPU, even if both GPUs are on the same card.
I have been looking into that and done some research.
I agree with the specs, although I'd go for the 2700K CPU as it's about the same money and slightly faster.
Here is a screen cap of what the CPUID Hardware Monitor is showing with a computer that runs 24/7. My computer has a 750 W power supply. Two hard drives inside the case, bunch more outside. An old CD/DVD burner and a relatively new BD burner, both inside. Plus the NVIDIA GT 430.
Total power consumption is shown in this image. I have circled where it is showing the CPU power and the total package power. The left value is the current (i.e., right now, not electric current) value, the middle value is the minimum value, the value on the right is the maximum value since I last turned the computer on.
I’d say 130 W maximum is a lot less than either a microwave oven or an air conditioner uses.
Just because a computer has a 750 W power supply does not mean it consumes 750 W continuously. It only means the power supply can deliver up to 750 W.
Now, more powerful GPUs require a lot more power than my lowly GT 430, let alone the CPU. That is because they have hundreds of cores, and each core has to consume some power.
Now I see I need to figure out how to lower my CPU temperatures somehow. But that is another topic.
I concede that maximum PS current is not going to be continuously drawn. I did want to stimulate some discussion on this because running a home computer generates wasted heat and adds $$ to your power bill, and the voluntary Energy Star ratings haven't done much to change that.
But putting my rant in perspective, a microwave runs at full power ~10 min at a time, has a low duty cycle in the home, and draws next to nothing on standby. A big room air conditioner is going to run a ~30% duty cycle during the heat of the summer, so it's not drawing full power continuously, either.
Using John Cline's numbers as a reference, his system rendering 24 hours continuously is still going to draw nominally ~40% more power than the air conditioner, at a hypothetical cost of $1.36 a day.
Leaving John's system idling 24/7 will cost him just over $1 a day, or about $380 per year (at a net $0.12/kWh rate).
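To show where those dollar figures come from, here is the arithmetic spelled out, using the numbers from this thread (480 W rendering, 363 W idle, $0.12/kWh; these are John's measurements and my rate assumption, not universal figures):

```python
# Rough daily/annual electricity cost for a PC left on 24/7.
RATE_PER_KWH = 0.12  # assumed utility rate in $/kWh

def daily_cost(watts, hours=24.0, rate=RATE_PER_KWH):
    """Dollar cost of drawing `watts` continuously for `hours`."""
    return watts / 1000.0 * hours * rate

print(f"Rendering 24/7: ${daily_cost(480):.2f}/day")
print(f"Idle 24/7:      ${daily_cost(363):.2f}/day, "
      f"${daily_cost(363) * 365:.0f}/year")
```

That works out to roughly $1.38/day rendering and about $1.05/day (~$382/year) idle, in line with the "just over $1 a day" and "$380 per year" figures above.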
My understanding is that as computing speeds go up, so do currents [corrected] in order to keep the errors down. Kind of like putting 500+ hp engines in cars so they will go faster, which was the biggest fad of the 1960s. Does hungrier = faster? Yes, but is there a more cost-efficient way? By (unfair) comparison, the lowly Z80 was designed to run on only a few mA and just got warm to the touch. CPU fans in home computers only began to appear in the mid-1990s.
John Cline, your numbers make sense. Did you actually measure the continuous power line current to each device using an Amprobe or Killawatt? Or did you calculate the current using nameplate numbers and internal readings?
My understanding is that as computing speeds go up, so do voltages in order to keep the errors down.
Sorry, but no. Voltages have been the same for decades now: 12 V, 5 V, 3.3 V. Or maybe not, as I do not remember the 3.3 V rail in the ’80s (though that does not mean it was not used, I just do not remember it). If anything, many new digital devices require lower voltages than they did decades ago. That is what made laptops possible.
Some of the devices, such as the more powerful GPUs, draw more current (I, Amperes), but not more voltage (V, Volts). But not all of them. The most power (P, Watts) is consumed when playing games on a powerful GPU. Hopefully, no one plays games 24/7. :) Using the GPU as a PPU (parallel processing unit), which is what we do in Vegas and other CUDA/OpenCL software, does not require as much current as playing games.
P = IV, so yes, you can consume more power with the same voltage. It is the current that makes the difference in computers, not voltage.
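The point about current, not voltage, making the difference can be illustrated with P = IV. The numbers below are purely illustrative, not specs for any particular card:

```python
# P = I * V: at a fixed rail voltage, power scales with current drawn.
def power_watts(current_amps, volts):
    """Electrical power in watts from current (A) and voltage (V)."""
    return current_amps * volts

# Hypothetical GPU loads on the same 12 V rail:
light_load = power_watts(5.0, 12.0)    # e.g. desktop / CUDA rendering
gaming_load = power_watts(15.0, 12.0)  # heavier gaming load
print(light_load, gaming_load)  # 60.0 180.0
```

Same 12 V rail in both cases; tripling the current triples the power, which is exactly why gaming loads heat up a GPU far more than typical editing work.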
Now, some gamers increase the voltage of their GPUs, which draws more power and also shortens the life of the GPU. But I would hope no video editor goes that way.
As for it costing money, well yes. Even breathing costs money (since it makes you hungry and you have to buy your food). Most people do not keep their computers on 24/7. I am one of those who do because I have my personal web sites on my computer. But most do not.
Correct. I've fixed my typo.
Well familiar with the power formula, although the role of current is a common source of confusion. Increased current or increased voltage both increase power consumption. The former implies lower resistance; the latter does not.
Unfortunately, the myth that keeping your computer on 24/7 will prolong its life persists, and many will do so for the remainder of their lives.
"Did you actually measure the continuous power line current to each device using an Amprobe or Killawatt?"
Yes, those are actual measurements. The idle power could be lower if I allowed the drives to spin down when they're not being used. It would be lower still if I had the monitors turn off when not in use. Although, I do turn them off for long, overnight renders.
As far as the cost is concerned, even at $2/day, that's still cheaper than a cup of coffee at Starbucks. If my machine is on, it's generating income and $2 represents less than 1 minute at my hourly rate.