I NEED YOUR OPINIONS!

Comments

Red Prince wrote on 10/31/2011, 1:44 PM
Sorry, I refuse to judge a system based on its photograph.

He who knows does not speak; he who speaks does not know.
                    — Lao Tze in Tao Te Ching

Can you imagine the silence if everyone only said what he knows?
                    — Karel Čapek (The guy who gave us the word “robot” in R.U.R.)

Red Prince wrote on 10/31/2011, 4:27 PM
A 630 W power supply? Too weak if you want to add any power-hungry video card.

Former user wrote on 10/31/2011, 5:58 PM
Red Prince is right... the minimum for the newer video cards is a 750 W power supply. The ATI 6990 and NVIDIA GTX 590 both use about 385 watts of power, and that's JUST the video cards.

Aside from that, 1 TB isn't much drive space (drop in at least another terabyte, or two more and set up a RAID 0 array).

4 GB of memory is minimal. Memory is cheap these days... at LEAST double it. My "small" system has 12 GB; my big one, 16 GB.

A higher-end video card will add another $400-500 to the system price.

TheRhino wrote on 10/31/2011, 6:11 PM
Your specs do not include a video card. Also, hard drive prices are at an all-time high right now due to the flooding in Thailand, so if you can use existing hard drive(s), I suggest the following for a budget Vegas system:

$320 Sandy Bridge i7-2600K
$120 P67 motherboard
$ 60 8GB memory
$110 Nvidia GTX 550 Fermi
$100 Corsair 750W
$ 60 Antec 300 Case
------ Use existing Hard Drives
------ Assume using existing OS & Vegas license

TOTAL: $775 - $800 (with video card, without hard drive or OS...)

IMO, this will give you rendering speeds very close to those of us who plopped down $1,000 for the first 6-core i7 last year... Even without a fast video card, the 2600K will give you great previews when working with HD video... Lots of others are talking about the 2600K.

Workstation C with $600 USD of upgrades in April, 2021
--$360 11700K @ 5.0 GHz
--$200 ASRock W480 Creator (onboard 10G net, TB3, etc.)
Borrowed from my 9900K until prices drop:
--32GB of G.Skill DDR4 3200 ($100 on Black Friday...)
Reused from same Tower Case that housed the Xeon:
--Used VEGA 56 GPU ($200 on eBay before mining craze...)
--Noctua Cooler, 750W PSU, OS SSD, LSI RAID Controller, SATAs, etc.

Performs VERY close to my overclocked 9900K (below), but at stock settings with no tweaking...

Workstation D with $1,350 USD of upgrades in April, 2019
--$500 9900K @ 5.0 GHz
--$140 Corsair H150i liquid cooling with 360mm radiator (3 fans)
--$200 open box Asus Z390 WS (PLX chip manages 4/5 PCIe slots)
--$160 32GB of G.Skill DDR4 3000 (added another 32GB later...)
--$350 refurbished, but like-new Radeon Vega 64 LQ (liquid cooled)

Renders the Vegas 11 "Red Car Test" (AMD VCE) in 13 s when clocked at 4.9 GHz
(note: BOTH onboard Intel & Vega64 show utilization during QSV & VCE renders...)

Source Video1 = 4TB RAID0--(2) 2TB M.2 on motherboard in RAID0
Source Video2 = 4TB RAID0--(2) 2TB M.2 (1) via U.2 adapter & (1) on separate PCIe card
Target Video1 = 32TB RAID0--(4) 8TB SATA hot-swap drives on PCIe RAID card with backups elsewhere

10G Network using $30 used Mellanox2 Adapters & QNAP QSW-M408-2C 10G Switch
Copy of Work Files, Source & Output Video, OS Images on QNAP 653b NAS with (6) 14TB WD RED
Blackmagic DeckLink PCIe card for capturing from tape, etc.
(2) internal BR Burners connected via USB 3.0 to SATA adapters
Old Cooler Master CM Stacker ATX case with (13) 5.25" front drive-bays holds & cools everything.

Workstations A & B are the 2 remaining 6-core 4.0ghz Xeon 5660 or I7 980x on Asus P6T6 motherboards.

$999 Walmart Evoo 17 Laptop with i7-9750H 6-core CPU, RTX 2060, (2) M.2 bays & (1) SSD bay...

ciarrochi wrote on 11/2/2011, 10:02 PM
I should have been more clear. Here is what I have in my current PC that I will use in the new one...(copied from Belarc Advisor)

ATI Radeon HD 3600 Series [Display adapter]
Windows 7 Professional Service Pack 1 (build 7601)
Plus two new WD 2.0 TB drives
musicvid10 wrote on 11/2/2011, 10:32 PM
My opinion?
I think it's a shame when computer chips and GPUs draw more wattage than it takes to cook a frozen dinner in my microwave oven.

And rather than being on for ten minutes, these beasts run 24/7 in many instances, drawing full power continuously for multi-day renders.

The computer industry should be shamed into producing energy-efficient hardware, the way the refrigeration industry was decades ago.

Let's do some simple math:
750 watts × 24 hours = 18 kWh; 18 kWh × $0.12 per kWh (all conservative numbers) =
$2.16 per day

By contrast, my 12,000 BTU room air conditioner, running 24/7, costs just under $1 a day at 95-100 °F daytime temps. Documented this past July.

Disgraceful.
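
For anyone who wants to check that arithmetic, here is a minimal Python sketch (the $0.12/kWh rate, the 750 W draw, and the ~$1/day air-conditioner figure are from this thread; the helper names are mine):

```python
# Sanity-check the cost math above ($0.12/kWh, running 24 hours a day).
RATE = 0.12  # $ per kWh, the "conservative" rate quoted above

def cost_per_day(watts, hours=24.0, rate=RATE):
    """Daily cost in dollars for a constant draw of `watts`."""
    return watts / 1000.0 * hours * rate

def average_watts(dollars_per_day, hours=24.0, rate=RATE):
    """Back out the average draw implied by a daily cost."""
    return dollars_per_day / rate / hours * 1000.0

print(cost_per_day(750))    # 2.16   -> the $2.16/day figure checks out
print(average_watts(1.00))  # ~347 W -> average draw implied by a ~$1/day A/C
```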


John_Cline wrote on 11/2/2011, 11:59 PM
Where do you get this 750-watt continuous power consumption? My 980X six-core machine, when overclocked to 4 GHz for rendering with all cores at 100%, plus 6 hard drives, two 26" monitors and two NVIDIA GTS 450 cards, only draws a measured 480 watts. At idle with stock clock speeds, the system draws 363 watts.

A standard-size microwave draws well over 1,000 watts. A quick look at 12,000 BTU "high efficiency" air conditioners shows their power consumption averaging around 1,100 watts.
photoscubaman wrote on 11/3/2011, 3:07 AM
I agree with John.

I have just built a system with a 700 W supply; this is sufficient for most. You would only need more if you were running dual-GPU GTX 590 cards in SLI.

Or equivalent ATI cards, although they tend to draw less current.

There is no point in dual-GPU cards, SLI, or CrossFire in an editing situation, as neither Sony nor Adobe utilises more than a single GPU, even when two sit on the same card.

I have been looking into that and done some research.

I agree with the specs, although I'd go for the 2700K CPU as it's about the same money and slightly faster.
Red Prince wrote on 11/3/2011, 11:39 AM
Here is a screen cap of what the CPUID Hardware Monitor is showing on a computer that runs 24/7. My computer has a 750 W power supply. Two hard drives inside the case, a bunch more outside. An old CD/DVD burner and a relatively new BD burner, both inside. Plus the NVIDIA GT 430.

Total power consumption is shown in this image. I have circled where it shows the CPU power and the total package power. The left value is the current reading (current as in right now, not electric current), the middle is the minimum, and the right is the maximum since I last turned the computer on.

I’d say 130 W maximum is a lot less than either a microwave oven or an air conditioner uses.

Just because a computer has a 750 W power supply does not mean it consumes 750 W around the clock. It only means the power supply can deliver up to 750 W.

Now, more powerful GPUs require a lot more power than my lowly GT 430, and more than the CPU as well. That is because they have hundreds of cores, and each core has to consume some power.

Now I see I need to figure out how to lower my CPU temperatures somehow. But that is another topic.

musicvid10 wrote on 11/3/2011, 12:31 PM
I concede that maximum PSU current is not going to be continuously drawn. I did want to stimulate some discussion on this because running a home computer generates wasted heat and adds $$ to your power bill, and the voluntary Energy Star ratings haven't done much to change that.

But putting my rant in perspective, a microwave runs at full power for ~10 minutes at a time, has a low duty cycle in the home, and draws next to nothing on standby. A big room air conditioner runs at a ~30% duty cycle during the heat of the summer, so it's not drawing full power continuously, either.

Using John Cline's numbers as a reference, his system rendering 24 hours continuously is still going to draw nominally ~40% more power than the air conditioner, at a hypothetical cost of $1.38 a day.

Leaving John's system idling 24/7 will cost him just over $1 a day, or about $380 per year (at a net $0.12/kWh rate).
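
Plugging John's measured draws into the same arithmetic confirms these figures. A minimal Python sketch (the 480 W / 363 W numbers are John's measurements above; the 1,100 W and ~30% duty cycle are the air-conditioner figures from this thread):

```python
# Check the figures above: John's measured 480 W rendering / 363 W idle,
# versus an 1,100 W room air conditioner at a ~30% summer duty cycle.
RATE = 0.12  # $ per kWh, the rate used throughout this thread

def daily_cost(watts, duty_cycle=1.0, rate=RATE):
    """Cost per day, in dollars, at the given average duty cycle."""
    return watts / 1000.0 * 24.0 * duty_cycle * rate

print(daily_cost(480))        # 1.3824 -> ~$1.38/day rendering around the clock
print(daily_cost(363))        # 1.0454 -> ~$1.05/day idling, ~$382/year
print(daily_cost(1100, 0.3))  # 0.9504 -> just under $1/day for the A/C
```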

My understanding is that as computing speeds go up, so do currents [corrected] in order to keep the errors down. Kind of like putting 500+ hp engines in cars so they will go faster, which was the biggest fad of the 1960s. Does hungrier = faster? Yes, but is there a more cost-efficient way? By (unfair) comparison, the lowly Z80 was designed to run on only a few mA and just got warm to the touch. CPU fans in home computers only began to appear in the mid-1990s.

John Cline, your numbers make sense. Did you actually measure the continuous power-line current to each device using an Amprobe or Kill A Watt? Or did you calculate the current using nameplate numbers and internal readings?
Red Prince wrote on 11/3/2011, 1:30 PM
"My understanding is that as computing speeds go up, so do voltages in order to keep the errors down."

Sorry, but no. Voltages have been the same for decades now: 12 V, 5 V, 3.3 V. Or maybe not, as I do not remember the 3.3 V rail in the ’80s (though that does not mean it was not used; I just do not remember it). If anything, many new digital devices require less voltage than they did decades ago. That is what made laptops possible.

Some of the devices, such as the more powerful GPUs, draw more current (I, amperes), but not at a higher voltage (V, volts). But not all of them. The most power (P, watts) is consumed when playing games on a powerful GPU. Hopefully, no one plays games 24/7. :) Using the GPU as a PPU (parallel processing unit), which is what we do in Vegas and other CUDA/OpenCL software, does not require as much current as playing games.

P = IV, so yes, you can consume more power with the same voltage. It is the current that makes the difference in computers, not voltage.
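
To make that concrete, a toy Python illustration (the 12 V figure is the standard ATX rail; the current values are made up for illustration):

```python
# P = I * V: on a fixed 12 V rail, power scales with current alone.
# Illustrative currents, not measurements of any particular card.
V = 12.0  # volts, the ATX 12 V rail
for amps in (3.0, 15.0, 30.0):
    print(f"{amps:4.0f} A x {V:.0f} V = {amps * V:4.0f} W")
# Same voltage every time; only the current (and hence the power) differs.
```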

Now, some gamers increase the voltage of their GPUs, which draws more power and also shortens the life of the GPU. But I would hope no video editor goes that way.

As for it costing money, well yes. Even breathing costs money (since it makes you hungry and you have to buy your food). Most people do not keep their computers on 24/7. I am one of those who do because I have my personal web sites on my computer. But most do not.

musicvid10 wrote on 11/3/2011, 1:36 PM
Correct. I've fixed my typo.
I'm well familiar with the power formula, although the role of current is a common source of confusion. Increased current or voltage increases power consumption; the former implies lower resistance, the latter does not.

Unfortunately, the myth that keeping your computer on 24/7 will prolong its life persists, and many will do so for the remainder of their lives.
John_Cline wrote on 11/3/2011, 3:37 PM
"Did you actually measure the continuous power line current to each device using an Amprobe or Killawatt?"

Yes, those are actual measurements. The idle power could be lower if I allowed the drives to spin down when they're not being used. It would be lower still if I had the monitors turn off when not in use, although I do turn them off for long, overnight renders.

As far as the cost is concerned, even at $2/day, that's still cheaper than a cup of coffee at Starbucks. If my machine is on, it's generating income, and $2 represents less than 1 minute at my hourly rate.