Dual Graphics Cards vs. Dual DVI Single Card

CinemaPete wrote on 2/19/2013, 10:39 AM
Please advise: I know Vegas Pro can work with dual monitors. However, there are a number of ways to set up a dual-monitor configuration. One way is to use a single graphics card with dual DVI ports; another is to use two separate graphics cards, each with its own DVI port. I want my preview window on its own monitor with nothing else there, and the rest of the Vegas interface on the primary monitor. I need a definitive answer on the optimal way to do this. Thanks

Comments

OldSmoke wrote on 2/19/2013, 10:54 AM
It depends on your system's motherboard. If your motherboard provides two true PCIe x16 slots, meaning that with both cards installed you still get x16 on each (not x16 & x8, or worse, x8 & x8), then you "can" use two separate cards. However, from personal experience with a GTX570 and a GTX560Ti in the same system, there is no difference in speed or otherwise based on where I connect the "External Monitor". I use two cards because the i7-3930K doesn't have an integrated GPU and I have 3 monitors. Important, and I am speaking only about NVIDIA here: use a card from the GTX x6x tier, i.e. a 460 or higher, or a 560 or higher. The x5x series doesn't really improve performance, and the GTX6xx series isn't fully supported from what I can read in the forum here.
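To put rough numbers on the x16-vs-x8 concern, here is a quick back-of-the-envelope sketch. It assumes PCIe 2.0 (the generation current for these boards) at its approximate per-lane rate of ~500 MB/s each direction; real-world throughput is lower:

```python
# Approximate one-direction PCIe 2.0 throughput: ~500 MB/s per lane.
# (PCIe 1.x is roughly half that.) These are ballpark figures only.
MB_PER_LANE_PCIE2 = 500

def pcie2_bandwidth_mb(lanes):
    """Approximate one-direction PCIe 2.0 bandwidth in MB/s."""
    return lanes * MB_PER_LANE_PCIE2

print(pcie2_bandwidth_mb(16))  # x16 slot: 8000 MB/s
print(pcie2_bandwidth_mb(8))   # x8 slot: 4000 MB/s
```

So an x8 slot halves the theoretical bus bandwidth, though whether an NLE workload ever saturates even x8 is a separate question.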

Proud owner of Sony Vegas Pro 7, 8, 9, 10, 11, 12 & 13 and now Magix VP15&16.

System Spec.:
Motherboard: ASUS X299 Prime-A

Ram: G.Skill 4x8GB DDR4 2666 XMP

CPU: i7-9800x @ 4.6GHz (custom water cooling system)
GPU: 1x AMD Vega Pro Frontier Edition (water cooled)
Hard drives: System Samsung 970Pro NVME, AV-Projects 1TB (4x Intel P7600 512GB VROC), 4x 2.5" Hotswap bays, 1x 3.5" Hotswap Bay, 1x LG BluRay Burner

PSU: Corsair 1200W
Monitor: 2x Dell Ultrasharp U2713HM (2560x1440)

scrubus wrote on 2/19/2013, 11:01 AM
I use an SLI system and run both monitors off one card in SLI mode. Works great for me.
2 - GeForce 8800GTX 768 mb 384 bit DDR3 PCI express
Intel Core2 Quad Q6600 Kentsfield 2.4 GHZ
Windows XP
CinemaPete wrote on 2/19/2013, 11:10 AM
Thanks OldSmoke for your reply. I should clarify my question somewhat: I can purchase a system with two PCIe x16 slots and populate them with two separate graphics cards. The other option is to purchase a single graphics card with dual DVI ports. In the latter case, a dual-DVI card will generally take up the space of two cards, so even if the MB has two PCIe x16 slots, using the single dual-DVI card renders the 2nd PCIe x16 slot unusable. But either config would support two monitors. My concern is not so much speed as being able to see the VP Preview Window on a separate monitor (whether I'm using two separate cards or a single card with dual DVI ports). So, I guess my question should have been: do I need two separate cards to see the VP preview window on its own monitor, or can that also work using a single graphics card with dual DVI ports? Thanks again.
OldSmoke wrote on 2/19/2013, 11:37 AM
The short answer is yes, you can use a single Dual DVI port card such as the GTX570 I use. As for motherboards that support real dual PCIe x16; from my knowledge it has to be a socket 2011 or 1366 board. There is only one socket 1155 board that I know of supporting x16,x16. I recently changed to a socket 2011 for that reason and now have both cards running at full bandwidth.

As for SLI: I did extensive testing with two GTX460 about a year ago and found no improvement at all over a single GTX460 on my system; in fact rendering was a bit slower in this SLI setup.


CinemaPete wrote on 2/19/2013, 12:07 PM
Thanks again OldSmoke.
CinemaPete wrote on 2/19/2013, 12:16 PM
Thanks Scrubus for your reply: I checked on the availability of the card you're using and it seems to no longer be available, but of course there are other, more recent cards that will also work. My main concern was being able to see the Preview Window on a separate monitor, and Sony confirmed to me that either a dual-graphics-card system or a single card with dual DVI will allow that. Thanks.
wwaag wrote on 2/19/2013, 2:43 PM
You might also consider an Nvidia 6-series card. I have the 650--about $130. The beauty of this card is that it supports up to 4 monitors. At the moment, it drives two Dell U2412 monitors using DVI and also a 40-inch TV monitor using HDMI. For Vegas, it's really great that I can set the external preview to the large TV in addition to the two desktop monitors. Since GPU acceleration in Vegas is problematic, I went with the bottom-of-the-line card.

wwaag

AKA the HappyOtter at https://tools4vegas.com/. System 1: Intel i7-8700k with HD 630 graphics plus an Nvidia RTX4070 graphics card. System 2: Intel i7-3770k with HD 4000 graphics plus an AMD RX550 graphics card. System 3: Laptop. Dell Inspiron Plus 16. Intel i7-11800H, Intel Graphics. Current cameras include Panasonic FZ2500, GoPro Hero11 and Hero8 Black plus a myriad of smartPhone, pocket cameras, video cameras and film cameras going back to the original Nikon S.

OldSmoke wrote on 2/19/2013, 2:51 PM
Are you suggesting a GTX6xx series card and switching GPU acceleration off? I personally played with that thought, but seeing how well GPU acceleration works on my system with the GTX5xx series, I dropped that idea and decided on two GPUs; it also has the advantage of supporting a maximum of 4 monitors. GTX560 and 570 cards are hard to find new, but there are plenty on eBay.


CinemaPete wrote on 2/19/2013, 3:01 PM
Thanks for the suggestions wwaag: I googled "NVidia 650" and it comes up with the 650M and the GTX 650, so I'm not sure which card you're actually referring to, but I did find the EVGA GTX 650 if that's an equivalent (?).
wwaag wrote on 2/19/2013, 4:33 PM
Here is the card I use.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814125444


OldSmoke wrote on 2/19/2013, 6:53 PM
wwaag

This GTX650 card has only 384 CUDA cores versus 480 for the GTX570. The core clock is higher, but the bus width is only 128-bit versus 320-bit; even the 560Ti has a 256-bit bus.
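Bus width translates to memory bandwidth once you factor in the memory clock. A quick sketch, using the reference-card effective GDDR5 rates as assumed inputs (5000 MT/s for the GTX 650, 3800 MT/s for the GTX 570; vendor cards vary):

```python
# Theoretical memory bandwidth = (bus width in bytes) * effective
# transfer rate. Clocks below are reference-card assumptions.
def mem_bandwidth_gb(bus_bits, effective_mts):
    """Peak memory bandwidth in GB/s for a given bus width (bits)
    and effective memory transfer rate (MT/s)."""
    return (bus_bits / 8) * effective_mts * 1e6 / 1e9

gtx650 = mem_bandwidth_gb(128, 5000)  # ~80 GB/s
gtx570 = mem_bandwidth_gb(320, 3800)  # ~152 GB/s
print(gtx650, gtx570)
```

So despite the 650's faster memory clock, the narrow bus leaves it with roughly half the 570's bandwidth, which matters for GPU-accelerated video effects.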


TheRhino wrote on 2/19/2013, 9:04 PM
Installing (2) GTX 570 cards to drive 3+ displays is not only expensive up-front, but they will also increase the heat in your system, the heat in your studio/room, and drive up the utility bill. I like the idea of going with a 4-output $130 6xx card and leaving GPU rendering OFF, or waiting until the next generation of GPUs is released later this year.
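The utility-bill point is easy to estimate. A rough sketch using the reference TDPs (GTX 570 ~219 W, GTX 650 ~64 W); the hours-per-day and electricity rate are placeholder assumptions, and real draw under NLE load is below TDP:

```python
# Rough monthly running cost of a GPU setup at a given wattage.
# hours_per_day and usd_per_kwh are placeholder assumptions.
def monthly_cost_usd(watts, hours_per_day=8, usd_per_kwh=0.12, days=30):
    kwh = watts / 1000 * hours_per_day * days
    return kwh * usd_per_kwh

two_570s = monthly_cost_usd(2 * 219)  # dual GTX 570, ~438 W TDP
one_650 = monthly_cost_usd(64)        # single GTX 650, ~64 W TDP
print(round(two_570s, 2), round(one_650, 2))
```

Even at worst-case TDP the difference is on the order of $10/month, so heat and noise are arguably the bigger practical costs.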

Let's face it, GPU rendering is broken in Vegas and does not lead to increased productivity. The time wasted fiddling to get it to work is not repaid by the time saved from faster rendering with select codecs.

My next Vegas workstation will concentrate on the fastest non-Xeon CPUs (hopefully an 8-core Sandy Bridge by then), and the GPU will simply be something I know is reliable. IMO, if Intel releases an 8-core Sandy Bridge for Socket 2011 that overclocks in the 4.5+ GHz range, it will outperform the next generation of GPUs, plus all applications will benefit, not just render speeds...

Workstation C with $600 USD of upgrades in April, 2021
--$360 11700K @ 5.0ghz
--$200 ASRock W480 Creator (onboard 10G net, TB3, etc.)
Borrowed from my 9900K until prices drop:
--32GB of G.Skill DDR4 3200 ($100 on Black Friday...)
Reused from same Tower Case that housed the Xeon:
--Used VEGA 56 GPU ($200 on eBay before mining craze...)
--Noctua Cooler, 750W PSU, OS SSD, LSI RAID Controller, SATAs, etc.

Performs VERY close to my overclocked 9900K (below), but at stock settings with no tweaking...

Workstation D with $1,350 USD of upgrades in April, 2019
--$500 9900K @ 5.0ghz
--$140 Corsair H150i liquid cooling with 360mm radiator (3 fans)
--$200 open box Asus Z390 WS (PLX chip manages 4/5 PCIe slots)
--$160 32GB of G.Skill DDR4 3000 (added another 32GB later...)
--$350 refurbished, but like-new Radeon Vega 64 LQ (liquid cooled)

Renders Vegas11 "Red Car Test" (AMD VCE) in 13s when clocked at 4.9 ghz
(note: BOTH onboard Intel & Vega64 show utilization during QSV & VCE renders...)

Source Video1 = 4TB RAID0--(2) 2TB M.2 on motherboard in RAID0
Source Video2 = 4TB RAID0--(2) 2TB M.2 (1) via U.2 adapter & (1) on separate PCIe card
Target Video1 = 32TB RAID0--(4) 8TB SATA hot-swap drives on PCIe RAID card with backups elsewhere

10G Network using used $30 Mellanox2 Adapters & Qnap QSW-M408-2C 10G Switch
Copy of Work Files, Source & Output Video, OS Images on QNAP 653b NAS with (6) 14TB WD RED
Blackmagic Decklink PCie card for capturing from tape, etc.
(2) internal BR Burners connected via USB 3.0 to SATA adapters
Old Cooler Master CM Stacker ATX case with (13) 5.25" front drive-bays holds & cools everything.

Workstations A & B are the 2 remaining 6-core 4.0ghz Xeon 5660 or I7 980x on Asus P6T6 motherboards.

$999 Walmart Evoo 17 Laptop with I7-9750H 6-core CPU, RTX 2060, (2) M.2 bays & (1) SSD bay...

OldSmoke wrote on 2/19/2013, 9:57 PM
Well, on my system rendering times are cut in half with GPU acceleration, and I doubt any CPU alone will ever beat it. Maybe I am one of the lucky ones, as I never had an issue with it, even in VP11. SCS software is certainly not bug-free, but GPU acceleration has always worked for me with the right driver. How long did I have to play around to get it working? Not long either. My previous GTX460 worked from the start in VP11, but there was only driver 275.33 available at the time, and as you can read in the forum, only drivers up to 296.10, or maybe one more generation, are working fine. The GTX6xx series is based on the Kepler architecture, which is more suited for gaming, whereas the GTX5xx and Quadro series are based on Fermi, which is what SCS recommends.

My fear is that GPU manufacturers will increasingly target gaming with their mid-range cards, and we may have to buy a pro card such as a Quadro to get full support for NLE work.


NickHope wrote on 2/19/2013, 10:11 PM
Before anyone splashes out on a Quadro primarily for Vegas, it's worth reading JohnnyRoy's post of 2/8/2013 7:27:01 PM on this thread.
OldSmoke wrote on 2/19/2013, 10:53 PM
A Quadro 4000 is certainly not worth it. A comparable Quadro to a GTX570 would be one of the 5000 series: similar CUDA core count and bandwidth, which are both very important performance factors aside from core clock, memory and shader clock. Quadro cards are geared more toward CAD or 3D applications, but again not necessarily toward NLE. You can actually use a Quadro driver on a GTX570. There is no benefit to it, but it shows how close the two are, since they are based on the same Fermi architecture. My point is that manufacturers like NVIDIA would rather have us NLE users buy a Quadro than use a cheaper card and have a good life. For me and my system, the GTX570 was the right choice and I never looked back. The recent upgrade to a 3930K made a second card necessary for my 3 monitors, and if I could fit another GTX570 into the computer I would have bought one. But there is still hope: once I have converted the current one to water cooling, I will have space for a second one.


TheRhino wrote on 2/20/2013, 8:53 AM
OldSmoke---

I am not comparing CPU to GPU render speeds within V11/V12; I am comparing my CPU render speeds in V10e to V11/V12...

In V11/V12 GPU rendering is faster than CPU rendering because a 6-core CPU is not utilized 100% during CPU renders (only about 60% on my system). However, since Vegas 10e is able to utilize 100% of my 6-core CPU, my V10e renders are faster than V11/V12 even with a GTX 570 installed.

With most types of HD video as my source files (including uncompressed HD), I can edit and render to the MainConcept Blu-ray & DVD MPEG-2 formats at nearly 1:1 -- one hour of video takes one hour to render... I was not able to get a GTX 570 to do this within V11/V12, but maybe I gave up too soon... CPU rendering in V11/V12 takes about 50% longer than CPU rendering in V10e...


OldSmoke wrote on 2/20/2013, 9:58 AM
TheRhino

I have done that comparison too; in fact, it was the first thing I did. I already had a GTX460 in my old system and VP10e, so I downloaded Vegas' own benchmark project and a trial version of VP11 and did exactly what Sony did. Of course the GTX460 was a bit slower than the results published by Sony on their systems with a GTX570, but nevertheless, it was a huge difference rendering the same project in VP10e compared to VP11 and now VP12. I got close to their results when I overclocked the GTX460, but I would not have been able to render a larger project without overheating the GPU, and the cooling fans made way too much noise running at almost 100%. So the next step was to go for a 570, and that solved it all. I have raised the clock on my GTX570 from the stock 742 to a safe 875, and I can render Sony's own project in 38 sec. to MC AVC Internet 1080p. Maybe you really gave up too soon? Also, those codecs in VP11 & 12 that are optimized for GPU rendering "can" render on the CPU but are not CPU-optimized, hence the difference in utilization. I still believe that the majority of users are benefiting from GPU acceleration and hence are not on the forum. I only got here after so many years of using Vegas (I started with VP7) because I had 2 bugs to report, of which only one turned out to be valid.

Aside from rendering, do you see any improvement in timeline playback? That, however, depends mostly on the footage, effects applied and project settings. I only work with HDV and AVCHD, but 90% of the timeline runs at full framerate at the Good (Full) setting in the preview window and simultaneously on the external monitor.


TheRhino wrote on 2/21/2013, 8:00 AM
OldSmoke---

The GTX 570 was installed in a system I set up for a colleague, which utilized the newer 3930K but performed similarly to my 6-core 980X in CPU rendering tests... I didn't overclock anything because I aimed for stability & reliability, since I cannot babysit the system for him...

I have (2) i7-920s that I still use for capturing video from tape and for rendering old projects stored on hot-swap drives. When clients ask for various file formats, I can just set up the render on one of the 920s and continue to edit current projects on the 980X without a performance hit. However, the 920s are half the speed of my 980X, and I don't like to let them run after I leave the studio. If a $200 GTX 570 would double my render speeds on those systems, I'm all-in.

Nvidia has just announced their new GTX TITAN based on the GK110, the same GPU as in the Tesla K20 used in supercomputers. I am anxious to see how this card will perform in Vegas, but the intro cost could be as high as $1000.


CinemaPete wrote on 3/4/2013, 3:10 PM
Thanks to everyone for their input on my topic question: My original goal was to see the Vegas Pro Preview Window on a separate monitor. I had no idea how easy Vegas Pro makes this: presuming you have two monitors attached to your PC, you can enable the Preview Window to show up on the 2nd monitor. I was able to do that with my current setup by using the HDMI output on my current PC connected to a 47" HD TV -- so in terms of achieving the goal of seeing the Preview Window on its own monitor, "problem solved".

Preview Window performance: However, and this is no doubt due to my current PC's low power, limited RAM, and a graphics card with only 512 MB on it: anything that appears in the preview window, regardless of whether it is part of the typical Vegas interface on the primary monitor or appears separately on a 2nd monitor, performs poorly if I try to use any setting greater than "Preview Auto", especially if there are lots of tracks with lots of FX going on. I am attributing this bottleneck to the slow and limited graphics performance of the current - laugh - AMD Radeon HD3650 / 512 MB card.

I'm awaiting a new system built on an Asus P9X79 Pro motherboard, which has 4 PCIe 3.0 x16 slots, an i7-3930K processor, and 32 GB DDR3 system RAM (1600 MHz). While the motherboard can handle 4 graphics cards, I'm receiving the system with an NVidia GeForce GT620, 2 GB RAM (96 CUDA cores).

I suspect that the GeForce GT620 in the new system is much better than the current AMD Radeon HD3650, but by how much I don't know.

I can also option it with an NVidia GeForce GT640 / 2 GB, GTX660 / 2 GB, GTX670 / 4 GB, GTX680 / 4 GB - or anything else for that matter.

If you can suggest a card that will not "break the bank" yet allow the preview window to work at Good Auto and Best Auto even under heavy load, I welcome further suggestions.

In the meantime I will have to wait and see what Preview Window performance turns out to be on this new system using the GT620.

Again, thanks for the input.
OldSmoke wrote on 3/4/2013, 3:19 PM
The GT620 won't be much better, as it has only 96 CUDA cores. I still suggest getting a GTX570; ASUS is my preferred brand. You can get the GTX570 new, and there are plenty on eBay in good condition. If you go for the 6xx series, use a GTX670, but in general the 5xx series is still better supported under Vegas when combined with an earlier driver like 296.10.


CinemaPete wrote on 3/13/2013, 2:54 PM
Well, I'm reporting back on this very interesting topic and the discussion I triggered: My new system has an i7-3930K (not overclocked), 32 GB RAM, and the NVidia GeForce GT620 (DVI and HDMI output).
Performance-wise: Of course I see an overall performance increase in Vegas regarding loading, etc.; in fact, everything is faster.

However, as OldSmoke suggested, I wouldn't see much better graphics performance (presumably in the Preview Window), and my tests with real material confirm this - I see some improvement but not much. You are right, OldSmoke. I will try and find a GTX 570 as suggested.

Also, I'm not committed (no doubt "they" will indeed commit me to the local asylum when this is all over) to NVidia, so if there are any other cards, I'm open to suggestions.

Again, thanks to all for the very interesting and enlightening dialog!