Titan with CUDA fails on vegas Pro 12

Hussayn wrote on 3/19/2013, 3:09 PM
Hi;

We recently purchased an Nvidia Titan and installed the Nvidia beta driver 314.21 (the official driver makes Vegas crash frequently during various operations). With the beta driver, rendering an MP4 video without CUDA works just fine. But rendering with CUDA still shows 2 issues:

1.) There is no speed increase at all (same render time as without CUDA).
2.) At the end of the render, Sony Vegas crashes.

Here are the render settings:

- MainConcept AAC/AVC
- size: 1280x720 pixels
- framerate: 25 (PAL)
- aspect ratio: 1
- maximum bps: 18,000,000
- average bps: 12,000,000
- encode mode: Render using CUDA if available
- enable progressive download

Is there anything we can do to get Vegas to use the power of the Titan?

thanks
hussayn

Comments

NormanPCN wrote on 3/19/2013, 4:25 PM
Sorry, I can't directly help you, but something here smells similar to what has happened to me on the AMD side of things with the MC AVC encoder.

I upgraded my video card from a 5850 to a 7950, and the MC AVC encoder no longer thinks OpenCL is available; it runs CPU-only. Same machine, same driver version.

I know it is not using OpenCL from the render time and from the GPU-Z utility. All other OpenCL programs detect and use the card.
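A quick way to verify what the system-level OpenCL runtime reports, independent of any one application, is to query it directly. Here is a minimal Python sketch using ctypes (standard library only; it asks the ICD loader how many platforms are visible and returns None when no OpenCL runtime is installed):

```python
import ctypes
import ctypes.util

def opencl_platform_count():
    """Query the system OpenCL loader for the number of platforms.

    Returns None if no OpenCL runtime/loader is installed, otherwise the
    platform count reported by clGetPlatformIDs.
    """
    libname = ctypes.util.find_library("OpenCL")
    if libname is None:
        return None
    try:
        cl = ctypes.CDLL(libname)
        count = ctypes.c_uint(0)
        # clGetPlatformIDs(0, NULL, &count) is the standard "just count them" query.
        if cl.clGetPlatformIDs(0, None, ctypes.byref(count)) != 0:
            return None
        return count.value
    except (OSError, AttributeError):
        return None

print("OpenCL platforms visible to the OS:", opencl_platform_count())
```

If this reports a platform but the MC encoder still falls back to the CPU, the problem is in the application's own device checks rather than the driver installation.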

I assume your machine had a previous Nvidia card installed. I wonder if MC has some issue with the hardware changing. It doesn't seem to think CUDA is there with your new card, just as it doesn't see OpenCL with mine.

I reported this to Sony; who knows if they will ever reply. I did the whole uninstall/reinstall, starting with Ctrl+Shift held down to reset preferences, and all that.
OldSmoke wrote on 3/19/2013, 5:41 PM
@Hussayn

I doubt there is anything that can be done. The Titan, like the 600 series, is based on Nvidia's Kepler architecture, which isn't well supported under Vegas. You could try a Nvidia Quadro driver rather than a GeForce driver, but I am not sure that will help. Also, the GPU used in the Titan, the GK110, doesn't seem to be used in any of the current Quadro cards according to this table. I still believe that the 500 series is much better suited to Vegas Pro 11 and 12, especially the 560, 570 and 580 with driver 296.10. I am currently using a GTX570 with this driver and have only good things to say about it.

Proud owner of Sony Vegas Pro 7, 8, 9, 10, 11, 12 & 13 and now Magix VP15&16.

System Spec.:
Motherboard: ASUS X299 Prime-A

Ram: G.Skill 4x8GB DDR4 2666 XMP

CPU: i7-9800x @ 4.6GHz (custom water cooling system)
GPU: 1x AMD Vega Pro Frontier Edition (water cooled)
Hard drives: System Samsung 970Pro NVME, AV-Projects 1TB (4x Intel P7600 512GB VROC), 4x 2.5" Hotswap bays, 1x 3.5" Hotswap Bay, 1x LG BluRay Burner

PSU: Corsair 1200W
Monitor: 2x Dell Ultrasharp U2713HM (2560x1440)

TheRhino wrote on 3/19/2013, 6:33 PM
Hussayn:

Thank you for posting your initial results - many of us are interested in the Titan's potential.

If I had a Titan in hand, I would definitely re-install the OS & drivers from the ground up just to make certain the old drivers were not interfering. However, to save time, I would dual-boot between my old working OS (with all programs installed) and a 2nd boot drive with just the OS, Titan drivers & Vegas.

It could be a while before we know whether the lack of performance & stability is a Titan driver issue or a Vegas issue. Apparently the Titan with its initial drivers was also unable to complete an Adobe Premiere Pro render. This makes me believe that it is still a driver issue.

Workstation C with $600 USD of upgrades in April, 2021
--$360 11700K @ 5.0ghz
--$200 ASRock W480 Creator (onboard 10G net, TB3, etc.)
Borrowed from my 9900K until prices drop:
--32GB of G.Skill DDR4 3200 ($100 on Black Friday...)
Reused from same Tower Case that housed the Xeon:
--Used VEGA 56 GPU ($200 on eBay before mining craze...)
--Noctua Cooler, 750W PSU, OS SSD, LSI RAID Controller, SATAs, etc.

Performs VERY close to my overclocked 9900K (below), but at stock settings with no tweaking...

Workstation D with $1,350 USD of upgrades in April, 2019
--$500 9900K @ 5.0ghz
--$140 Corsair H150i liquid cooling with 360mm radiator (3 fans)
--$200 open box Asus Z390 WS (PLX chip manages 4/5 PCIe slots)
--$160 32GB of G.Skill DDR4 3000 (added another 32GB later...)
--$350 refurbished, but like-new Radeon Vega 64 LQ (liquid cooled)

Renders Vegas11 "Red Car Test" (AMD VCE) in 13s when clocked at 4.9 ghz
(note: BOTH onboard Intel & Vega64 show utilization during QSV & VCE renders...)

Source Video1 = 4TB RAID0--(2) 2TB M.2 on motherboard in RAID0
Source Video2 = 4TB RAID0--(2) 2TB M.2 (1) via U.2 adapter & (1) on separate PCIe card
Target Video1 = 32TB RAID0--(4) 8TB SATA hot-swap drives on PCIe RAID card with backups elsewhere

10G Network using used $30 Mellanox2 Adapters & Qnap QSW-M408-2C 10G Switch
Copy of Work Files, Source & Output Video, OS Images on QNAP 653b NAS with (6) 14TB WD RED
Blackmagic Decklink PCie card for capturing from tape, etc.
(2) internal BR Burners connected via USB 3.0 to SATA adapters
Old Cooler Master CM Stacker ATX case with (13) 5.25" front drive-bays holds & cools everything.

Workstations A & B are the 2 remaining 6-core 4.0ghz Xeon 5660 or I7 980x on Asus P6T6 motherboards.

$999 Walmart Evoo 17 Laptop with I7-9750H 6-core CPU, RTX 2060, (2) M.2 bays & (1) SSD bay...

NormanPCN wrote on 3/19/2013, 6:46 PM
Vegas does not care what video card is underneath the OpenCL API it uses. One card, or one brand, may be faster than another, but that has nothing to do with Vegas's code implementation.

Nothing wrong with Kepler. The 600 series GK104 does have less "compute" performance than the 500 series. The architecture is fine; it is all about how they distributed the transistor budget. The execution units are skewed towards DirectX/OpenGL render performance. Not so with the Titan's GK110. Titan is not just a bigger 680: while both are Kepler in architecture (efficiency), Titan distributes its transistor budget differently. Compute on Titan is out of this world compared to the 580, the previous consumer compute king for Nvidia. It had better be, for the price!
TheRhino wrote on 3/19/2013, 7:18 PM
Norman:

So, will Titan's compute capability lead to better OpenCL performance?

OldSmoke wrote on 3/19/2013, 7:44 PM
@NormanPCN

You are right: the Titan is faster in terms of raw performance, and yes, there is nothing wrong with the Kepler architecture. However, it seems (and there are articles on the Internet about this) that Kepler GPUs and their drivers support gaming more than NLEs and parallel computing. If NVIDIA gave us drivers that fully supported NLEs, we would all have a better life. I feel that GPU manufacturers don't want us to use "lower-end" consumer cards anymore and will eventually force us onto pro cards such as the Quadro line.

Drivers are what make the GPU interact with the software, and a properly written driver can do so much more with the same hardware. The fastest and most stable driver for my GTX570 and my previous GTX460 is 275.33, with render times 10-15% faster than the 296.10 driver I am currently using. Drivers later than that not only degraded render times; preview stability also dropped significantly. I would love to use a 2-slot GTX670 or even a Titan over my 3-slot GTX570, but as many reviews have shown, there is no improvement for an NLE such as Vegas.

NormanPCN wrote on 3/19/2013, 8:54 PM
Titan will out-compute and out-game anything else Nvidia has for "consumers". So in general one can say yes. To be specific, one has to measure a specific program at a specific task.

One has to be careful. Some things can bottleneck performance, and even a 100x faster GPU can make no difference if something else is the limiting factor. For example, I upgraded from an AMD 5850 to a 7950, about a 2x increase in performance. NewBlue Titler Pro pegged the meter with my 5850 (using GPU-Z as the monitor). With the 7950 it does not peg the meter, only 60-something percent; something else is bottlenecking the performance. If I run something like the LuxMark benchmark, it runs over 2x faster on the new card. One program could use the extra GPU, and the other could not.

So can Vegas, or Main Concept actually use the full power of a Titan? Who knows. I do know that the Main Concept OpenCL encoder did not solidly peg my 5850 card, so I wonder.

Tesla cards are the pure compute cards and they will outdo Titan at compute, at a price of course.

"Compute" is a rather generic term but I don't know any other way to describe it, because even gaming is, quote, compute.
Grant_M wrote on 3/20/2013, 3:51 AM
Hi

I am also waiting for my Titan to arrive. Just for info, the following may be relevant:

Regarding OpenCL "not working" on the Titan with release drivers: I picked this up on one of the respected review sites, which was, to be honest, quite concerning.

http://www.anandtech.com/show/6774/nvidias-geforce-gtx-titan-part-2-titans-performance-unveiled/4

Looks like they were using release driver version 314.09 at the time.
I'm not sure whether this is fixed in your version, or whether it would impact Titan/Vegas performance.

skeeter123 wrote on 5/17/2013, 12:24 AM
GrantM,

Did your Titan arrive and how is it performing?

thanks!
wwjd wrote on 5/17/2013, 6:36 PM
I just got a Titan; haven't installed Vegas yet. I'm sure it will work okay on some level.
On my previous system I always used the beta drivers and kept them up to date, but didn't have the problems I read about in this forum. Then again, my output is pretty small.

I'll post up after I get vegas in.
wwjd wrote on 6/8/2013, 10:09 PM
TITAN working okay here. Just installed Movie Studio 11, Pro 11 (32), Pro 12 - all RUN fine but haven't done any rendering yet. On latest driver. Preview is better than ever.

What should I test, or look out for?
NormanPCN wrote on 6/9/2013, 9:35 AM
The Vegas video engine uses generic OpenCL code (GPU), so it will work on anything that supports OpenCL. This is Vegas generating the video stream, for both playback and render-as.

As for file encoding, aka render-as:

From online reports, the Main Concept AVC CUDA encoder seems to be targeted at specific GPU architectures, and Kepler is newer than anything the version included with Vegas knows about.

I can verify this is the case for the Main Concept OpenCL encoder with AMD GPUs. My newer GPU does not work with the OpenCL encoder, but my old card does. Looking inside the OpenCL DLL, I can see older AMD architectures listed but not the new one.
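The "look inside the DLL" check described above can be reproduced with a few lines of Python that mimic the Unix `strings` tool. The byte blob and architecture names below are illustrative stand-ins, not the encoder's actual contents:

```python
import re

def extract_strings(data: bytes, min_len: int = 4):
    """Pull printable ASCII runs out of a binary blob, like the Unix `strings` tool."""
    return [m.group().decode("ascii")
            for m in re.finditer(rb"[\x20-\x7e]{%d,}" % min_len, data)]

def mentions_architecture(dll_bytes: bytes, arch_names):
    """Report which GPU codename strings appear anywhere in the binary."""
    found = extract_strings(dll_bytes)
    return {name: any(name in s for s in found) for name in arch_names}

# Simulated DLL contents for illustration (real DLLs embed similar ID strings).
fake_dll = b"\x00\x01Cypress\x00Cayman\x00\xff\x02kernel_main\x00"
print(mentions_architecture(fake_dll, ["Cypress", "Cayman", "Tahiti"]))
# -> {'Cypress': True, 'Cayman': True, 'Tahiti': False}
```

Run against the real encoder DLL, the presence or absence of your GPU's codename is a strong hint about which architectures the encoder was built for.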
OldSmoke wrote on 6/10/2013, 11:00 AM
@wwjd: You can try rendering the rendertest-2010 project or the SCS benchmark project to MC Internet 1080p, with GPU on in both Vegas and the template, and let us know the render times.

wwjd wrote on 6/10/2013, 2:27 PM
Where is this render test footage and the SCS benchmark, please? I'll see if I can get to it this week.
NormanPCN wrote on 6/10/2013, 2:39 PM
The link to download the file is on this page.

http://www.sonycreativesoftware.com/vegaspro/gpuacceleration
wwjd wrote on 6/10/2013, 9:42 PM
system: win 8 i7 3770k, 32gb, 10k os drive, 7200 data drive
Vegas 12 fresh load, latest version
Nvidia Titan: latest driver 320.18, set on "QUALITY"

Rendering Sony PressReleaseProject, MC Internet 1080p:

GPU ON in Vegas Preferences:
- CPU ONLY: 2:35
- OpenCL: 2:36
- CUDA: 2:36

GPU OFF in Vegas Prefs:
- CPU ONLY: 3:22
- OpenCL: 3:22
- Cuda: 3:22

The render file size was 40 KB larger in the GPU version??
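For what it's worth, the timings above work out to roughly a 30% speedup with GPU on in the preferences. A few lines of Python to do the arithmetic (times copied from the table above):

```python
def to_seconds(t: str) -> int:
    """Convert an m:ss render time into seconds."""
    minutes, seconds = t.split(":")
    return int(minutes) * 60 + int(seconds)

gpu_on = to_seconds("2:35")   # 155 s
gpu_off = to_seconds("3:22")  # 202 s
print(f"speedup: {gpu_off / gpu_on:.2f}x")  # -> speedup: 1.30x
```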
OldSmoke wrote on 6/10/2013, 9:52 PM
Here are my tests from my system from an earlier thread I opened:

http://www.sonycreativesoftware.com/forums/ShowMessage.asp?Forum=4&MessageID=859391

wwjd wrote on 6/10/2013, 10:46 PM
Ok, so, Titan is not broken as suspected in the above posts?
OldSmoke wrote on 6/10/2013, 11:28 PM
Not sure what you mean by "broken". The Titan is a powerful card for gaming and maybe other 3D render jobs, but not for Vegas 12. Whether this is an actual hardware issue or just a driver issue is beyond my understanding, but it could well be Nvidia's strategy to push video editors toward their professional line of cards, like the Quadros. But I have not seen a user with a Quadro card based on the Kepler chip confirm that it is any faster than a GTX580; in fact, for the additional price it should be 3x as fast!

NormanPCN wrote on 6/11/2013, 9:52 AM
The Titan (K20) is the most powerful thing you can get for 3D visual rendering and OpenCL/CUDA. The common misconception is that Kepler is not so good for compute. That applies to the 6xx series cards, which were optimized for gaming while the compute hardware was cut back, to differentiate the gaming cards from the workstation cards.

The K20 has everything for both, and more of it overall. It is a massive chip. The Titan name comes from the fact that the K20X is the chip that was used in the Titan supercomputer, and they certainly would not put something crippled in that.
wwjd wrote on 6/11/2013, 10:14 AM
it does work great for my gaming! :D
OldSmoke wrote on 6/11/2013, 10:22 AM
@Norman: Is there any proof that a Kepler-based Quadro card such as the K5000 works better than a GTX580? I have the money to spend on such a card, but I don't have proof that it is actually any faster in terms of timeline preview and rendering.

NormanPCN wrote on 6/11/2013, 11:28 AM
@OldSmoke

Sorry, I don't really follow that very closely. I am in the AMD camp, with a 5850 previous and 7950 now, so I follow that world more closely.

In the Nvidia world, my familiarity comes mostly from reading architectural information out of curiosity, since I am a low-level technical developer by profession, but not in that industry.

The problem is that Vegas is not really used for benchmarking by the sites that are into that sort of thing. Benchmarking Vegas is difficult: preview is hard to benchmark because it tops out at real-time fps, while file encoding (render-as) does not care about fps and just goes as fast as it can, so it can be benchmarked. A render-as benchmark covers both the encoder and the preview computation engine.

The thing I wonder is at what point you have all you can get. At some point each GPU algorithm will top out in the amount of parallelism it can use and sustain. Simply put, twice the GPU will get you far less than twice the performance. This is just like doubling your CPU cores: you tend to get diminishing returns.
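That diminishing-returns intuition is exactly Amdahl's law. A short sketch (the 60% GPU fraction is an illustrative assumption, not a measured number for Vegas):

```python
def amdahl_speedup(parallel_fraction: float, resource_multiplier: float) -> float:
    """Amdahl's law: overall speedup when only part of the workload scales
    with the added resource (here, a faster GPU)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / resource_multiplier)

# If, say, 60% of a render pipeline runs on the GPU, doubling GPU power
# yields only ~1.43x overall, and even a 100x GPU caps out below 2.5x:
for mult in (2, 4, 100):
    print(f"{mult}x GPU -> {amdahl_speedup(0.6, mult):.2f}x overall")
```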

In the past, the Quadro cards literally had the same chip as much less expensive gaming cards, yet were way more expensive. There were some other differences, of course, but mostly minor ones. As of the 6xx series, Nvidia has made an obvious change to make each chip ideal only for its intended market. I have not followed and compared the specs of Quadro cards to see what additional execution-unit functionality Nvidia put into the chips on some or all Quadro boards. I do know they threw the kitchen sink at the K20 and K20X.

TVJohn wrote on 6/11/2013, 8:07 PM
One thing I have noticed over the years: the hottest new GTX-type cards are aimed at gamers. NLE code and other applications take a year or so to catch up with the new hardware.
For editing purposes, last year's card generally works more reliably. "Factory" overclocked cards should also be avoided.