resuscitating the beaten horse...

megabit wrote on 6/21/2015, 6:17 AM
OK, so just a quick question: so far there's been a consensus here that the best CUDA acceleration Vegas can get is from the older, Fermi nVidia cards, like the GTX 580 I have. Unfortunately, this card offers neither DP nor HDMI output at 4K resolution; has anyone compared (or read a comparison of) Vegas timeline performance with the GTX 970, which does offer 4K output (both DP and HDMI 2.0) but is obviously two generations younger than the Fermi 580?

Piotr

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

Comments

Enerjex wrote on 6/21/2015, 8:28 AM
The 970 was horrible (unusable) in Vegas for me until the latest drivers seemed to magically make it good. It seems comparable to my 280X, so not bad. I had a 580 a while back, and the 970 is faster in preview/timeline performance than the 580 was. The new drivers introduced a new global profile for Vegas; this, coupled with the recent update to OpenCL 1.2 in NVIDIA drivers, suggests they decided to do a bit of work on it.

JohnnyRoy wrote on 6/21/2015, 8:54 AM
I just want to clarify your statements so that you understand what's really going on here with regard to Vegas Pro and CUDA:

> "so far there's been a consensus here that the best CUDA acceleration Vegas can get is from the older, Fermi nVidia cards, like the GTX 580 I have."

Vegas Pro only uses CUDA for rendering. The MainConcept AVC encoder is the only one I know of that has a limitation of recognizing only older Fermi cards. No other part of Vegas Pro is affected by this. If you don't use MainConcept AVC, then it doesn't matter what video card you have; Vegas Pro will use it in some way.

> "has anyone compared /read about comparison of Vegas timeline performance with GTX 970 which does offer 4k output (both DP and HDMI 2.0), but obviously is 2 generations younger than the Fermi 580?"

Vegas Timeline performance has nothing to do with CUDA. Vegas Pro does not use CUDA for timeline playback. Vegas Pro uses OpenCL for timeline playback. Any NVIDIA card is as good as any other NVIDIA card for timeline playback because they all support OpenCL. AMD cards have a better implementation of OpenCL than NVIDIA cards, so if you are looking to increase your timeline playback performance, it is recommended that you get an AMD card. The AMD Radeon R9 295X would be the best performer for this.

Just to recap:

(1) Vegas Pro uses OpenCL for Timeline GPU Acceleration (CUDA has no effect here).
(2) AMD Radeons will give better timeline playback than NVIDIA cards because they have better OpenCL support.
(3) Only the MainConcept AVC encoder is limited to older CUDA and OpenCL cards.

Hope that helps clear things up.

~jr
jwcarney wrote on 6/21/2015, 9:09 AM
Thanks, jr. But I don't think NVIDIA supports OpenCL past 1.2, and 2.0 has only been finalized since early 2015. I am wondering how strong AMD's support for it is, since they have been talking up DirectX 12 and Mantle (and Apple is talking Metal, not OpenCL).
JohnnyRoy wrote on 6/21/2015, 9:57 AM
> "I am wondering how strong AMD's support for this since they have been talking DirectX 12 and Metal (And Apple is talking Metal, not OpenCL)."

DirectX and Metal are like OpenGL: they are graphics APIs. OpenCL is completely different.

OpenCL stands for Open Computing Language. It does not perform graphics operations like drawing and shading. Instead, it gives you access to the enormous parallel computation power that GPUs have in their core processors. It makes the GPU an extension of the CPU for offloading computational tasks that can be performed in parallel. This is why it's great for processing video, which is more about crunching numbers than drawing vectors.
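
To make that concrete, here's a minimal sketch of the kind of work an OpenCL-accelerated video app offloads to the GPU: one tiny "brighten" kernel run across every pixel of a frame in parallel. All the names and the operation are made up for illustration - this is not Vegas code, just the general technique (error checking omitted for brevity):

/* Minimal OpenCL example: brighten a "video frame" (an array of pixels)
 * in parallel on the GPU. Illustrative only.
 * Build (Linux, with an OpenCL runtime installed): gcc brighten.c -lOpenCL
 */
#include <stdio.h>
#include <CL/cl.h>

static const char *kSrc =
    "__kernel void brighten(__global uchar *px, uchar amount) {\n"
    "    size_t i = get_global_id(0);          /* one work-item per pixel */\n"
    "    int v = px[i] + amount;               /* plain number crunching  */\n"
    "    px[i] = (uchar)(v > 255 ? 255 : v);   /* clamp to 8-bit range    */\n"
    "}\n";

int main(void) {
    enum { N = 1920 * 1080 };          /* one 8-bit luma plane of an HD frame */
    static unsigned char frame[N];     /* stand-in for decoded video data */

    cl_platform_id plat; cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    /* Copy the frame to the GPU, run one work-item per pixel, copy back. */
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                N, frame, NULL);
    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSrc, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "brighten", NULL);

    cl_uchar amount = 16;
    clSetKernelArg(k, 0, sizeof(buf), &buf);
    clSetKernelArg(k, 1, sizeof(amount), &amount);

    size_t global = N;                 /* ~2 million work-items in flight */
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, buf, CL_TRUE, 0, N, frame, 0, NULL, NULL);

    printf("first pixel after brighten: %d\n", frame[0]);

    clReleaseMemObject(buf); clReleaseKernel(k); clReleaseProgram(prog);
    clReleaseCommandQueue(q); clReleaseContext(ctx);
    return 0;
}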

Also, don't forget that AMD and Sony have been working closely together for years to squeeze every ounce of performance out of the AMD implementation of OpenCL. And in the CompuBench OpenCL benchmark for Video Processing, AMD scores the highest median, holding 13 of the top 20 positions.

~jr
megabit wrote on 6/21/2015, 10:18 AM
Thanks, guys, for the clarification. Problem is, even the newest Radeon cards still come with HDMI 1.4 - so the only display option for 4K@50/60Hz is DisplayPort. Meaning I'd need a 4K monitor, and I'd prefer a UHDTV... The latest GeForce cards offer both HDMI 2.0 and DP.

Piotr

JohnnyRoy wrote on 6/21/2015, 12:30 PM
> "Problem is even the newest Radeon cards still come with HDMI 1.4 - so the only display option for 4K@50/60Hz is DisplayPort. "

You know, I had no idea what "DP" was in your post. Now I know you were referring to "DisplayPort". My two ASUS ProArt PA246Q monitors are both DisplayPort and my AMD Radeon HD 7950 works great with them and Vegas Pro via DisplayPort cables.

> "Meaning I'd need a 4k monitor, and I'd prefer a UHDTV..."

I hope you're not planning to color correct on a TV? That could be disastrous. No two TVs in my house look the same, because manufacturers use all sorts of hocus-pocus features to make blacks blacker, whites whiter, and pictures sharper, to the point that you have no idea what the original video looked like.

You want to color correct with a calibrated monitor, not a TV. I would use something like the ASUS PB287Q 28" Widescreen WLED Backlit LCD 4K UHD Monitor. That would solve your DisplayPort problem as well.

~jr
megabit wrote on 6/21/2015, 12:48 PM
With all due respect, Jerry - can't you just accept that somebody has different priorities and constraints in life?

Piotr

NormanPCN wrote on 6/21/2015, 12:50 PM
"Vegas Timeline performance has nothing to do with CUDA. Vegas Pro does not use CUDA for timeline playback. Vegas Pro uses OpenCL for timeline playback."

One additional point of pedantic clarification: some might take 'timeline playback' to mean editing only. The timeline video engine is always used, no matter what.

The only difference between "render as" and edit playback is what is done with the rendered video frames: they are either sent to your display device or sent to a file encoder. The file encoders are independent of the video engine.
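
In other words (a conceptual sketch of the idea - all the names here are made up, this is not actual Vegas internals):

/* Conceptual sketch: the same "video engine" produces frames either way;
 * only the destination of the finished frames differs. Made-up names. */
#include <stdio.h>
#include <stdbool.h>

typedef struct { int number; } Frame;

/* The timeline video engine: FX, compositing, GPU (OpenCL) work happens
 * here, whether you are previewing or doing a "render as". */
static Frame render_next_frame(int i) { Frame f = { i }; return f; }

static void send_to_display(Frame f) { printf("preview frame %d\n", f.number); }
static void send_to_encoder(Frame f) { printf("encode  frame %d\n", f.number); }

static void run_timeline(bool rendering_to_file, int frame_count) {
    for (int i = 0; i < frame_count; i++) {
        Frame f = render_next_frame(i);   /* always runs, no matter what */
        if (rendering_to_file)
            send_to_encoder(f);           /* encoders are independent of the engine */
        else
            send_to_display(f);
    }
}

int main(void) {
    run_timeline(false, 3);   /* edit playback */
    run_timeline(true, 3);    /* "render as"   */
    return 0;
}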
JohnnyRoy wrote on 6/22/2015, 1:29 PM
> "One additional point of pedantic clarification. Some might equate, timeline playback, to mean editing only. The timeline video engine is always used no matter what."

Excellent point. Improving timeline playback will also improve render times because render includes both timeline playback and encoding.

The key here is that Vegas Pro GPU acceleration only uses OpenCL for timeline playback but it can use either OpenCL or CUDA for render encoding.

~jr
MikeyDH wrote on 6/22/2015, 2:15 PM
So would it be a good move to get rid of my NVIDIA GTX 460 and get the Radeon R9 295X for smoother timeline playback? I see a liquid-cooled X2. Is this the one you speak of, JR?
JohnnyRoy wrote on 6/22/2015, 4:44 PM
> "So would it be a good move to be rid of my Nvidia GTX460 and get the Radeon R9 295X for smoother timeline playback."

That will always depend on why you are not getting smooth timeline playback now. If it's something that can benefit from GPU acceleration and you are taking a big jump like that, it should definitely get you smoother timeline playback.

> "I see a liquid cooled X2. Is this the one you speak of JR?"

I don't believe I was recommending any particular card. I have an AMD Radeon HD 7950 because I have a 2010 Mac Pro and that's the fastest Mac Edition card for it, so I can't use an R9-series card, but I believe my HD 7950 is somewhere between the R9 270 and 280 in chipset and performance.

AMD just came out with the Radeon R9 390X 8GB! The reviews say it's a monster. ;-)

~jr
MikeyDH wrote on 6/22/2015, 8:25 PM
Thanks John. My timeline playback isn't as smooth as I'd like with BCC and transitions added. My quad-core system is at its max of 8 GB of RAM, and there are times when it struggles with the GPU on or off. I get it done, but at times it can be taxing.
JohnnyRoy wrote on 6/23/2015, 5:39 AM
> "My timeline playback isn't as smooth as I'd like with BCC and transitions added."

BCC uses OpenGL. Some of these effects can be quite intense and simply won't play back in real time no matter what you do. Sony has done an excellent job of GPU-accelerating their built-in FX. In particular, Color Correction used to slow down timeline playback until it was GPU accelerated. Now I get smooth playback even with a full chain of color correction plug-ins. BCC has always been slow, depending on the FX used.

~jr
megabit wrote on 6/23/2015, 7:54 AM
As you know, my friends, I'm going to use a UHDTV rather than a 4K PC monitor for my edit monitoring purposes (and Johnny - also for CC and grading, just as I've used my 50" Panasonic FHD plasma with my 1080p editing). The main reason is that my editing studio must serve other purposes as well (such as being my home cinema, with all the high-end audio equipment already in place and no other room in my small house to accommodate it). Also, I spend much, much more time doing my other, main profession-related work (CAE), as well as simply running my home office, than I do editing video in Vegas - hence the need for my desk to accommodate two computers instead of one, which doesn't leave much (if any) free space... All in all, the display must end up hanging on the wall rather than standing on the overcrowded desktop. And as to display color/levels calibration, I can assure you that with the middle-class Panasonic Viera TX-50AX800E I can do it no worse than on one of those entry-level Asus monitors you linked me to...

Anyway, I have one more question for those more knowledgeable than myself: we all know that even with regular HDMI 1.4, displaying 4K from a computer *IS* possible - just with the refresh rate reduced to 30/25 Hz (depending on where we live). Please help me understand the relation between this limited refresh rate (which would probably make the Windows desktop a mess, difficult to work with due to flickering, illegible fonts, etc.) on the one hand, and the fps of movies displayed this way on the other... Since I will most probably limit myself to a 24/25/30p camera (the 50/60p ones being another, more expensive league) - is the above-mentioned HDMI 1.4 limitation, which most cheaper UHDTVs are still equipped with, a real-life problem? The Viera TX-50AX800E I mention is *NOT* limited like this - it offers a single HDMI 2.x plus DisplayPort, so I'd be covered for up to 4K@50/60p. But this model is not cheap - so do I really need those inputs if my camera - most probably just the AX100, *maybe* the X70 - won't be capable of 4K@50/60p anyway? Of course, the main Windows desktop would go to a regular desktop monitor with the mediocre resolution of 1680x1050, with only movies and my Vegas previews displayed on the UHDTV...
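
To make the arithmetic behind my question concrete, here is a quick back-of-the-envelope sketch of how I understand the frame-to-refresh mapping (rough and not authoritative - please correct me if my assumptions are wrong):

/* Back-of-the-envelope: how source fps maps onto display refresh rate.
 * If the refresh rate is an integer multiple of the fps, each frame is
 * shown a whole number of times (even cadence, smooth motion); otherwise
 * frames get uneven repeats (judder). A ratio below 1 means frames must
 * be dropped outright. */
#include <stdio.h>

int main(void) {
    double refresh[] = { 25.0, 30.0, 50.0, 60.0 };  /* Hz: HDMI 1.4 vs 2.0 at 4K */
    double fps[]     = { 24.0, 25.0, 30.0 };        /* my likely camera modes */

    for (int r = 0; r < 4; r++) {
        for (int f = 0; f < 3; f++) {
            double ratio = refresh[r] / fps[f];     /* refreshes per source frame */
            int even = (ratio == (double)(int)ratio);
            printf("%2.0fp on %2.0f Hz: %.2f refreshes/frame -> %s\n",
                   fps[f], refresh[r], ratio,
                   even ? "even cadence (smooth)" : "uneven cadence (judder)");
        }
    }
    /* e.g. 25p on 25 Hz = 1.00 (fine), 24p on 30 Hz = 1.25 (judder),
     * 24p on 60 Hz = 2.50 (the classic 3:2 pulldown). */
    return 0;
}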

I'm asking this very important question as I have no idea how a 50" non-plasma flat screen "feels" at a 25/30 Hz refresh, and it's not easy at all to "test-drive" one, as most shops have the highest possible demo content permanently connected to their UHD Bravias or Vieras... My only personal experience is with plasmas: at a refresh rate as low as 25 Hz they flicker like crazy; heck - even the nominal 50 Hz (in the EU - in the US it would be 60 Hz) is too low, so before plasmas were discontinued even by Panasonic, they actually used refresh rates that were integer multiples of that (like 100 Hz and up in the EU, or 120 Hz and up in the US). So with a plasma this would be a no-brainer - just as it would be with *any* display type if I had a 4K camera capable of 50/60p... But I have *no* experience whatsoever re: 4K playback of 24/25/30 fps material at such a low refresh rate on *LED displays* (and there are many LED flavors to further complicate the picture - pun intended :)).

Of course, one might say: forget the entry-level stuff and go for at least the medium level, if only for it to be future-proof enough. But I'm very much on a tight budget, hence it's not that simple a decision for me - please help me make the right one! TIA,

Piotr

PS. Oh, and please don't try to talk me into a monitor instead of a UHDTV - I have reasons important enough to have already made up my mind on the latter.
