Comments

Serena wrote on 8/6/2013, 1:04 AM
Is this saying that Vegas is designed to work with Intel on-board graphics (i7 etc) rather than with Nvidia etc?
rmack350 wrote on 8/6/2013, 1:21 AM
Looks like the video is coming from Intel rather than Sony. So it's addressing Intel talking points.

Rob
Soniclight wrote on 8/6/2013, 1:23 AM
Well, Serena, looks like I'm up the creek paddle-less.
AMD 6-core (actual) and nVidia v-card.
I might as well just sell my VP, eh?

(Nah.)
Serena wrote on 8/6/2013, 1:52 AM
Gosh, I'm not sure what it means! When I bought my i7 machine I started out without a separate graphics card (which worked excellently with Vegas), then Sony gave us the NewBlue Titler Pro (which said my graphics GPU was inadequate), so I installed an Nvidia GTX560Ti. Now working with DaVinci Resolve I've had to add a DeckLink card to get the view onto my grading monitor, so maybe I don't need the Nvidia card any more. Starts to get involved, but would be less so if Sony would give us something more solid than that comfy video. And if I had a deeper knowledge of computers!
Grazie wrote on 8/6/2013, 2:45 AM


IBC is only a Month away . . . . tick...tock...tick...tock...

- g

ushere wrote on 8/6/2013, 2:47 AM
all i know, and i don't know much, is my i7 laptop with hd4000 screams through renders with quick sync leaving my cpu in the dust.

unfortunately nb titler etc., DOESN'T work with hd4000, so that's a pita.

it looks (as ever 'looking' into scs's thinking can be) that there might be some movement towards working with the new gen of intel chips with gpu's in them. personally i wouldn't be sorry to see:

a. the back of all the mystery surrounding video cards (either nvidia and/or amd) and their idiosyncratic working / non-working drivers

b. something from scs, regarding gpu rendering / playback that actually works consistently across the board - and as advertised

c. an end to physically separate, 'huge', power hungry, noisy cards

d. a 'standard' that developers (nb, boris, etc) can confidently work with (ofx comes close).

of course, i'm probably pipe-dreaming - i'm sure scs will implement their own idiosyncratic variation and we'll be back to trying to figure out why everything works in one system yet not in an identical one....
Serena wrote on 8/6/2013, 3:26 AM
I have the Sandy Bridge HD3000 and that worked very well -- the HD4000 is much better (so I read). Adding a GPU card disables the on-board graphics. Resolve isn't much impressed by my GTX560Ti and I use Vegas with GPU accel off. Intel's philosophy with the i7 boards was much tighter integration of GPU and CPU with cache sharing, which sounds like a good philosophy. NB titler isn't much loss!
farss wrote on 8/6/2013, 3:34 AM
Where are we with all this now that we're entering the world of 10 bit and beyond video?
How to get 10 bit video out of Vegas to our monitors??

Bob.
Grazie wrote on 8/6/2013, 4:42 AM
What I'd like to know is where does that leave all those with QUADROs and the nVidia support they're getting? And where does that leave INTEL and those needing support as a result of SCS's new best friend? Are we seeing SCS covering ALL the bases as needs be?

This throws up many more questions than there are answers available to be matched to in Michael's video. However, maybe, just maybe SCS has at long last wrangled the type of INTEL input that's needed to give us Grunts the support without the need to go to the GPU?

Interesting.

- g

farss wrote on 8/6/2013, 8:53 AM
Grazie said:
[I]"However, maybe, just maybe SCS has at long last wrangled the type of INTEL input that's needed to give us Grunts the support without the need to go to the GPU?"[/I]

The point of OpenCL is that it's "open"; it shouldn't matter where it runs: in the CPU, the GPU or a hall full of monkeys with abacuses :)

On the other hand, maybe the Intel gurus have given SCS a few pointers on how to make calls to OpenCL so things don't get messed up.

In reality I think the video is just a piece of fluff that we shouldn't read too much into at all. When I think back over the years to all the things we've been told SCS are taking on board and we've been invited to go along for the ride I don't have much hope at all. First it was BMD, then they were on the nose, go with AJA. Then we were told of grand plans for some AMD - SCS future, nothing came of that. Then it was NVidia. I'd love to see SCS have just one simple vision of working with one 3rd party and getting that sorted with proper user documentation and support. Oh for the days when all we had to think about was the right Firewire chip, sigh.

Bob.
bill-kranz wrote on 8/6/2013, 10:34 AM
I am building up a new PC with an Intel i7-3770 CPU, and I am favoring an as-yet-unbought Nvidia 770 graphics card.

This should suffice for my copy of Studio 12, and I am hopeful it will also run the Pro 12 I plan to install in 4-6 months.

A lot of these parts are still expensive for me; even the OS will be over $100.00. RAM can cost more than the video card.

However I have been quite happy with both Sony products, no real problems with the base-level video documenting I do. I do have a chance to buy into the BM Pocket Cinema Camera and that will obviously put me in a different category of input, production and output.

Looking forward to the ride!

Bill
Dan Sherman wrote on 8/6/2013, 11:08 AM
Sorry,...but what is OpenCL?
I know GPU, CPU, IOU, NFL, NBA, LBGT and NHL!
videoITguy wrote on 8/6/2013, 11:11 AM
RE: Subject: New Update!!!
Date: 7/23/2013 5:25:41 PM ---- Notable fixes/changes in version $BuildNumber$670

- Added an Optimize GPU display performance check box to the Video Preview tab in the Preferences dialog when Windows Graphics Card is selected in the Device drop-down list. We recommend leaving this check box selected. You can clear the check box if you experience system, external monitor, or video preview performance problems.
- Improved AMD GPU performance when using Catalyst 13.x (format of driver version number changed).
- Improved Intel GPU performance when using driver 9.18.10.3070 or newer.

bill-kranz wrote on 8/6/2013, 11:12 AM
Reposted from a Wiki page...

Open Computing Language (OpenCL) is a framework for writing programs that execute across heterogeneous platforms consisting of central processing units (CPUs), graphics processing units (GPUs), digital signal processors (DSPs) and other processors. OpenCL includes a language (based on C99) for writing kernels (functions that execute on OpenCL devices), plus application programming interfaces (APIs) that are used to define and then control the platforms. OpenCL provides parallel computing using task-based and data-based parallelism. OpenCL is an open standard maintained by the non-profit technology consortium Khronos Group. It has been adopted by Intel, Qualcomm, Advanced Micro Devices (AMD), Nvidia, Altera, Samsung, Vivante and ARM Holdings.

For example, OpenCL can be used to give an application access to a graphics processing unit for non-graphical computing (see general-purpose computing on graphics processing units). Academic researchers have investigated automatically compiling OpenCL programs into application-specific processors running on FPGAs,[1] and commercial FPGA vendors are developing tools to translate OpenCL to run on their FPGA devices.[2]
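To make the "kernels" idea above concrete, here is a minimal conceptual sketch in plain Python. This is not the real OpenCL API: in practice the kernel would be written in OpenCL's C99-based language and dispatched by a driver/runtime. The names `saxpy_kernel` and `enqueue_nd_range` are illustrative stand-ins, the latter loosely modelled on OpenCL's `clEnqueueNDRangeKernel`.

```python
# Conceptual sketch of OpenCL's data-parallel model, in plain Python.
# A real OpenCL runtime would compile the kernel for a CPU, GPU, DSP,
# etc., and run one "work-item" per global id across compute units;
# here we just simulate the dispatch loop serially on the host.

def saxpy_kernel(gid, a, x, y, out):
    """One work-item: out[gid] = a * x[gid] + y[gid] (classic SAXPY)."""
    out[gid] = a * x[gid] + y[gid]

def enqueue_nd_range(kernel, global_size, *args):
    """Stand-in for clEnqueueNDRangeKernel: invoke the kernel once per
    global id 0..global_size-1. The runtime is free to run these in
    parallel because work-items don't depend on each other."""
    for gid in range(global_size):
        kernel(gid, *args)

x = [1.0, 2.0, 3.0, 4.0]
y = [10.0, 20.0, 30.0, 40.0]
out = [0.0] * len(x)
enqueue_nd_range(saxpy_kernel, len(x), 2.0, x, y, out)
print(out)  # [12.0, 24.0, 36.0, 48.0]
```

The point the quote makes about heterogeneous platforms is exactly this separation: the kernel describes the per-element work, and the runtime decides which device actually executes it.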
Serena wrote on 8/6/2013, 7:16 PM
>>>How to get 10 bit video out of Vegas to our monitors??<<<<

The path is through a BMD DeckLink (options/secondary viewer): click 10-bit output after selecting 32-bit video. My HP DreamColor grading monitor demands RGB into its DisplayPort for 10-bit input (or it reverts to native gamut), so I have to feed the DeckLink output through a BMD HDLink Pro 3D DisplayPort. Don't know about the needs of other 10-bit monitors.
videoITguy wrote on 8/6/2013, 7:41 PM
Serena, your point is monitoring 10-bit video digital intermediates in a 32-bit Vegas project setting, correct? Well understood. BUT from Vegas, what are you using as a render codec for final output that is anything more than an 8-bit rendered file?
Dan Sherman wrote on 8/6/2013, 7:46 PM
Oh.
Editguy43 wrote on 8/6/2013, 8:50 PM
What did you think of the guy using the touchscreen laptop? Going from screen to keyboard seems like a waste of movement (around 34 seconds in). Not sure it would make editing better.

Paul B
Serena wrote on 8/6/2013, 9:02 PM
>>>>what are you using as a render codec to final output that is any more than 8bit rendered file ? ?<<<<<

I use the 10-bit Cineform codec, which renders 10-bit avi files out of Vegas. To date I've been using 8-bit source files (PMW-EX1 internally recorded) but transfer these to 10-bit 4:2:2 for post. So getting 10-bit into the monitor hasn't been necessary and I've been happily working through a DVI RGB path. Now I'm moving into the BMD cinema camera/Resolve world and while it isn't essential to put 10-bit on the monitor (Resolve will output 8-bit for monitoring), it is preferable for grading. Resolve will not output to a secondary monitor via a graphics card (excuses based on monitor quality) so a DeckLink card is needed. My initial error was buying a BMD Intensity Pro card, which works but is 8-bit (although the BMD website implies 10-bit).
Serena wrote on 8/6/2013, 9:08 PM
Incidentally, I've received the following advice on the subject of GPU/Vegas/Resolve which might be of interest to others.

"GPU support in Vegas is really tricky because there are many factors that can contribute to issues. The card, the driver version and any plug-ins that require GPU acceleration must all match up exactly.

I believe the reason why SCS can easily optimise for Intel integrated GPUs is that there is little variation between the models.

Resolve on the other hand is a lot simpler and easy to recommend cards for. The Nvidia 560Ti card you have will give okayish performance in Resolve, with only 1 or 2 nodes applied. You may struggle to get realtime HD 1080 playback with many nodes applied. The GeForce cards actually outperform the Quadro cards in the price-vs-performance area. In fact, the Quadro 4000 would offer next to no performance benefits over your existing 560Ti in Resolve.

If you are looking for better performance, any of the Nvidia 7xx series cards will be fine for realtime HD 1080 playback in Resolve, up to a decent number of nodes.

Resolve will not run or use any Intel integrated GPU."
rmack350 wrote on 8/7/2013, 1:26 PM
Paul,

This is an Intel video, not an SCS video. They're showing off Intel initiatives like Ultrabooks with touch.

The moment I thought funny was the quick shot of opening the color corrector and immediately switching to the horizontal view. Seems the operator there didn't like the vertical layout. What a surprise.

Rob
Byron K wrote on 8/7/2013, 1:57 PM
There are some moderate performance enhancements according to Tom's Hardware:
http://www.tomshardware.com/reviews/core-i7-4770k-haswell-review,3521-3.html

"designed specifically for the professional."
@ 1:55 who uses page peel and page roll on a regular basis?? Sales and Marketing pukes... uhhhg ((;

...OK I confess, I used page roll once in my friend's baby's 1st birthday video... 3 years ago... man, I could have used the power of OpenCL back then. (;
TheRhino wrote on 8/8/2013, 5:11 PM
RE: "The moment I thought funny was the quick shot of opening the color corrector and immediately switching to the horizontal view. Seems the operator there didn't like the vertical layout. What a surprise."

I HATE the vertical color wheels with a passion and the need to continually select "custom" even when I go back and re-visit a clip that was already edited...

This is why I still start-out projects in V10e and only switch to V12 when I need to... If V13 provides the option to lock-in the horizontal color wheels I will definitely buy it. Otherwise I might pass on the next upgrade unless the render speeds blow me away...

Edit: Intel needs to give us an 8 or 12-core desktop CPU. My 3 year-old 6-core 980X is still very competitive with these "4th generation" CPUs...

rmack350 wrote on 8/8/2013, 6:41 PM
The vertical color wheels are useless on a laptop. There's no excuse for them to default to this layout.

Rob