10-bit color displays - impossible in Vegas 12?

stephenv2 wrote on 1/29/2014, 8:31 PM
I've just upgraded my setup with nVidia K5000 cards and Dell 10-bit UltraSharps connected via DisplayPort 1.2. 10-bit color is working great in Photoshop and Premiere Pro.

However, testing reveals that the higher bit-depth settings in Vegas still only give you 8-bit display color. There is almost no information I can find on this - I've seen a few posts claiming it might be possible with a Blackmagic card, but that would lose acceleration, I believe. And I can't find anyone who has tested this objectively.

I'm using a ramp gradient file specially created for revealing 8-bit vs. 10-bit.

Comments

TheHappyFriar wrote on 1/29/2014, 8:38 PM
Are the Adobe programs using the GPU to accelerate the output? If not, it could be that the 3D math part just handles 8-bit output. You could try disabling acceleration in Vegas.
videoITguy wrote on 1/29/2014, 8:41 PM
Could be that you are confusing several things involved in getting 10 bits.
1) A hardware chain that is able to pass and display 10 bits of info from a file that has 10 bits of information inside its container.
A) Does your video monitor display 10 bits if it is genuinely fed 10?
B) Does the hardware path between monitor and computer bus support transport of a 10-bit channel - in other words, the video card, the connector and the cable structure? This is where the 10-bit accuracy of the Blackmagic Intensity comes into play - as a piece of hardware it will not restrict 10 bits. But it all depends on the entire hardware path, not just a single piece.
2) File structure is important - most codecs that encode video do so within an 8-bit clamp, so even if you are processing a 10-bit source it is going to come out of the scenario as just 8 bits for monitoring (a quick way to check what a file actually carries is sketched just below). Encoding of graphics and still photos is yet another matter.
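
As a side note on point 2, here is a minimal sketch (my own suggestion, assuming ffprobe from FFmpeg is installed and using Python to call it) for checking whether a clip really carries 10-bit data inside its container:

```python
# Sketch: ask ffprobe (FFmpeg) which pixel format / bit depth a clip really carries.
# A pix_fmt such as yuv422p10le indicates 10-bit data; yuv420p indicates an 8-bit clamp.
import subprocess

def probe_bit_depth(path):
    result = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=pix_fmt,bits_per_raw_sample",
         "-of", "default=noprint_wrappers=1", path],
        capture_output=True, text=True, check=True)
    return result.stdout

print(probe_bit_depth("clip.mov"))  # "clip.mov" is just a placeholder name
```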

Please post your gradient file that details 8bit versus 10bit - so we can see what you are playing with.
stephenv2 wrote on 1/29/2014, 11:32 PM
Yes, both Photoshop and Premiere are using GPU acceleration to provide the 10-bit.
stephenv2 wrote on 1/29/2014, 11:37 PM
I'm not confusing anything. 10-bit color display comes from Quadro cards via DisplayPort + Windows 7 to a 10-bit display, in this case U3014s.

Adobe, Dell and NEC all make apps to test 10-bit support. You can get a ramp.psd file via Google, or create the test image yourself by following these steps (a small script that does the same thing is sketched after the steps):

In Adobe Photoshop CS6 create new file
900 px wide and 300 px tall
RGB color and 16 bit
Use gradient tool to create gray scale gradient from RGB: 64/64/64 to RGB: 96/96/96
Save the file as a 16-bit TIFF file
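
If you'd rather script it than use Photoshop, here is a minimal sketch of the same test image in Python, assuming the numpy and tifffile packages (my tool choice, not part of the original recipe):

```python
# Minimal sketch: generate the 900x300, 16-bit RGB ramp from 64/64/64 to 96/96/96.
import numpy as np
import tifffile

width, height = 900, 300
# Scale the 8-bit levels 64..96 into 16-bit code values (x257 maps 255 -> 65535).
ramp = np.linspace(64 * 257, 96 * 257, width).astype(np.uint16)  # one row of the gradient
gray = np.tile(ramp, (height, 1))                                # repeat the row vertically
rgb = np.stack([gray, gray, gray], axis=-1)                      # R = G = B, neutral grey
tifffile.imwrite("ramp16.tif", rgb)                              # save as a 16-bit TIFF
```

On an 8-bit path the 32-odd input levels show up as visible bands across the 900 pixels; on a true 10-bit path the ramp should look essentially smooth.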
videoITguy wrote on 1/30/2014, 4:40 AM
Read my first post again - you have not got the concept as I see it.
megabit wrote on 1/30/2014, 9:28 AM
Is your project pixel format 8 or 32 bit?

Piotr

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

videoITguy wrote on 1/30/2014, 9:33 AM
You are importing a Photoshop .PSD file with 10 bits recorded?

Then what is your project specification - 8-bit or 32-bit toggled?

Do you have "match media to project" toggled?

Are you evaluating the trimmer window, the preview pane, or are you redirecting output to a specified port?

Are you evaluating a rendered video? If so, which render type?
If rendered, what playback system are you evaluating in - Windows player or VLC?
larry-peter wrote on 1/30/2014, 9:35 AM
The AJA Kona series allows for 10-bit output to a monitor, but the performance with Vegas is sketchy unless "recompress edited frames" is enabled, which I always assumed took you back to 8-bit.
robwood wrote on 1/30/2014, 11:26 AM
don't think any version of Vegas displays at 10bit;
pretty sure it's 8bit only.

however, i've read you can import a 10-bit file to V12 and
have the extra range show up on the scopes (working 32-bit)...

http://blog.davidesp.com/archives/category/10-bit-depth

...so Vegas12 can work with what's there.
megabit wrote on 1/30/2014, 1:12 PM
I asked whether the OP uses the 8-bit or 32-bit pixel format because I know it is certainly possible to maintain 10-bit color in VP12 renders with the latter, provided the correct color space is used. If Vegas can render out 10-bit color resolution files, why wouldn't it be capable of displaying them (with the whole HW chain 10-bit capable)?

Piotr

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

musicvid10 wrote on 1/30/2014, 1:51 PM
I think the preview engine in Vegas is 8 bit RGB.
Never heard of it being capable of anything else.
Encoders are a different matter.
farss wrote on 1/30/2014, 2:12 PM
I would look into the recently added ACES pipeline. If the marketing is honest then I believe it would be mandatory that 10bit values can be sent to a display, probably only via HD-SDI or 3G SDI though.

Bob.
NormanPCN wrote on 1/30/2014, 3:05 PM
[I]"If Vegas can render out 10 bit color resolution files, why wouldn't it be capable of displaying it (with all HW chain 10-bit capable)?"[/I]

Rendering has nothing to do with the Windows graphics system, so it is free of the historical color-support limits of the Windows graphics APIs. Vegas is an old, mature app, and depending on how it is set up internally, deep color could be a fair bit of work to support. The existing standard/historical drawing APIs are not used for color beyond 24-bit.

It's not difficult to do deep color, but as stated, depending on the internal design it may require touching a lot of code, and therefore carries more danger of introducing bugs. Also, if Sony has designs on doing a Mac version of Vegas, as with Sound Forge, moves like this would be pushed off until that change is made, since SCS would likely develop or adopt a cross-platform GUI subsystem and then simply design deep-color support into that subsystem natively. Two moves for the price of one.
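
To make the display-side part concrete (my own illustration, not something SCS has described), here is a rough sketch of how an application can ask the driver for a 10-bits-per-channel framebuffer through OpenGL, which is roughly the path Quadro-class drivers expose; whether you actually get 10 bpc depends on the GPU, driver, connection and monitor:

```python
# Hedged sketch: request a 10-10-10-2 framebuffer using the glfw and PyOpenGL packages.
# If the query below comes back as 8, the driver fell back to ordinary 24-bit color.
import glfw
from OpenGL import GL

glfw.init()
glfw.window_hint(glfw.RED_BITS, 10)    # ask for 10 bits per color channel...
glfw.window_hint(glfw.GREEN_BITS, 10)
glfw.window_hint(glfw.BLUE_BITS, 10)
glfw.window_hint(glfw.ALPHA_BITS, 2)   # ...plus 2 bits of alpha (10-10-10-2)
window = glfw.create_window(640, 480, "deep color test", None, None)
glfw.make_context_current(window)
print("red bits granted:", GL.glGetIntegerv(GL.GL_RED_BITS))
glfw.terminate()
```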
astar wrote on 1/30/2014, 4:28 PM
Seems odd that 10-bit support with Vegas would be such an issue. How can Sony support pro-level codecs without supporting this? It would seem to be just a matter of the hardware chain details and getting away from consumer-grade gear. Does anyone have a working configuration they care to post?
megabit wrote on 1/30/2014, 4:33 PM
As Bob says - with 32bit pixel precision and ACES color space it should be possible.

Piotr

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

stephenv2 wrote on 1/30/2014, 10:17 PM
Sorry, been offline today. But videoITguy, just try https://www.google.com/search?q=10+bit+color+windows+7

Several good sites where you can download or create gradient files plus test 10-bit support on your end.

I did some further testing. No combination of ACES, 32-bit, GPU on/off or preview settings seemed to get rid of banding. I used a 16-bit TIFF I made, the ramp.psd file from the Google link above, and a Sony gradient solid I made. All banded at every setting I tried. Both the TIFF and the PSD work perfectly in Premiere and Photoshop when they are set to 10-bit.

If someone has a Blackmagic or AJA card with a 10-bit monitor and can test, that should at least let us know whether true 10-bit display works.

I do know Vegas can process greater than 8-bit, as I've used that a number of times - but it would be nice to figure out a way to see it, especially since DisplayPort and 10-bit monitors mean you don't have to spend huge money to get a greater-than-8-bit display.
DiDequ wrote on 1/31/2014, 3:11 AM
If you have no dark or bright pictures, your eyes will probably not see the difference on a 10-bit screen.

8 bits is 16 million colors, much more than your eyes can see.

Most people can hardly see a 0.5% difference between two colors.

This does not mean 10-bit monitors are not useful... they are more accurate, calibrations have to be better, so the general quality will be better.

I suppose people spending hours calibrating their 8-bit monitor will get results that are just as good... under specific temperature and external lighting conditions. 10-bit monitors are very probably less temperature dependent, but no better when the external light sources vary minute after minute.

Am I right or wrong ?

Didier.
farss wrote on 1/31/2014, 4:47 AM
[I]"8 bits is 16 millions colors, much above what your eyes can see"[/I]

Kind of wrong. With only 8 bits per channel you cannot represent the full gamut of colours the human eye can see without issues. There's quite a good argument that even three channels is not enough and that four would be better. Unfortunately, splitting the visible spectrum into 4 is the easier part; building such a sensor has been a challenge.

I think what your argument comes from is [I]3 channels of 8 bits each, along with a compelling story, is enough for us to suspend disbelief[/I], and that is demonstrably true.

My position here is quite the contrary: I wonder how many channels, of how many bits, and at what resolution does it become difficult to suspend belief? I think this is what the engineers should be working towards - to give the artist the best possible palette to work with. We've done this with sound; it's much harder with vision, but that's no excuse not to aim for the ideal.

Bob.
DiDequ wrote on 1/31/2014, 5:08 AM
"Kind of wrong. With only 8 bits per channel you cannot represent the full gamut of colours the human eye can see without issues"

The full gamut is one problem; the resolution/perception inside it is another.
A 0.5% difference is about the finest a standard eye can distinguish. Farss, you might have better eyes than I do!

Both 10-bit and 8-bit displays share the same portion of gamut. Inside it, you will not notice any difference - just try!!! 8 bits per channel is far enough inside this common area.

Going outside the smallest (8-bit) range while remaining inside the biggest (10-bit) is what I mentioned when speaking of dark and bright colors. And the human gamut is much wider - yes, I do know this - so a 10-bit monitor will never display all that you can see.

And we DO use a fourth channel, but it is used for transparency.

Didier.
DiDequ wrote on 1/31/2014, 6:36 AM
Just to explain better what I'm saying above:
The picture below shows some text. Save it and open it with Photoshop (GIMP is fine too, as it is an 8-bit picture).
If you increase the contrast, say +80%, you will see the text very clearly.

Can you see this text better on a 10-bit monitor? I can see it on my cheap Belinea o.display and my Asus 120 Hz monitor because they are calibrated - both 8-bit monitors. But I must look very carefully, otherwise I do not see anything.

And the background only uses one color; a video shows a lot more colors and shapes, which makes the difference harder to see.

Now, use Photoshop with a 10-bit neutral grey, decrease the color difference between background and text, and display it on your 10-bit monitor. Convinced?

In the end, the videos you sell will be displayed mostly on 8-bit TVs or monitors.
What you see on your 10-bit monitor is not what your customers will see. This is not really WYSIWYG.

My opinion is: 10-bit monitors are well suited to people printing pictures, or just to being happy with very sharp pictures others cannot see. As we say in French, qui peut le plus peut le moins - who can do the more can do the less.
Vegas Pro runs in a video world, so it's OK to be happy.
farss wrote on 1/31/2014, 6:54 AM
[I]"0.5% difference is the maximum a standard eye can see, Farrs, you might have better eys than I do !"[/I]

0.5% of what though?
Our eyes have a dynamic range of around 30 stops. If we want to recreate that, methinks we'll need a lot more than 8 bits per channel :)

Of course current LCD screens are incapable of anything like that dynamic range but OLEDs are getting us closer to the limits of our visual perception.

[I]"And we DO use a fourth channel, but it is used for transparency."[/I]

Indeed, but our cameras only use R, G and B. That apparently poses a challenge in accurately capturing the colour of some monochromatic light sources such as LEDs. Some work was done on 4-colour sensors such as the Foveon, but from memory that introduces another set of problems. My understanding is that the issue lies in the dyes used in front of the photo detectors and in crosstalk.

Bob.
Marco. wrote on 1/31/2014, 7:23 AM
If we talk about the dynamic range of the human vision "system", we must clearly distinguish between the range achieved through its many different adaptation processes and the range available without any adaptation.

Without any adaptation in the human "system", 8-bit video is capable of carrying about (just under) half the human range. In theory 9 bits would be sufficient to carry the whole range (assuming you used the full range of 9 bits). In practice you'd need at least 10 bits.
DiDequ wrote on 1/31/2014, 8:59 AM
[I]"0.5% of what though?"[/I]
According to Wikipedia, it is only 6.5 stops, not 30 - and 8 bits is enough...
This is the problem: nobody really knows. I suppose some specialists have studied the human eye, and they do not agree with each other.
We are only users.
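
Just as a back-of-the-envelope illustration of why the estimates differ so much (my own rough sketch, assuming linear coding and the 0.5% figure above - real video uses gamma or log curves, which changes the numbers a lot):

```python
# Rough sketch: how many ~0.5% just-noticeable steps fit into N stops of dynamic
# range with linear coding, compared to the code values 8 and 10 bits provide.
import math

def steps_needed(stops, weber_fraction=0.005):
    linear_ratio = 2 ** stops                      # dynamic range as a light ratio
    return math.log(linear_ratio) / math.log(1 + weber_fraction)

for stops in (6.5, 14, 30):
    print(f"{stops:>4} stops -> ~{steps_needed(stops):.0f} steps "
          "(8 bits = 256 codes, 10 bits = 1024 codes)")
```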
So, with my picture above, can Vegas users with a 10-bit display see the difference in my greyscale text?
Theoretically, according to this gamut, yes - but these gamuts do not reflect our human eyes' response!


[Gamut diagram: A, B and C represent the same color - A is a yellow in the sRGB gamut, B in the Adobe RGB gamut, C in the ProPhoto gamut.]


Or there might also be a sales reason: everything must be better with a 10-bit display, otherwise why would we buy 10-bit monitors?

Je me fais l'avocat du diable - I'm playing devil's advocate. Of course, a 10-bit display is very nice and I would appreciate working with one...


Ah, 0.5% is the ratio between the two greyscale levels I used in my grey text above. You can check it with a color picker - 765 (3 × 255) is 100%.

Our eyes are not proportional: they have more difficulty perceiving the same percentage difference between different colors.
Here is a 0.7% and a 1.5% difference in yellows compared to the background...


Didier.
larry-peter wrote on 1/31/2014, 9:29 AM
The number of colors the eye can supposedly see is irrelevant because perception doesn't work that way. It includes the brain/mind connection. Just as detail perception is concentrated where ATTENTION is placed, so is the color resolution. You can claim that an 8 bit camera can see more colors than the human eye, but when I put my attention on the colors of a sunset I don't see banding. My 8 bit camera does.