Comments

farss wrote on 7/4/2012, 9:12 AM
The only way to know for certain is to test it.
It's not entirely easy with Vegas; I did it for something else like this.

In a 32-bit project, create an 8-bit greyscale and multiply it by 25% grey. Check with the waveform monitor set to cRGB while doing this.
Render out to your 10-bit codec. Bring the result back into a 32-bit project and use gain to lift the "white" of the greyscale back to 100% on the scope. If the bottom end of the greyscale isn't truncated to 8-bit resolution, your 10-bit values are being written and read OK.
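
A rough numpy sketch of what this round-trip checks (the quantize step stands in for the render, so it simulates the pipeline rather than Vegas itself):

    import numpy as np

    # farss's test: dim a greyscale ramp to 25%, "render" it by quantizing
    # to the codec's bit depth, then gain it back up by 4x and count steps.
    ramp = np.linspace(0.0, 1.0, 4096)   # high-precision source ramp
    dimmed = ramp * 0.25                 # multiplied by 25% grey

    def render(signal, bits):
        # quantize a 0..1 float signal to an integer codec of `bits` depth
        levels = (1 << bits) - 1
        return np.round(signal * levels) / levels

    for bits in (8, 10):
        restored = np.clip(render(dimmed, bits) * 4.0, 0.0, 1.0)
        print(bits, "bit render:", len(np.unique(restored)), "distinct levels")

    # An 8-bit bottleneck leaves only ~65 distinct steps (visible banding on
    # the scope); a true 10-bit path leaves ~257, a smooth greyscale again.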

All in all, Vegas and industry standards such as 10-bit are just too much drama. Having a 32-bit pipeline but very little in the way of 10-bit media generators and scopes is kind of half-baked.

Bob.
robwood wrote on 7/4/2012, 12:00 PM
"is Vegas getting 10 bit data on to the timeline?" -Dustin

i think the problem isn't getting 10-bit on the timeline as much as getting it OFF the timeline... Vegas will import some 10-bit media, but i think that even if you render to a 10-bit codec, it's still rendering an 8-bit file to that codec.

i would like very much to be wrong about this.

is there documentation anywhere saying Vegas can maintain a 10-bit pipeline/workflow/etc? i would love to see that too, given the amount of CC work i do.
videoITguy wrote on 7/4/2012, 12:12 PM
Hmm 8 bit!

Use the following search term for threads in this topic

"32-bit full range" without the quotes
Rain Mooder wrote on 7/4/2012, 1:08 PM
I did the farss test with a 32-bit project. Greyscale ramp multiplied by 0.25 and rendered out to
a) Quicktime 32 bpp with DNxHD set to 220 Mbit, 10-bit
b) Quicktime 32 bpp with DNxHD set to 220 Mbit, 8-bit
c) AVI with Cineform, YUV 4:2:2 High profile
and then multiplied by 2.0 to look at the ramp.

The (b) option looked like 8-bit in the RGB scopes; the (a) option looked like 8-bit but with bad rounding now and then. The picture looked a bit worse, so I would stick with (b) given the chance. (c) looked nearly perfect: the Cineform is very close to the original ramp. How amazing the world would be if the B.M. Shuttle could record to Cineform AVI or MOV.

So the answer to my question is probably no. Either Vegas can't write or can't read 10-bit DNxHD at full depth, and it's likely it can do neither.
farss wrote on 7/4/2012, 4:42 PM
"Vegas will import some 10-bit media, but i think that even if you render to a 10-bit codec, it's still rendering an 8-bit file to that codec."

I've only tested this with the 10-bit variant of the Sony YUV codec. It certainly seemed to work correctly with that codec.

As far as I know, though, nothing in this regard is documented.
Given that it's a nightmare to get an EDL out of Vegas, and that the only alternative to sending your movie to a post house for grading is to render to a DI, the lack of attention to detail in this area is not encouraging.

Bob.
fausseplanete wrote on 7/12/2012, 4:35 AM
I started doing some experiments on a few NLEs, including Vegas 10 (64-bit), looking for banding in the NLE's own Waveform Monitor. These are reported at http://blog.davidesp.com/archives/573, though the Avid experiment is not complete (as I say in the blog, <<I assume I am missing something here, some knowledge and/or step and/or monitoring method>>). Any helpful feedback would be appreciated.

I conclude from these experiments that Vegas 10 (64-bit) does not (at least the way I use it) see the extra two bits. I informed one of the 10-bit recorder manufacturers, who agreed that they may in future provide on their website a sample clip of a narrow-range ramp to help people check that their NLE and its configuration can see and make use of those extra bits.

I guess that getting an advantage out of having the extra bits for recording is yet another moment for thoughtfulness, involving camera and/or scene configuration and even post-processing such as Neat Video. My main wish for 10-bit is to make narrow-band (underexposed or part of wide-latitude) footage more gradeable (which in certain cases means "more recoverable"). More bits means more margin, which is generally a good thing for live events.
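
Back-of-envelope numbers for that margin, as a minimal sketch (linear-light arithmetic; gamma-encoded footage softens the exact counts):

    # Each stop of underexposure halves the code range actually used.
    for stops in range(5):
        print(stops, "stops under:", 256 >> stops, "levels left in 8-bit vs",
              1024 >> stops, "in 10-bit")

    # Two stops under, 8-bit leaves ~64 levels to grade with, while 10-bit
    # still has ~256, the granularity of a correctly exposed 8-bit file.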

I wonder if a manufacturer-independent 10-bit forum exists anywhere.
Andy_L wrote on 7/12/2012, 9:22 AM
Guys forgive me if this is redundant, but are you saying that Vegas, even in 32-bit project modes, is still rendering an 8-bit file on export, rather than 10-bit where appropriate for any given codec?

videoITguy wrote on 7/12/2012, 9:50 AM
YES YES YES - it has been dealt with for years in this forum, there are many technical experts here who have testified to the process of managing workflows in VegasPro with colorspace, and proper range for videoscope analysis and why you might use 32-bit mode.

Using VegasPro as a compositor is the issue - not the output bit resolution.
robwood wrote on 7/12/2012, 11:27 AM
"...one of the 10-bit recorder manufacturers may in future provide a sample clip of narrow-range ramp to see and make use of those extra bits... - fausseplanete

that would be greatly useful.
robwood wrote on 7/12/2012, 11:34 AM
...there are many technical experts here who have testified to the poorly implemented and confusing process of managing workflows in VegasPro with colorspace... -videoITguy

fixed.
Laurence wrote on 7/12/2012, 12:52 PM
This sort of thing is why I was so into the versions of Cineform that included First Light. Just awesome, except that I finally gave up after spending hours looking for the random black frames that occur when I use the Cineform codec.

Any real-world experience on whether the random black frame problem is actually fixed? I understand that officially it's fixed, but users are still reporting it.
rmack350 wrote on 7/12/2012, 4:35 PM
"...but i think that even if you render to a 10-bit codec, it's still rendering an 8-bit file to that codec." - robwood

I just did a couple of tests setting the project to 32-bit float and then making a few renders to new tracks. If you click on a few clips in the project media you get this sort of feedback from Vegas:

Sony YUV 10 bit : 1280x720x30
Uncompressed: 1280x720x32
DNxHD 10 bit: 1280x720x32.

The Sony YUV seems like it must have been encoded at 10 bits/channel, which makes me assume that the others were encoded at 8 bits/channel (with alpha).
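
That inference is just arithmetic on the WxHx(bits-per-pixel) readout. A minimal sketch, with the channel layouts below being assumptions rather than anything Vegas documents:

    # Decode Vegas's "1280x720x30"-style readout into a plausible layout.
    for name, bpp in [("Sony YUV 10 bit", 30), ("Uncompressed", 32),
                      ("DNxHD 10 bit", 32)]:
        guess = "3 x 10-bit channels" if bpp == 30 else "4 x 8-bit (RGB+A)"
        print(name, "->", bpp, "bpp, most plausibly", guess)

    # 30 bpp only factors cleanly as 3x10, pointing at real 10-bit Y'CbCr;
    # 32 bpp reads most naturally as 8 bits per channel plus alpha.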

Once upon a time, Uncompressed would have been identified by Vegas as 1280x720x128. Maybe that's changed, or maybe I'm missing something in my encoding...I don't know. The next thing to try is to confirm that these are actually 8bpp or 10bpp encodings.

But I suspect that 10bit data isn't making it into the DNxHD encoding.

Rob
Andy_L wrote on 7/12/2012, 6:39 PM
YES YES YES - it has been dealt with for years in this forum, there are many technical experts here who have testified to the process of managing workflows in VegasPro with colorspace, and proper range...

Aren't we talking about two different things here? It's perfectly possible to output a file that conforms to video levels as either 8 or 10-bit, right?

If I'm understanding the thrust of this thread, it's that Vegas isn't always preserving a 10-bit file from import through export when it's supposed to.

Yes?
videoITguy wrote on 7/12/2012, 7:04 PM
The question this thread is trying to answer, IMO, is: can VegasPro take a 10-bit source, process it on the timeline, and output the result as a true 10-bit file? The answer to that as I see it is NO.

Because while VegasPro can uptick some things (codecs, etc.) to a 10-bit value, it is only taking 8 bits into an extrapolation phase; this is not the same as 10 bits in and 10 bits maintained throughout. It's also confusing because Vegas can extrapolate processing to 32 bits for better compositing scenarios, yet the output remains, in fact, 8 bits of precision.
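
A minimal sketch of that distinction (the shift-based uptick is an assumption; the point is the level count, not Vegas's exact method):

    import numpy as np

    src8 = np.arange(256, dtype=np.uint16)  # every level an 8-bit source holds
    upticked = src8 << 2                    # "extrapolated" into the 10-bit range
    native10 = np.arange(1024, dtype=np.uint16)

    print(len(np.unique(upticked)))   # 256  -> still only 8 bits of precision
    print(len(np.unique(native10)))   # 1024 -> true 10 bits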
rmack350 wrote on 7/12/2012, 7:51 PM
The answer to that as I see it is NO...

I think I'd want to see it demonstrated that this can't be done. I can't prove what Vegas is or isn't doing in the next 15 minutes but I think it's capable of reading 10bit files, processing them in 32-bit float, and then writing 10-bit files.

However...and this is a big BUT...I don't know if this works with Quicktime. The only time I've ever seen evidence that this was working was with AVI files.

I've never seen documentation on this from SCS but maybe it exists. I haven't gone out of my way to find it.

Rob
malowz wrote on 7/12/2012, 9:43 PM
my tests show the same behavior as mentioned...

2 codecs can input/output 10 bits, "Sony 10-bit YUV" and Cineform... i mean, if you ignore the BLACK FRAMES I GOT, and the 2 times THE VIDEO JUST WENT BLACK (http://i.imgur.com/J9UxZ.png) out of nowhere... yeap, they're still there... i'm using the latest version of vegas/cineform (downloaded the codec this week)

as mentioned also, older vegas rendering uncompressed yielded a video with the same bit depth as the project; now it's only 8 bits in any mode.

i tested with a 16-bit black<>white gradient made in photoshop, and a raw image, reduced and saved as 16-bit tiff (vegas correctly reads it as 48-bit tiff).

tested DNxHD 10bit, uncompressed, Canopus HQX, Sony YUV, Sony 10-bit YUV and Cineform.

applied a strong "S" curve: the 8-bit codecs result in strong banding and a "full of gaps" histogram, while the 10-bit codecs give a much better, smoother histogram.
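
Roughly what that histogram test measures, as a numpy sketch (the smoothstep curve and the round-trip quantize are stand-ins for the real S-curve and codecs):

    import numpy as np

    def s_curve(x):
        # a strong "S" contrast curve on 0..1 values (smoothstep stand-in)
        return x * x * (3.0 - 2.0 * x)

    gradient = np.linspace(0.0, 1.0, 1 << 16)           # 16-bit-style ramp
    for bits in (8, 10):
        levels = (1 << bits) - 1
        decoded = np.round(gradient * levels) / levels  # codec round-trip
        hist, _ = np.histogram(s_curve(decoded), bins=256, range=(0.0, 1.0))
        print(bits, "bit:", np.count_nonzero(hist == 0), "empty bins of 256")

    # The 8-bit pass leaves empty bins (the "gaps" = banding); the 10-bit
    # pass fills the 256-bin histogram smoothly.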

Sony 10-bit YUV shows 30 bits in properties, but cineform shows 32 (24 + alpha); yet even though it reports as 8 bits, the video gives 10-bit results.

vegas should by now have more advanced import/export modules, to allow more flexibility with codecs/decoders on the timeline, but i believe such a "radical" advance might result in a buggier app. i'm fine with no GPU acceleration and 8-bit I/O with 32-bit processing in the middle when needed. want 10 bits? use sony yuv 10-bit. 2TB HDs are cheap ;)

the only remaining question is: can vegas correctly open a 10-bit source video from the camera?
Laurence wrote on 7/12/2012, 10:27 PM
Does the QuickTime software extension supplied by Apple, used by all Windows programs (including Vegas) that read and write QuickTime, even work at all at ten bit depth? I don't think it does. I think you have to use a non-Apple QuickTime format in order to read/write 10-bit color depth. I'm not sure though.
robwood wrote on 7/13/2012, 8:13 AM
Does ... Quicktime even work at all at ten bit depth? -Laurence

you can read 10-bit ProRes 4:2:2 on a PC.
it's possible to write ProRes QT on a PC, but dunno if 10-bit can be.
rmack350 wrote on 7/13/2012, 1:09 PM
Well, I'm home and trying out a render to DNxHD in After Effects. The comp is set to 16-bit and I've just got a box with a gradient moving across the screen. I then do an 8-bit DNxHD render and a 10-bit DNxHD render.

For what it's worth, MediaInfo says that the files claim to be 8bit and 10bit. Vegas reads them as 1280x720x32 and 1280x720x24, but this may be an alpha channel that slipped in somehow. (10bit YUV reads as 1280x720x30)

The two DNxHD files themselves are identical in file size, which seems weird; but maybe my aefx gradient is really only 8-bit, and maybe the DNxHD codec does exactly the same thing with 8-bit data in either the 8- or 10-bit encoding. Or maybe I don't know what I'm doing, which is more likely. I'm not experienced with aefx.

Anyway, Vegas can at least tell that the files are different from within the project media window. And if I render new DNxHD files out of Vegas, MediaInfo also says that the files claim to be 8 and 10bit. Vegas sees them both as "32".

Seems to me that I really need some sort of color counting tool. I think MediaInfo is just reporting what the file says about itself in the header.
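
A crude color counting tool is only a few lines, assuming you can first export a decoded frame as a 16-bit image (the file name and the export step are hypothetical; ffmpeg or a still-frame export would do):

    import numpy as np
    from PIL import Image  # Pillow reads 16-bit greyscale TIFF/PNG;
                           # for 16-bit RGB you may need tifffile/imageio

    frame = np.array(Image.open("decoded_frame.tif"))  # hypothetical export
    if frame.ndim == 2:
        frame = frame[..., np.newaxis]
    for ch in range(frame.shape[-1]):
        print("channel", ch, "has", len(np.unique(frame[..., ch])), "unique levels")

    # Anything that was ever squeezed through 8 bits shows <= 256 unique
    # levels per channel, no matter how many bits the container claims.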

Rob
Laurence wrote on 7/13/2012, 1:42 PM
The question isn't so much whether Vegas can read and write 10-bit files in formats including the QuickTime .mov format. The question is: if QuickTime is involved, are there actual numbers in the ninth and tenth bits, or just zeros in empty placeholders? I suspect that QuickTime is only doing 8 bits, that the two least significant bits are just zeros when it writes 10-bit QuickTime files, and that whatever is in those two positions is truncated and replaced with zeros when a Windows program such as Vegas reads 10-bit QuickTime. But I'm not sure this is the case; I just suspect it. Does anyone know for sure?
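
That suspicion is directly testable if you can dump the decoded 10-bit samples, e.g. with ffmpeg (-i in.mov -pix_fmt yuv422p10le -f rawvideo samples.raw). A minimal sketch, with the file names as placeholders:

    import numpy as np

    # yuv422p10le stores each 10-bit sample in a little-endian 16-bit word.
    samples = np.fromfile("samples.raw", dtype="<u2")

    # Plain 8-bit data shifted into 10-bit containers has two zero LSBs.
    # (A bit-replicating 8->10 conversion would defeat this simple check.)
    if np.any(samples & 0b11):
        print("low bits are in use: genuinely more than 8 bits of data")
    else:
        print("two LSBs are always zero: looks like padded 8-bit data")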
malowz wrote on 7/13/2012, 1:44 PM
how to tell 8 bits from "more than 8 bits":

download this file; it contains two 16-bit images, a gradient and a photo
http://www.mediafire.com/?y25bjqp7nx51jac

set the project to 32-bit full range, gamma 2.222. put the gradient on the timeline and export to the format you want to test.

open the render back up in vegas and apply this curve to the gradient video. see the result in the histogram:

[image: the curve and resulting histogram]
robwood wrote on 7/13/2012, 2:37 PM
nice. thx for the scope info malowz.
apit34356 wrote on 7/18/2012, 12:04 AM
Nice example graph. Not to be too slow on the uptake here, but it appears that Vegas 11 handles 10-bit and 32-bit output well. Of course, viewing 10-bit output is problematic with most LCDs on the market. The compression math appears to preserve the mid-range values but beats up the highs and lows for the sake of file size.
GlennChan wrote on 7/19/2012, 2:57 PM
1- Most people on this forum probably wouldn't benefit from a workflow that maintains at least 10 bits throughout.

The extra bits are useful only when you are dealing with very clean (e.g. computer generated) sources. If you are dealing with camera footage you probably won't be able to see any benefit.

You'll know you have bit depth issues if you are seeing banding artifacts (or dithering artifacts). These issues aren't always caused by Vegas. (e.g. 8 bits in an LCD panel is NOT enough, yet that is exactly the consumer equipment most people on this forum are using. If you can't see the problems on your monitor, then you may want to rethink whether you need more bit depth in Vegas.)

2- You should be able to handle 10 bits if you import and export footage from Vegas over SDI (and, say, use the sony 10-bit codec). But how many Vegas users actually use SDI (you need a Decklink card with SDI in/out)?

2b- If importing media from computer files:
The way codecs work is that they pass data as 8-, 16-, or 32-bit values (e.g. 8-bit unsigned integer, 16-bit signed/unsigned, 32-bit floating point); the processors in desktop computers work fastest with 8/16/32-bit values.
I believe that Vegas only handles 8-bit unsigned integers, so any underlying 10-bit values are converted to 8-bit when they are handed over to Vegas.
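
In other words (an illustration of the hand-off, not SCS's actual code), the bottom two bits are gone before Vegas ever sees the sample:

    v10 = 695        # a 10-bit sample value (0..1023)
    v8 = v10 >> 2    # = 173: what an 8-bit-only interface receives
    back = v8 << 2   # = 692: the best any later stage can reconstruct
    print(v10, v8, back, "lost:", v10 - back)  # up to 3 codes lost per sample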

3- The whole 10-bit thing has to do with how things evolved in the "video" world. By the video world, I mean higher-end post production, where you have decks/VTRs that cost five to six figures, almost everything is connected by SDI (with some things on analog connections), and everything is real-time (a deck plays back and records in real time; there is no waiting for rendering to complete).

Not everything is pretty in the video world. A lot of high end systems handle very few codecs, so sometimes you are ingesting everything through SDI and converting oddball formats into something that will go over SDI. Almost all of the decks and VTRs apply some type of compression... so there are some situations where you can achieve better results on a desktop computer.