Comments

Coursedesign wrote on 3/3/2007, 10:23 AM
That has been discussed here many times over the years, and today Vegas is the last major NLE to be 8-bit only.

Actually, to compare oranges with oranges: Vegas's 8 bits are in RGB space, while when NLEs such as FCP and Premiere Pro use 8 bits, that's in YUV space (I use "YUV" loosely here; do a search in this forum or buy a 1,249-page book on color theory if you want the whole enchilada).

8 bits in "RGB space" is about equivalent to 9 bits in "YUV space". This means that Vegas can do a little bit better in 8-bit format, apart from conversion issues back and forth (cameras use YUV, etc.)

It is when the others move into 10-bit video, 16-bit video, and even 32-bit float that Vegas gets left behind completely and just can't do the job.

I have been originating in 10-bit for 2+ years now, so I have been looking for ways to work with Vegas and to figure out where the limitations might be.

You can capture in BMD's capture app and do straight cuts in Vegas without losing your bits.

As you have seen though: as soon as you do any grading, transitions, effects, etc., you're chopped back to 8-bit.

As best I can figure out without having my head inside the design team huddle, it seems that the 8-bit limitation is not caused so much by Vegas itself as by the chosen Windows architecture.

I am not keeping up on this as much as I used to, but it is possible that Microsoft's video APIs are living in the past, and the only way for Vegas to go high-bit today is through QuickTime, like Premiere Pro.

je@on wrote on 3/3/2007, 12:38 PM
It's times like these when we'd like to hear from someone at Sony Madison...

*crickets*
winrockpost wrote on 3/3/2007, 3:29 PM
Since I know not a soul at the Sony Vegas think tank, I can't answer the question. But I have a question: does 8-bit, 10-bit, 24-bit or whatever have anything to do with color correction on DV or HDV video, and with my eyes or my clients' eyes seeing the difference?
Serena wrote on 3/3/2007, 4:09 PM
It's important to recognise that 8 bits per channel is 24-bit RGB colour (as in JPEG images and the capabilities of Photoshop 7). 8-bit colour in total would be very limited indeed. Colour is subordinate to luminance, and it's the range of the latter that matters and that is read out of every pixel.
farss wrote on 3/3/2007, 7:50 PM
Conversely, most half-decent digital still cameras let you record RAW, and working with that is quite a different experience from working with 24-bit RGB. Of course very, very few display devices can display more than 8 bits (most LCDs are more like 6-bit), but that's not what it's about; it's about preserving more of the tonal range of the scene so you have more to work with in post.
How much does it really matter? Well, all technical arguments aside, when your competition has a major bullet point and you don't, it matters. When several of the newer cameras just will not work with Vegas, it matters. Even Sony's own Digital Betacam is 10-bit, and running Betacam SP through 8 bits doesn't seem to be a good thing.

When all is said and done, the reverse question is probably the one to ponder: is there any compelling reason not to support 10-bit (or more)?

Bob.

willlisub wrote on 3/3/2007, 8:19 PM
I think as Vegas editors grow and move to more expensive projects, there will be a percentage of them who will have to switch or add another editing station that can handle 10 bit video.

10-bit (whether absolutely justified or not) is being bandied about more and more these days as a standard for high-end video. Many of us know that Vegas is an extremely capable and desirable editing program even if the rest of the world doesn't understand or agree.

As more people jump into HD, I think the "importance factor" of 10 bit might increase even more.

I continue to fight the urge to buy a Final Cut system, but if we don't see 10-bit sooner rather than later, I will probably give in at some point and purchase FCP, or something else, instead of another Vegas station.

I would probably do some side-by-side comparisons with HDV coming from the Canon XH-A1 (which looks pretty sweet when lit properly). If 10-bit editing helps at times, then it would be hard not to switch to, or add, a 10-bit system.

Sony, if you are listening, while I don't have a clue how much work or effort it would take to make Vegas 10 bit, I sure hope you will move it up your list of "must have" features. You have such an incredible editing package, I don't know how you could not want to update it to 10 bit. I think it's growing in importance.

It sure would be nice if you'd give us some feedback on this...

GlennChan wrote on 3/3/2007, 8:55 PM
"is there any compelling reason not to support 10 bit (or more)?"
1a- The likely reason is that Vegas is designed to work with the Video for Windows architecture, with VfW plug-ins, plus its own extensions to VfW plug-ins.

To support higher bit depths, it would need to change over to the DirectShow or QuickTime architectures. Users may also want backwards compatibility with VfW plug-ins.
*Also, Vegas already supports QuickTime alongside VfW, so supporting DirectShow in a similar way may not be that difficult? On the other hand, QuickTime codecs are fairly slow in Vegas, and that's no good.

There may be some nitty gritty implementation details that I'm not aware of.

1b- Performance may be slower with 10-bit or higher. You need to push 16-bit (or 32-bit) numbers around instead of 8-bit numbers, which doubles or quadruples memory bandwidth and memory usage.
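
Back-of-the-envelope numbers (a quick Python sketch, assuming an uncompressed 4-channel 1920x1080 frame with no padding):

```python
# Rough memory cost per uncompressed 1920x1080 RGBA frame at various bit depths.
width, height, channels = 1920, 1080, 4
for bits in (8, 16, 32):
    mib = width * height * channels * (bits // 8) / 2**20
    print(f"{bits:>2} bits per channel: {mib:5.1f} MiB per frame")
```

At 25 or 30 frames per second, with several streams plus compositing buffers in flight, that difference adds up quickly.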

1c- A new engine may introduce new bugs. For example, FCP has lots of bugs when it comes to 10-bit. Do you really want that? (Although it could be said that FCP has lots of bugs in all areas.)

1d- The most-used filters have to be re-programmed to work with the new engine, and (re-)optimized.

2- There are some advantages to supporting higher bit depths.

- New codecs will likely be supported better in DirectShow than in VfW. CineForm is a good example.

- If Vegas moves to 32-bit float rendering, it can support linear light processing (which makes compositing and a lot of effects look very noticeably better), out-of-range colors (practically all video cameras generate these), and the xvYCC wide-gamut color space. The first two will likely be useful.
Linear light processing can significantly improve video quality; see the sketch after this list.

- Handled intelligently, a 32-bit float engine would hopefully resolve the studio RGB versus computer RGB issues that Vegas has.

- Support for 10-bit and higher formats.
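
To make the linear light point above concrete, here is a tiny sketch (plain Python, with a simplified pure-power gamma of 2.2 standing in for real video transfer curves; it is not how Vegas or any particular NLE does it) of a 50/50 dissolve computed on gamma-encoded code values versus on linear light:

```python
# A 50/50 dissolve between a dark and a bright pixel, done two ways:
# on the gamma-encoded code values, and on linear light.
GAMMA = 2.2

def to_linear(v):    # gamma-encoded 0..1 -> linear light
    return v ** GAMMA

def to_gamma(v):     # linear light -> gamma-encoded 0..1
    return v ** (1.0 / GAMMA)

dark, bright = 0.1, 0.9
naive = 0.5 * dark + 0.5 * bright                                    # blend the code values
linear = to_gamma(0.5 * to_linear(dark) + 0.5 * to_linear(bright))   # blend the light
print(f"gamma-space blend:  {naive:.3f}")
print(f"linear-light blend: {linear:.3f}")
```

The two results come out as visibly different shades, which is why dissolves, glows, blurs and compositing can look noticeably different in a linear-light 32-bit float pipeline.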

3- On the other hand, one could argue that Vegas should stick to what it's good at and not screw that up.

Do you really want higher quality if it will slow things down (and Vegas already isn't too too fast)??

Will changing Vegas' engine make the program more complicated to use? (Although currently, the studio RGB versus computer RGB issue is very confusing and not obvious.)

How many Vegas users really need the higher bit depth? (Counterpoint: SDI is 10-bit; 8-bit Y'CbCr needs ~10-bit R'G'B' to avoid losing information when performing color space conversions; the Red and SI cameras will benefit from higher bit depth.)
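
On the color space conversion point, here is a very coarse check (a quick Python sketch using the BT.601 matrix and sampling every 5th code value; again, nothing to do with Vegas internals) of how much of the nominal 8-bit Y'CbCr range lands outside 0-255 R'G'B' and would simply be clipped by a pipeline that can't carry out-of-range values:

```python
# Coarse sampling of the nominal 8-bit Y'CbCr ranges (Y' 16-235, Cb/Cr 16-240),
# converted to full-range R'G'B' with the BT.601 matrix, counting how many
# combinations fall outside 0-255.
def ycbcr_to_rgb(y, cb, cr):
    yl = (y - 16) * 255 / 219                              # luma to full range
    pb, pr = (cb - 128) * 255 / 224, (cr - 128) * 255 / 224
    r = yl + 1.402 * pr
    g = yl - 0.344136 * pb - 0.714136 * pr
    b = yl + 1.772 * pb
    return r, g, b

total = out = 0
for y in range(16, 236, 5):
    for cb in range(16, 241, 5):
        for cr in range(16, 241, 5):
            total += 1
            if any(c < 0 or c > 255 for c in ycbcr_to_rgb(y, cb, cr)):
                out += 1
print(f"{out} of {total} sampled Y'CbCr codes fall outside 0-255 RGB "
      f"({100 * out / total:.0f}%)")
```

On this coarse sampling it comes out to roughly three quarters of the combinations, which is one reason an 8-bit RGB intermediate is such an easy place to lose information.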

4- It would be nice if Vegas could have all of the following: faster performance, higher bit depth rendering, and the ability to run on any system without weird bugs.
Going with GPU acceleration might give you the first two, but not the third.
Sticking with CPU rendering may not give you the first.
The status quo doesn't give you the higher bit depth (and doesn't give you faster performance either).

So it might be tough.
Coursedesign wrote on 3/3/2007, 9:50 PM
FCP does have a few extra bugs when working with 10-bit, but really nothing extraordinary. It really works just fine, and it's used for 10-bit editing every day by large numbers of pros worldwide.

FCP does indeed have more bugs than Vegas, even a lot more.

Now take a look at what FCP can do and what Vegas can do.

I wonder if Vegas would win a "bugs/features" comparison...?

FCP is so much more capable, even more so if you're comparing Final Cut Studio with Vegas+DVDA.

The Video for Windows architecture is an antique that is really holding back progress both here and in other areas.

I can understand Madison's reluctance to start over with a new architecture, but if they don't, I think there will only be Vegas Movie Studio left, while the pro customers graduate to FCP and Premiere Pro, where the basics have been taken care of for some time now.

Performance is becoming less of an issue, especially now that we have several years of experience with GPU acceleration (in numerous other products).

The APIs for this are finally sufficiently well defined, the stability is good enough for the most demanding professional use, and the performance boost is dramatic even with $100 consumer GPUs.

And Sony corporate is pushing xvYCC wide gamut color space big time in big screen TVs (and making good money because of it, because it makes for a much richer-looking picture).
Serena wrote on 3/3/2007, 10:01 PM
>>>>is there any compelling reason not to support 10 bit (or more)?<<<

Absolutely every reason for more. Trying to fit real-world scene luminance into 8 bits takes a lot of care (graded filters, scrims, etc.) and generally there isn't time. Even if your final display device can only manage a 6-bit luminance range, you need to be able to choose how the scene is displayed on it. But when whites are burnt out and blacks have gone sooty, your options in post are reduced. A good display in a dark room probably shows a 12-bit range (I haven't checked, but that's typical 35mm cinema spec), so more than 10-bit recording depth is far from excessive.

EDIT: Probably most of you know that CCD video chips can record to a greater depth than 8 bits, but the signals put out to tape are limited to that. I have a CCD camera in my astronomy kit that uses a Sony monochrome video chip (how old is that??) and I get 16-bit luminance depth in those images. The same-generation colour chip gave 12 bits. Of course there is a rider in this: the chips are cooled to reduce thermal noise, otherwise they would be useless for long exposures. Obviously video requires fast readout, so it doesn't necessarily follow that the full capabilities could be employed in a video application. Farss probably has the technical knowledge to elaborate.
GlennChan wrote on 3/3/2007, 10:33 PM
"And Sony corporate is pushing xvYCC wide gamut color space big time in big screen TVs (and making good money because of it, because it makes for a much richer-looking picture)."
I don't think xvYCC will be relevant for a while... it'll take a few years before there are enough wide-gamut displays to make it worthwhile to grade something for xvYCC. I don't believe current grading systems and monitors even support xvYCC yet.
rmack350 wrote on 3/3/2007, 11:33 PM
The real call for 10-bit/channel is competitiveness. When people are looking at Red or the SI camera and getting big ideas, they sure as heck aren't going to consider Vegas.

I see this every day in my own shop, where we use PPro. Even though we acquire on DVCAM, everything has been going into those systems over SDI as 10-bit. The boss/editor/etc. has been convinced that this is the way to go. Later this year there's a fair chance we'll finally replace our DSR-500 with a Red One. Again, we'll be looking at 10-bit.

Now, there are serious problems with the hardware we're using with PPro, and the boss is aware of Vegas, but without 10-bit color there isn't a chance in the world we'd ever use it.

The potential of 10-bit color processing is a big selling point. Granted, the core of Vegas users don't need it, but most users would like to imagine themselves needing it.

Rob Mack
farss wrote on 3/4/2007, 1:59 AM
Rob,
Amen.

And as for xvYCC, err, I already have two cameras that shoot it, dinky little consumer things. Kind of funny: I could edit footage from a $150K Sony camera but not from a $1K one.
And next week I'm going to the local Sony launch of the xvYCC camera and display product line; it sold out very quickly, so I think it'll be very relevant real soon. Is it all smoke-and-mirrors stuff? Probably, but try telling that to some guy who just bought the Sony spiel and wants his footage edited.

Bob.
Coursedesign wrote on 3/4/2007, 9:59 AM
2007 will be the year of expanded color space.

In addition to Sony pushing it to great advantage in their video cameras, big screen TVs, and new generation of Luma LCD monitors, there are a bunch of new screens from Dell and Apple with greatly expanded color space that will start shipping shortly.

Many other manufacturers will follow; this is not a temporary fad.

Resistance is futile! :O)