Levels 16-235 not really needed anymore?

wwjd wrote on 12/31/2013, 2:43 PM
First, and most importantly, I do understand most BROADCAST GEAR has the limitations of 16-235 levels.

That said, I don't make anything for industry broadcast, so I guess my question is self defeating.

Do I ever need to bother with LEVELS 16-235 for anything? Even if I DID make something that ended up broadcast, it could be easily and quickly adjusted for that.

I haven't seen any problems using wider levels on any modern systems I test on: 3 different tvs, bunches of computer tests, youtube, vimeo, Android, iPads, playback on PS3 and Xbox, even played at a local theater digital projector.... and being wider levels didn't make any difference.

Last project I did, I used 16-235 and it messed up my ASPECT MASKING badly, not to mention making it look a little washed out, so I had to push contrast past the limits - and it STILL doesn't look as rich as I wanted... but that was the goal: mess with level limits and see what it does.

So.... why bother with levels 16-235 limiting?

Discuss?

Comments

Marc S wrote on 12/31/2013, 2:50 PM
Hmmmm.... I always conform my videos that are destined for DVD and YouTube to 16-235 and it comes out right. The YouTube player stretches the levels to 0-255, expecting the source to be 16-235.
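If that stretch is a plain linear remap of the 219 studio codes onto the full 255, the math works out like this. A minimal sketch in Python (the function name is mine, not anything the player actually exposes):

```python
def expand_studio_to_full(code):
    """Map a studio-range (16-235) 8-bit luma code to full range (0-255).

    Out-of-range values are clipped first, which is effectively what an
    expanding player does to super-blacks and super-whites.
    """
    code = min(max(code, 16), 235)
    return round((code - 16) * 255 / 219)
```

Studio black (16) lands at 0 and studio white (235) at 255; anything left above 235 in the source is simply clipped away.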
larry-peter wrote on 12/31/2013, 3:27 PM
It completely depends on the codec used for rendering. Some expect 0-255, some 16-235 to produce proper blacks and whites. Glenn Chan has participated in many threads on this forum and IMO he has studied and documented the use of levels in Vegas more than anyone. Search for any of his threads.

Vegas' handling of levels is praised by some and cursed by others, but the end result is it's up to the user to provide the correct levels for the rendered format.
VidMus wrote on 12/31/2013, 7:10 PM
Vegas has test patterns that are 16 to 235 in levels.

Use those with your desired codec in a test project and see if the FINAL result, be it YouTube or whatever, displays them correctly.

Make another test project using your videos and then also see if the FINAL results are displayed correctly or not.

The reason for test patterns is not so much to check what happens between the beginning and the end; it is to check that the FINAL result (the target) is correct, i.e. the same as the beginning.

The FINAL result (target) should be equal to what you started with, be it 16 to 235 or 0 to 255.
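That round-trip idea can be sketched numerically: expand the studio range to full range, compress it back, and check that every code value of a ramp "test pattern" survives. The function names here are illustrative helpers, not Vegas features.

```python
def expand(code):
    """Studio range (16-235) to full range (0-255), with clipping."""
    return round((min(max(code, 16), 235) - 16) * 255 / 219)

def compress(value):
    """Full range (0-255) back to studio range (16-235), with clipping."""
    return round(16 + min(max(value, 0), 255) * 219 / 255)

# A ramp "test pattern": every legal studio code value exactly once.
ramp = list(range(16, 236))
round_trip = [compress(expand(c)) for c in ramp]
```

If the chain is lossless at 8 bits, round_trip comes back equal to ramp; any level shift or clipping along the way shows up as a mismatch.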

And unlike some here, do not use the waveform monitor incorrectly just so it will match a different tool! I use that waveform monitor and Vectorscope extensively to make sure my videos are within range knowing what my final results should be.

The great compliments I get on my videos are proof that I am doing things right.

I shoot 60p, 16 to 255, and convert to 16 to 235 and 30p. I use zebra stripes on my cameras to make sure people's faces are no more than 70 IRE on exposure. I set the clip properties to disable resample with an undersample rate of 0.500 to get 29.970 fps, and the project properties also to 29.970 fps for NTSC instead of 59.940 double NTSC. Deinterlace method is set to none.

I now render using the Sony MP4 at 5,000,000 bps instead of using an intermediate and HandBrake, and just MainConcept for DVDs. Vegas resizing is still not the best, but having started with the highest-quality progressive videos, all that extra work is no longer justified and the results I get are great using the above settings!

Because Vegas does not properly mark 7.5 IRE on the waveform monitor, I use Studio RGB (16 to 235) and leave the 7.5 IRE setup unchecked in the settings, so that 16 is at zero IRE and 235 is at 100 IRE. Above 235 is above 100 IRE, and below 16 is below zero IRE and out of range.
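With that convention the code-to-IRE reading is a straight linear map: 16 → 0 IRE, 235 → 100 IRE. A quick sketch (my own helper, not a Vegas function):

```python
def code_to_ire(code):
    """IRE reading of an 8-bit luma code under the Studio RGB convention
    (16 = 0 IRE, 235 = 100 IRE).  Out-of-range codes read below 0 or
    above 100 IRE instead of clipping, matching the scope behaviour
    described above.
    """
    return (code - 16) * 100 / 219
```

So a face exposed around code 169 reads roughly the 70 IRE zebra threshold mentioned above, and code 255 reads about 109 IRE.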

Tip: You can use masking to isolate a person's face in Vegas to see exactly what level they are at and adjust accordingly.

Another tip: Vegas does not always display codecs properly, and it does not even always show correctly whether a video is progressive or interlaced. MediaInfo does not always give the correct information either.

If you try to de-interlace a progressive video it can look like it has jagged edges.

Sometimes you unfortunately have to temporarily disregard the tools and make tests for the final results. Make tests and KNOW what you are working with. Then you will know how to use the tools accordingly.

I hope all of this helps.

musicvid10 wrote on 12/31/2013, 7:52 PM
Here 'ya go.
Top one is 0-255 for image files and RGB codecs, bottom one 16-235 for YUV.
Perfectly calibrated to match Vegas' scopes.

The only thing that's changed since analog days is that broadcasters are more insistent on brickwall 16-235 luminance and chroma levels, rather than the chroma slop that was allowable before. As far as digital playback and web delivery, it's going to make those choices for you, so proceed wisely.
http://dl.dropboxusercontent.com/u/20519276/dualgray1.png



Marco. wrote on 12/31/2013, 9:00 PM
There are lots of myths still around.
EBU-based digital broadcasting nowadays passes a dynamic range of 1 to 254 in most cases, though it is still advised to deliver programmes according to EBU Recommendation R 103, which describes a tolerance of -1 to 103 % for the matrixed luma sum of R', G' and B', and -5 to 105 % for R', G' and B' individually.

Almost any properly adjusted video hardware is meant to handle a range of 16-255 (this is also true for DVD and Blu-ray Disc delivery).
The Flash player that YouTube and other web services use regularly stretches 16-235 up to 0-255, but a Flash video in full-screen mode respects the hardware-acceleration setting and may leave the dynamic range untouched, so it can pass 0-255 without any clipping (it then depends just on the setting of your graphics device). If your graphics device is set to use a range of 0-255 for video playback, you will easily see the difference when playing back a YouTube video and switching from the smaller player window into full-screen mode.

So in many cases you are fine using a dynamic range of 16-255. Where the Flash player is involved you'd better restrict yourself to 16-235, though in one particular case (full-screen mode) you could also use 0-255. On a computer display with other software players it is always bound to the settings of the player and of the graphics device: some use 16-235, some use 0-255. I don't know if there are graphics devices which allow a 16-255 setting, which would make the most sense, imho (to match regular video devices).
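The R 103 tolerances quoted at the top of this post can be turned into a simple per-sample legality check. A sketch, with inputs already expressed as percentages of nominal range (the function name is mine; the limits are the figures quoted above):

```python
def r103_legal(r, g, b, y):
    """Check one sample against the EBU R 103 tolerances: R', G', B'
    within -5..105 % of nominal range, matrixed luma within -1..103 %.
    All inputs are percentages.
    """
    components_ok = all(-5.0 <= c <= 105.0 for c in (r, g, b))
    luma_ok = -1.0 <= y <= 103.0
    return components_ok and luma_ok
```

A broadcast legalizer is essentially this test plus clipping or compression of whatever fails it.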

Whatever you do, never mix up "reference white" and "peak white"; they are two entirely different things.
musicvid10 wrote on 12/31/2013, 9:58 PM
Flash has changed levels handling? Must be quite recent.
Would appreciate some insight.
YouTube uses your system's Flash player, so delivery levels should be consistent.
farss wrote on 1/1/2014, 1:52 AM
Doesn't the video driver also play a role here?
I know I recently got myself confused by telling Vegas to send 16-235 to my secondary display device while my NVIDIA driver was doing the same thing.

Bob.
DiDequ wrote on 1/1/2014, 4:45 AM
Probably this limitation was due to the technology used years ago.
Today, almost any calibrated flat-panel monitor can display 0-255 levels.
Same with TVs set to "cinema" mode.
If well calibrated, you can get more contrast and more colors with 0-255 levels.
The gamut is wider.

Probably the future will be 0-255 levels only.

I agree with those here saying that as long as your result matches your source, it is OK. You can then do color/contrast/brightness correction if needed.
Marco. wrote on 1/1/2014, 5:24 AM
I think the Flash player has always managed levels this way. There has been some excitement around this in recent years because many people saw the luma change when switching from the small player to full-screen mode, but didn't know why it happened.

Test it. Set your graphics device to pass through 0-255, play back a YouTube video (or any Flash video), ensure hardware acceleration is enabled (via the Flash player's right-click menu) and switch from the small player mode into full-screen mode. You certainly won't see a difference if your graphics device is set to expand 16-235 levels to 0-255.
Marco. wrote on 1/1/2014, 5:25 AM
Yes, the setting of the video driver is important.
VidMus wrote on 1/1/2014, 5:33 AM
I wish everything would go to 0 to 255 so we could stop dealing with all of this nonsense!
Marco. wrote on 1/1/2014, 6:05 AM
Yes, there have been some changes with the move from analog to digital broadcasting.

Though it is now very likely that any up-to-date monitor is capable of displaying 0-255, black level is still considered to be set at RGB 16 for several reasons (noise processing is one of them).
Usually a PLUGE test signal is used for monitor calibration, with three black bars: one at 0 % level, one slightly below and one slightly above. A well-calibrated monitor will show the difference between the 0 % bar and the one slightly above, but no difference between the 0 % bar and the one slightly below.
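The PLUGE idea reduces to three code values bracketing studio black, where a calibrated display should distinguish only the bar above black. A sketch in numbers, with illustrative offsets (real PLUGE signals such as the BT.814 pattern use specific small percentages):

```python
STUDIO_BLACK = 16  # 8-bit studio-range black

# Three bars around black; the +/-4 code offset (about 2 % of the
# 219-code range) is illustrative, not the exact standard value.
pluge_bars = {
    "below_black": STUDIO_BLACK - 4,  # should crush to black
    "black": STUDIO_BLACK,
    "above_black": STUDIO_BLACK + 4,  # should be just barely visible
}
```

If the below-black bar is visible, brightness is set too high; if the above-black bar disappears, it is set too low.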

White is quite a different thing. In most cases official recommendations (like the EBU recs) advise setting reference white to 80 to 120 nits (in some cases even a bit higher, like 130 or 140 nits). Now look at what monitors' available peak levels are … they reach from 50 % up to more than 200 % above the recommended reference white level. That is a lot of "headroom" for whites brighter than reference white, and we are welcome to use it wisely whenever we know no expansion from 16-235 to 0-255 will be applied by players or drivers.
Marco. wrote on 1/1/2014, 6:08 AM
Though there is some sense in it (especially in being careful with handling black levels), I just agree with you.
Marco. wrote on 1/1/2014, 6:41 AM
By the way – there's some interesting stuff written by Charles Poynton (again):

"Legalizers should be outlawed"

wwjd wrote on 1/1/2014, 9:54 AM
thanks for all the informative replies, guys.
See, there are a multitude of technologies and playback methods out there, so when I set my monitor up and edit video, and it nearly always looks just great on all my preview test TVs - no obvious clipping at the low or high end - I just don't think I need to worry about it.... UNLESS I run across a preview test that is obviously bad. Definite food for thought.
wwjd wrote on 1/1/2014, 9:58 AM
I agree with the "legalizers should be outlawed" page. It is a different game now. 1s and 0s are very different from an analog signal and should be treated and respected as such.
I guess that is where my question was hatched: is it REALLY even needed ANYMORE, as in current days?
musicvid10 wrote on 1/1/2014, 10:13 AM
"Doesn't the video driver also play a role here?"

Exactly, Bob. That is exactly what Marco is telling us: adjusting graphics settings on certain cards gives different screen results. Neither Flash nor BT.709 has changed a whit, and I'll confirm that later today. Granted, newer Nvidia products have a market share, but the vast majority of the world still uses something else.

"You certainly won't see a difference if your grafic device is set to expand 16-235 levels to 0-255."
Marco, relatively few of us have those controls at hand, and plenty of people misuse them because they're not understood.
Just another dangerous toy for teenagers afaiac.
https://www.google.com/search?q=nvidia+damages+gpu+and+screens&ie=utf-8&oe=utf-8&aq=t&rls=org.mozilla:en-US:official&client=firefox-a

The simple fact is the vast majority of the world watches its video on 7 billion mobile devices, not Nvidia-enabled PC's. It's worth mentioning that most of those devices don't have Flash, either. Delivering 16-235 for faithful playback is still highly recommended, even moreso than in the past.
VidMus wrote on 1/1/2014, 12:27 PM
"The simple fact is the vast majority of the world watches its video on 7 billion mobile devices, not Nvidia-enabled PC's. It's worth mentioning that most of those devices don't have Flash, either. Delivering 16-235 for faithful playback is still highly recommended, even moreso than in the past."

There are those who also watch videos using streaming devices such as Roku, Apple TV, etc.

HD on the big screen without time spent making Blu Ray!

I can watch all of my Vimeo videos on my big screen using streaming to my Roku.

As for 16-235 vs 0-255 I really do wish there were just one standard across all devices so we no longer have to deal with this mess! Problem is, there always seems to be one company that has to mess things up just for the sake of thinking differently.


musicvid10 wrote on 1/1/2014, 12:37 PM
There is one standard: BT.709 decoding for BT.709 source. The encoding flags are in the file headers. There is no ambiguity there. RGB is pretty much a non-player these days.
Unfortunately, acquisition levels are impossible to mandate, and I'm sure Nvidia made its choices quite deliberately, to sell stuff to morons who will never get it right.*
Home streaming solutions are usually compliant. Capture and recording, not so much.

*Present company is excluded from that comment, of course. To be clear, I consider everyone here to be lifelong learners, just as me. If that wasn't so, we wouldn't be here.
farss wrote on 1/1/2014, 3:38 PM
Marco said:
[I]"Though there is some sense in it (especially in being careful with handling black levels)"[/I]

Exactly what "black" is, is easy enough to define: no photons.
It's at the other end of the scale where all the action is. "Reference White" as defined by a chart certainly isn't the brightest thing that can end up in a shot unless you've got a reasonable budget and take care.

The highlight issue isn't anything new. Rec. 709 has black at 16 and white at 235. Compare that to Kodak's 10-bit Cineon definition, where reference black is at code 95 and reference white at 685; that leaves a lot more code values to store things brighter than "white".
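The difference in highlight headroom is easy to put in numbers. A rough sketch, assuming the usual Cineon convention of roughly 90 code values per photographic stop:

```python
# Rec. 709, 8-bit: reference white and the room above it.
REC709_WHITE, REC709_MAX = 235, 255

# Cineon, 10-bit log: Kodak's reference white and the container maximum.
CINEON_WHITE, CINEON_MAX = 685, 1023
CODES_PER_STOP = 90  # approximate, from the 0.002-density-per-code log curve

rec709_headroom_codes = REC709_MAX - REC709_WHITE
cineon_headroom_stops = (CINEON_MAX - CINEON_WHITE) / CODES_PER_STOP
```

That is only 20 codes above reference white in 8-bit Rec. 709, versus roughly 3.75 stops of log headroom in Cineon.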

Bob.
DiDequ wrote on 1/1/2014, 4:36 PM
1. Most people do not use calibrated displays.
2. Most people use a web browser that does not use the default ICC profile.
https://ie.microsoft.com/testdrive/graphics/colorprofiles/default.html

For these people, using the 0-255 range, 16-235 or anything else does not matter at all. No need; it's "donner de la confiture à des cochons" (like casting pearls before swine).

1. Fortunately, most screens use a default sRGB ICC profile loaded by Windows that gives quite correct pictures. But there is no ICC profile on smartphones or tablets (millions of people concerned).

2. Some web browsers have color management disabled by default (Firefox...).
Furthermore, few people know that only a few applications use ICC profiles.
Displaying a photo with the built-in Microsoft software gives you awful results. How many people use IrfanView with its correct settings to replace the MS photo viewer ("visionneuse de photos")?
Even Vegas Pro 12 is not perfect: you have no choice; it uses the default ICC profile.
Rendering a video using the full 32-bit feature can give you oversaturated still pictures and videos - it took me some time to be able to render with this 32-bit mode!

We are lucky we cannot print videos (just photos), because things get worse there: we would have to convert RGB (or sRGB) to CMYK using other ICC profiles, and use a special monitor ICC profile to see how our print will look (gamuts are widely different, and in some cases calibrating a printer is not enough).

There is one big advantage to using the 16-235 setting as a baseline: if you want to simulate a fluorescent color, you can push R, G or B (or combinations like R+G, R+B, G+B) above 235. Your eyes will be fooled because they take white to be 235-235-235. You can try it on a still picture in Photoshop or GIMP, filling the background with a 235-235-235 color and adding a 50-255-255 color (or anything else above 235).

3. Who cares about the lighting of the room? Viewing a video outside or in your bedroom will give people different results.
I could omit temperature problems, but I will not: a YouTube video seen by someone outside in winter in Montreal will look different on the same device in Bamako, Mali.

4. We cannot make our own ICC profiles for our TVs; we can only calibrate them (we cannot profile them).
Most people use the dynamic setting. For those people you should probably use a 50-200 setting...

You members of this forum are concerned with this.
In French, we say "qui peut le plus peut le moins" (who can do the most can do the least). You do know that.

0-255 or 16-235? Again, if you are happy with your result, that is fine, because most people don't care - for a YouTube video.

For a DVD or Blu-ray film, things are different - I have no choice with the Sony AVC/MVC "render as" option: it uses 16-235 levels!

Music is the same with MP3, Ogg or any compressed file - some untrained people even prefer listening to MP3s, just because it is easier on our ears.

Didier.
john_dennis wrote on 1/1/2014, 5:39 PM
"[I]For a Dvd or blueray film, things are different - I have no choice with the Sony avc/mvc "render as" option : it uses 16-235 levels ![/I]"

I don't understand what you mean. With a source in the range 0-255, I observed that nothing in the Sony AVC Blu-ray render template corrects the output by force. It's up to the editor to adjust the levels.

Example
musicvid10 wrote on 1/1/2014, 5:54 PM
Player luminance and icc profiles are two different things. You'll really muddy the waters if you confuse the two.
;?)