Getting ready for 4K editing...

megabit wrote on 3/8/2016, 5:33 AM
Whether it ends up being Vegas or not, I'm about to enter the world of 4K editing. There will of course be a new, fast PC, but I wanted to seek your advice about monitoring...

So far, for HD editing, I've been using a 50" 1080p plasma TV, hung on the wall and tilted down just above my editing desk. Should I follow the same approach and buy one of the newest HDR UHD TVs, or simply a large 40" 4K monitor?

Here are my pros & cons for a TV:
- it can of course double as a UHDTV :)
- some reports claim that the Super UHD / Quantum Dot sets from Samsung use 10-bit panels, but will it be possible to feed them 4K 10-bit 4:2:2 video over HDMI? I doubt it - then again, the UHD BD players are 10-bit per colour and supposedly connect via HDMI, so who knows?
- even though TVs process the picture heavily, most sets do have a Direct (1:1) mode

My pro-monitor arguments:
- it should give a more 1:1 picture
- it can have a true 10-bit panel (important for grading in DaVinci, perhaps in Catalyst in the future)

Advice & thoughts, please

Piotr

PS. Oh, and I'm perfectly aware that a high-end editing suite is all about calibrated monitors, SDI connectivity etc. But please spare me the explanation, as my budget is limited... And with a high-end graphics card (like a Quadro, or even a GeForce), I can build a working 10-bit connection to a relatively cheap monitoring solution - just advise me whether this should be a PC monitor, or whether a higher-end UHDTV will be enough...

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

Comments

Steve Grisetti wrote on 3/8/2016, 7:48 AM
Actually, when editing 4K video, Vegas uses proxy files -- so it's not necessary to pimp out your system in order to handle these files.
megabit wrote on 3/8/2016, 8:32 AM
What if I like pimped systems, but hate generating proxies? Please answer my original question :)

Piotr

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

relaxvideo wrote on 3/8/2016, 9:12 AM
I also hate proxy files - instead, I buy a new CPU for native file editing :)

#1 Ryzen 5-1600, 16GB DDR4, Nvidia 1660 Super, M2-SSD, Acer freesync monitor

#2 i7-2600, 32GB, Nvidia 1660Ti, SSD for system, M2-SSD for work, 2x4TB hdd, LG 3D monitor +3DTV +3D projectors

Win10 x64, Vegas21 latest

Wolfgang S. wrote on 3/8/2016, 1:47 PM
That is wrong - Vegas can use proxy files, but it is not a must!

Do you want to edit 8-bit or 10-bit footage? That will make a huge difference. With a powerful PC, 8-bit is really no issue; 10-bit is trickier.

Vegas can be used to bring a 10-bit UHD picture to a 10-bit UHD monitor. This can be done with cards like Quadros (your beloved R9 390X will only deliver 8-bit) by using the display output. With Blackmagic hardware you can also get a 10-bit output (yes, via HDMI 1.4, using a Decklink 4K Extreme up to 50p or an Intensity 4K up to 30p) - but the BM hardware output is limited to HD only.

For GPUs: to display 10-bit you have to run your project as a 32-bit floating point project with Good or Best/Full preview quality, and that is not very performant in terms of playback from the timeline. Even with a high-end 8-core PC that will be only a limited success. In the end you may edit the footage in 8-bit mode anyway (though if you use 10-bit footage it is still great to render in the 32-bit floating point mode).

You will probably end up concluding that Vegas can be used for a 10-bit workflow - I do that here. But you will not be completely satisfied, since Vegas does not utilize expensive hardware as well as it could. Sorry to say so.

Desktop: PC AMD 3960X, 24x 3.8 GHz * GTX 3080 Ti * Blackmagic Extreme 4K 12G * QNAP Max8 10 Gb LAN * Blackmagic Pocket 6K/6K Pro, EVA1, FS7

Laptop: ProArt Studiobook 16 OLED (i9 12900H with integrated Iris Xe GPU, 32 GB RAM, GeForce RTX 3070 Ti 8GB) with internal HDR preview on the laptop monitor. Blackmagic Ultrastudio 4K Mini

HDR monitor: ProArt Monitor PA32 UCG, Atomos Sumo

farss wrote on 3/8/2016, 3:15 PM
It really comes down to the question of video card capability and the software you're going to use. Also are you thinking UHDTV or 4K DCI?

Bob.
megabit wrote on 3/8/2016, 11:32 PM
To Wolfgang:

It so happens my partner in the Moldflow business is going to equip me with a really strong workstation, so I hope CPU power will not be an issue. And yes - I'm going to edit 10-bit (not exclusively, but often). A top-end Quadro card is perfectly able to pump it into the monitor; I wouldn't like to go SDI, as that means additional expense - not so much for the card as for the monitor (plus, the solution you mentioned is HD only). Also, I'm going to use Resolve for grading and Vegas for final touches (or buy Catalyst Edit at some point, if Vegas is not up to the task and Catalyst has matured by then).

To Bob:

As I said, having spent on the FS7 I'm not willing to pay for a fully-blown professional monitor - there's just not enough cash. And this is what my OP question boils down to:

- do you think the newest SUHD TVs from Samsung (like the UE48JS9000, touted to use 10-bit panels) will provide a full 10-bit path via HDMI 2.0a? The example of their UHD BD player would suggest they will, as the player is said to output 10-bit colour...

- as I said, a 4K PC monitor can have a 10-bit panel for a relatively low price, but if I went this route I'd also have to use HDMI, as any other kind of input would make the monitor price rocket (especially for 4K, if it's available at all)

Not sure what you mean by DCI - but if you mean the 12-bit-per-colour (36 bits per pixel) requirement, then obviously not; the monitor I can afford will probably only allow 10 bits. Please elaborate on your question, Bob - and advise on the pros and cons of an SUHD TV (provided it has a full 10-bit path from HDMI to the display, and the "Quantum Dot" HDR panel is truly 10-bit) vs. a monitor like the one I just described (e.g. the iiyama ProLite X4071UHSU-B1 40").

All in all: an SUHD could double as a TV, which is also important; it would let me see the final material as it will look when delivered to a TV (either from PC software playback, or via a UHD BD set-top player). On the other hand, such a TV's DSP will heavily "improve" the look of my picture, unless the Direct/Natural mode really lets me switch it off completely. A PC monitor will never lie to me this way, will always give more of a 1:1 look, and can be calibrated... So, a tough decision, and it so happens I need to make it in the next few days (I'm going to lease both the camera and the display, and my financial plans require that I start right away, or never) :(

Thank you guys, and waiting for the final verdict :).

Piotr

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

farss wrote on 3/9/2016, 12:11 AM
[I]"
Not sure what you mean by DCI "[/I]

DCI = Digital Cinema Imitative which is not the same resolution or bit depth as UHDTV.

[I]"PC 4K monitor can have a 10-bit panel"[/I]

True, but for that to be of any value you need a card that'll feed it 10-bit and editing software that supports it.

I read you have the FS7; we just bought several FS5s and I'm liking them :)
Thing is, though, off the top of my head they'll record UHD 8-bit 4:2:0 or HD 10-bit 4:2:2. The thing I really like is that out of the box the FS5 will also underscan the sensor, enabling the use of S16 and B4 lenses. There has to be a price to pay, though, in not scanning all of the sensor.

The only honest advice I can give you is none :(
Everything is so fluid now that it's difficult to keep track of it all, let alone form an informed opinion, and I've been involved in some lengthy discussions with people shooting for broadcast. I can tell you they were shooting 4K on an F5, but only for the shots that were going to have compositing work done on them. The rest was Log HD.

Bob.
John_Cline wrote on 3/9/2016, 1:14 AM
My Samsung JS9500 SUHD 4KTV will display 50/60 fps at the following bit depths and chroma sampling:

8-bit @ RGB 4:4:4, YCbCr 4:4:4, YCbCr 4:2:2 and YCbCr 4:2:0
10-bit and 12-bit @ YCbCr 4:2:2 and YCbCr 4:2:0

This is true for the 6401 to 6800, 641D to 700D, 7000, 7200, and 8000 series models.
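
A rough bandwidth check shows why 10-bit and 12-bit only appear next to 4:2:2 and 4:2:0 in that list. This is only a back-of-the-envelope sketch (it assumes the nominal 594 MHz pixel clock for 2160p60 and HDMI 2.0's roughly 14.4 Gbps of usable data rate; HDMI actually packs 4:2:2 a bit differently), but the conclusion matches:

[code]
# Back-of-the-envelope HDMI 2.0 check for 2160p60.
# Assumptions: 594 MHz pixel clock incl. blanking (4400 x 2250 x 60)
# and ~14.4 Gbps usable data rate (18 Gbps raw, 8b/10b coding).
PIXEL_CLOCK_HZ = 594e6
MAX_DATA_RATE_GBPS = 14.4

# bits per pixel = bit depth x average sampled components per pixel
formats = {
    "8-bit RGB / YCbCr 4:4:4": 8 * 3,
    "10-bit YCbCr 4:4:4":      10 * 3,
    "10-bit YCbCr 4:2:2":      10 * 2,
    "12-bit YCbCr 4:2:2":      12 * 2,
    "10-bit YCbCr 4:2:0":      10 * 1.5,
}

for name, bpp in formats.items():
    rate_gbps = bpp * PIXEL_CLOCK_HZ / 1e9
    verdict = "fits" if rate_gbps <= MAX_DATA_RATE_GBPS else "does NOT fit"
    print(f"{name:<26} {rate_gbps:5.2f} Gbps -> {verdict}")
[/code]

10-bit 4:4:4 at 60p is the one combination that doesn't fit, which is exactly why it is missing from the list above.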
megabit wrote on 3/9/2016, 1:17 AM
Bob, after firmware update 3.0 the camera is capable of 10-bit 4:2:2 in multiple modes, including XAVC 4K. Also, I'm aware what DCI stands for 😀. So my question is really simple - what other pros & cons of a TV vs. a monitor as a full-screen display can you think of?

Piotr

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

megabit wrote on 3/9/2016, 3:00 AM
John,

I'm thinking of the JS9000 - I assume it will handle all those signals as well (you didn't mention the 9000 series in your list)?

Also, I don't doubt that HDMI 2.0 will accept all these formats; my question is: can you actually see the difference between 10-bit and 8-bit colour gradation? That would mean the path from HDMI to the display, and the display itself, is indeed 10-bit (at least)... Please let me know as soon as you can. TIA!

Piotr

PS. Oh, and one more thing: which of the functionalities you mentioned do you think I'd lose if I used a simple 10-bit-capable 4K PC monitor instead? I'm asking because they are cheaper than the latest and greatest Samsung HDR TVs... Also, will there be even better PQ in Samsung's 2016 lineup, which uses the "Quantum Dot" branding (while the 2015 models used "S" as in Super in their model names, the 2016 ones will use "K")? I've heard it's just a marketing thing.

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

John_Cline wrote on 3/9/2016, 4:53 AM
Yes, the 9000 series handles all those signals. I am feeding mine from an nVidia Quadro M4000 card through a DisplayPort 1.2a to HDMI 2.0 adapter. Yes, I can see the difference between 8-bit and 10-bit images. The Samsung SUHD "Quantum Dot" TVs have 10-bit panels, are capable of around 1,000 nits of brightness, and the blacks are amazingly plasma-like. SUHD TVs achieve 92% of the DCI-P3 colour space, a coverage that would have been even higher had Samsung not insisted on going cadmium-free (cadmium is an environmentally toxic material). Quantum Dot is like painting with better, purer paint. The human eye (and brain) see pure colours as brighter. For example, the more "pure" the red, the brighter it seems. So a wide-spectrum red, which is pretty typical on most TVs, won't seem as bright as a more focused red, such as that from a quantum dot display (or from RGB laser and OLED displays, too, for that matter).

Keep in mind that the 9000 series TVs are curved and all the others are flat. I have a 65" JS9500 in the viewing room and a 55" JS8500 in my office/edit suite. The JS9500 series is back-lit and the JS8500 is edge-lit, but I can't really see any difference. The 65" JS9500 was $3,800 and the 55" JS8500 was $1,450.

Not that this would have anything to do with your decision, but the 8500-series-and-up Samsung SUHD TVs are 3D capable and come with some nice, lightweight active LCD shutter glasses. They can also synthesize 3D video from 2D sources, and do it astonishingly well. I'd really like to know what software they're using to pull it off, as I've never seen it done better.

As for what you'd lose with a 4K 10-bit PC monitor, it's basically going to be screen size, brightness and the quantum dot display technology. That said, I have a friend who has a couple of ASUS 28" PB287Q 4K monitors and they look really good ($500 at Amazon).

4K is a quickly evolving technology, but one thing is certain: it's definitely here to stay.

You might find this article interesting: http://www.hdtvtest.co.uk/news/ue65js9500-201502234012.htm
megabit wrote on 3/9/2016, 6:06 AM
Thanks John.

You are referring to the SUHD (2015 lineup) as Quantum Dot - my understanding was that SUHD used a "nano-crystal technology display", while KUHD (the 2016 line) will be Quantum Dot. So am I to understand there is no real reason to wait for the KUHD and pay the premium for the latest and greatest? I only mean the PQ differences; I've heard about the better connectivity and home "intelligence" of the KUHD series, but that isn't so important to me.

Oh, and one more thing - curved vs. flat: it seems it's completely the other way around here in Poland; the 8500, 9000 and 9500 SUHDs are all curved and only the 8000 is flat (but again, that one cannot be bought yet - it will probably only arrive as a KUHD). Are the curved displays any worse for my intended use? Thanks a lot.

Piotr

PS. Do you see any lag in your 8500 vs 9500, due to the 4-core CPU vs. 8-core one?

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

John_Cline wrote on 3/9/2016, 7:32 AM
Nano Crystal Color is just Samsung's name for Quantum Dot; they're the same thing. Quantum Dot technology is LCD's answer to colour-rich OLEDs.

Liquid Crystal Display panels, which have been around for decades, are backlit in two ways. Up until around 2005, all LCDs were backlit with CCFL (Cold Cathode Fluorescent Lamps); then “white” LEDs replaced fluorescent lamps. Essentially, all mainstream LCDs now use LEDs, or light-emitting diodes.

When LCD technology is driven by what's known as white LEDs, the colour spectrum of those displays is very broad; in order to make saturated reds, greens or blues on an LCD display, you need narrow bands of light.

So what TV makers have done is incorporate a quantum dot film into the displays, effectively replacing the white LEDs. The quantum dot film absorbs light from blue LEDs instead, and then re-emits it at specific wavelengths to make colours look more vibrant and less "blah."

This whole process is more power-efficient than the usual LCD backlighting process and less costly than OLED technology.

Regarding the curved display, I like it on the JS9500, and if you're going to be directly in front of it, it will be fine. I got a flat JS8500 for my office, but that's just because I couldn't justify spending another $3,800 on a second JS9500. As it is, I've got $5,300 invested in two 4K TVs, which just shows how much I believe in 4K. Also, the JS8500 is a bit slower to respond to commands than the JS9500, but not enough to bother me.

I've used a lot of monitors over the years, both consumer and professional, and I think the Samsung SUHD TVs produce the best looking images I've ever seen. Period. They definitely rival the new LG OLED televisions and they're a lot less expensive. If you've been using a 50" plasma, then you'd probably be really happy with a 55" SUHD. In fact, I got the 55" SUHD in my office to replace a 50" Panasonic plasma.
megabit wrote on 3/9/2016, 8:01 AM
Thanks John,

After what you said about Quantum Dot technology, I guess the 2016 Samsung line-up being renamed to it from 2015's Nano Crystal Color is just a marketing thing.

Piotr

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

megabit wrote on 3/9/2016, 10:22 AM
If you've been using a 50" plasma, then you'd probably be really happy with a 55" SUHD

John,

Sorry to bother you, but it looks like you're the right person to ask, as you yourself replaced a 50" flat plasma with a curved 55" SUHD. Please let me ask one more question:

- assuming I'm sitting some 1.5 m from the wall, will a (curved) 48" SUHD be sufficient? Or is 55" the minimum because of the curve? TIA

Piotr

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

farss wrote on 3/9/2016, 1:51 PM
[I]" So my question is simple really - what other pros&cons of TV vs monitor as a full screen display can you think of?"[/I]

I would have hoped the answer to that was self-evident and hasn't changed since the last century, but given the direction the conversation has taken, maybe it needs to be restated. :(

A "TV" is a piece of consumer electronics. It's designed to display pretty pictures in a showroom to entice consumers to buy it.

A "monitor" [I]should[/I] have been designed for monitoring of some signal by a human's eyeballs. If the signal represents something ugly then it displays ugly.


That's very trite and simplistic, but from that point on the conversation gets messy very quickly. A simple example:
I have an Asus ProArt monitor that I'm reasonably confident displays colour correctly. It does not display motion correctly... I think. Footage that looks like a juddery mess on it in Vegas looks just fine on my HDTV. Which is correct?

Bob.
John_Cline wrote on 3/9/2016, 5:03 PM
Piotr, actually, I replaced a 50" plasma with a 55" flat SUHD monitor. Regardless, with a 4K TV it's a matter of screen size versus distance and the ability of the human eye to resolve all the detail that a 4K display can produce. People who buy a 65" 4K TV and then sit 5 meters away simply aren't going to see any more detail in 4K programming than they would on a regular 1080 HD display.

In our case, we both sit about 1.5 meters away and I don't think that a 48" display would really be sufficient. In fact, to resolve all the detail in a 4k image, I should have gone with a 60" or bigger display. The curve on the display is pretty slight and probably really isn't a consideration.

Based on THX recommendations using the calculator at the link below, the "ideal" viewing distance from a 48" 4K display based on the visual acuity of someone with 20/20 vision is 37.5 inches or 3.1 feet (95.25 cm or .9525 meters). For a 55" TV, it's 42.9" or 3.6 feet (109 cm or 1.089 meters). Download the Excel spreadsheet and plug in your numbers.

Online distance calculator

Screen size vs Distance Spreadsheet
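
For reference, the rule behind those calculators is simple enough to check yourself: with 20/20 vision you resolve roughly one arcminute, so the "ideal" distance is where a single UHD pixel subtends one arcminute. A minimal sketch (the 0.8716 width factor for a 16:9 diagonal is my assumption, and the real calculators may round slightly differently):

[code]
import math

# Distance at which a 20/20 eye (1 arcminute resolution) just resolves
# single UHD pixels on a 16:9 screen of a given diagonal size.
def ideal_distance_inches(diagonal_in, h_pixels=3840):
    width_in = diagonal_in * 0.8716        # 16:9 width from the diagonal
    pixel_pitch_in = width_in / h_pixels   # size of one pixel, inches
    one_arcmin = math.radians(1 / 60)      # 20/20 visual acuity
    return pixel_pitch_in / math.tan(one_arcmin)

for size in (48, 55, 65):
    d = ideal_distance_inches(size)
    print(f'{size}" UHD: {d:.1f} in = {d * 2.54 / 100:.2f} m')
[/code]

That reproduces the ~37.5" (48" set) and ~42.9" (55" set) figures above, and gives about 1.29 m for a 65" set.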
Wolfgang S. wrote on 3/10/2016, 1:16 PM
Yes, you will see a (small) difference between 8-bit and 10-bit monitors - but the difference is minor, from what I see here with my Dell.

A simple check of whether you have an 8-bit or 10-bit path can be done with some simple test charts. And yes, believe it or not, it is possible to output 10-bit over HDMI, as you have it on the Decklink 4K Extreme. There is no need to go to SDI.
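
Such a test chart is easy to generate yourself - a minimal sketch, assuming numpy and imageio are installed (the brightness range is arbitrary; it is chosen shallow so that an 8-bit bottleneck produces obvious banding):

[code]
import numpy as np
import imageio.v2 as imageio   # pip install numpy imageio

# Shallow grey ramp: 256 ten-bit steps across the full width. Through a
# true 10-bit path it looks smooth; an 8-bit link collapses it to 64 steps
# (bands roughly 60 px wide at 3840 px). Saved as a 16-bit PNG.
W, H = 3840, 400
levels_10bit = np.linspace(300, 556, W)             # 10-bit code values
ramp = np.tile(levels_10bit, (H, 1))
ramp_16bit = np.round(ramp * 64).astype(np.uint16)  # scale 0..1023 -> 0..65472
imageio.imwrite("ramp_10bit_test.png", ramp_16bit)
[/code]

Display it full screen through your normal playback chain: a smooth ramp means the 10-bit path works; visible bands mean something in the chain is dropping to 8-bit.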

What I tried to explain: even with a high-end processor in your system, you will run into Vegas's fps limitations in a 4K/UHD 32-bit floating point project. But test it!

Desktop: PC AMD 3960X, 24x 3.8 GHz * GTX 3080 Ti * Blackmagic Extreme 4K 12G * QNAP Max8 10 Gb LAN * Blackmagic Pocket 6K/6K Pro, EVA1, FS7

Laptop: ProArt Studiobook 16 OLED (i9 12900H with integrated Iris Xe GPU, 32 GB RAM, GeForce RTX 3070 Ti 8GB) with internal HDR preview on the laptop monitor. Blackmagic Ultrastudio 4K mini

HDR monitor: ProArt Monitor PA32 UCG, Atomos Sumo

megabit wrote on 3/10/2016, 1:44 PM
No need for 32-bit real-time playback in Vegas. Grading I will do in Resolve; final edits that need full fps, in Vegas in 8-bit. Rendering out of Vegas in 32-bit - it should all work fine.

Piotr

PS. And who knows, one day Vegas may be replaced with Catalyst Edit, which plays back almost as smoothly as Resolve does - did you notice?

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

megabit wrote on 3/10/2016, 11:13 PM
Yes, you will see a (small) difference between 8-bit and 10-bit monitors - but the difference is minor, from what I see here with my Dell.

For grading 4K 10-bit 4:2:2 S-Log material from the FS7, I'd say it's not minor but essential - 8-bit would introduce colour banding that isn't really there, spoiling the whole process by triggering false grading decisions. Of course I realize the final rendered deliverable should be 8-bit anyway if I were going to deliver on DVD like I used to, for watching on TVs out there, 99% of which are 8-bit. But it so happens most of my final products will be just for myself, as I've had to give up my "professional" video activities - and those I will play back from my PC on my 10-bit SUHD TV, so I can render in 10-bit :) I remember how I hated the colour banding of my EX1 HD material on my 8-bit 50" HD plasma!

Piotr

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

Serena Steuart wrote on 3/10/2016, 11:41 PM
Piotr, I thought you used to record to an external recorder from the EX1; that SDI out is 10bit. Only 8 bit on internal recording.

Serena
megabit wrote on 3/11/2016, 12:54 AM
Dear Serena,

True - but my external recorder was the nanoFlash, only capable of 8-bit 4:2:2. So I never recorded 10-bit colour before.

Piotr

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

megabit wrote on 3/11/2016, 1:01 AM
A "monitor" should have been designed for monitoring of some signal by a human's eyeballs. If the signal represents something ugly then it displays ugly.

Bob, this is my opinion as well - hence the original question. But considering I'm preparing for rather hobbyist activities for the last part of my active life ;-), I guess I prefer "pretty pictures" on a modern UHD TV...

Piotr

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

John_Cline wrote on 3/11/2016, 1:29 AM
I have all of the image and motion "enhancements" turned off on my SUHD displays. They are truly monitors: ugly signals do indeed look ugly, but on the other hand, pristine signals look spectacular.