VGA vs. HDMI - Best Display to Edit On?

MadMaverick wrote on 3/22/2016, 1:07 AM


I have my computer hooked up to my monitor via VGA, and to my TV via HDMI, to serve as a secondary monitor for displaying my video.

As you can see in the video, there's a difference between displays. The HDMI display looks more washed out.

I guess the thing to do would be to go with two of the same kind of hook-up. I believe I can do this with two VGAs... but what do most people use for editing? Which would be the most efficient all around, and look best on most people's displays?

The footage in the video is raw and the color hasn't been adjusted... but I can see where it'd be very frustrating to spend so much time messing with the color, only to have it look like crap on a different display. I guess that kind of thing is out of our control though.

Comments

john_dennis wrote on 3/22/2016, 1:24 AM
Trust the VGA.
John_Cline wrote on 3/22/2016, 1:49 AM
First of all, colors and levels look quite different when you view an LCD monitor from off axis, either horizontally or vertically. You really need to be viewing them straight on, and from the video you posted, it does not appear that you are.

Secondly, all other things being equal, a monitor hooked up via HDMI should look better than VGA, since HDMI is digital and VGA is analog. Televisions, however, have various automatic "enhancement" settings which, when turned on, make them less than ideal for editing or any critical viewing.

Lastly, all bets are off unless you have calibrated your displays; at this point, you can't trust either of them. There are relatively inexpensive hardware devices which will allow you to calibrate your monitors, and there are some software methods that, while less effective, will get you pretty close. Windows itself has software monitor calibration. Calibrate your monitors, then watch a variety of high-quality commercial video programming so that you can develop a reference. Only then will you be able to judge what to do with your own material. Also, never attempt to compensate for other people's maladjusted TVs; that's their problem.
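If you want to try the built-in Windows route before buying hardware, here's a minimal sketch (Windows only; it assumes the stock dccw.exe calibration wizard and the default color-profile directory, both standard on current Windows) that launches the wizard and then lists the ICC/ICM profiles installed on the system:

```python
# Minimal sketch, Windows only: launch the built-in Display Color
# Calibration wizard (dccw.exe) and list installed ICC/ICM profiles.
# Paths are the standard Windows defaults; adjust if yours differ.
import os
import subprocess

# Step-by-step software calibration; this blocks until the wizard closes.
subprocess.run("dccw.exe", check=False)

# Installed color profiles live in the standard Windows color directory.
color_dir = r"C:\Windows\System32\spool\drivers\color"
for name in sorted(os.listdir(color_dir)):
    if name.lower().endswith((".icc", ".icm")):
        print(name)
```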

The Datacolor Spyder 5 series hardware calibrators are a popular option. I have the Spyder5 Elite, which has calibration settings for video use.

http://spyder.datacolor.com/display-calibration/

musicvid10 wrote on 3/22/2016, 10:25 AM
Calibrize is a free software solution, but its usefulness depends entirely on the acuity of the user.
That done, your monitors will still be mismatched.

rmack350 wrote on 3/22/2016, 1:38 PM
You've already gotten good advice but let me run a couple of thought experiments past you.

How do you test your VGA and HDMI outputs to see if there's a difference between them? You should run them both to the SAME display and then switch between them. My expectation, all other things being equal, is that there would be a pretty minimal difference between the two.

That leaves you to consider the differences between the two displays. Ideally, you want to calibrate both of them so that they'll be as accurate as they can get, but you'll probably find that they'll never match. Even two identical monitors are hard to match.

TVs are notoriously inaccurate, and they offer profiles like "Vivid", "Cinema", and "Normal" so that you can get several flavors of inaccuracy. You probably want to calibrate the Normal profile and then assume that it's showing you the same sort of happy medium of inaccuracy that any average person will see on their own, differently inaccurate TV. Yes, TVs usually don't look great, but they're what people have, so you want to do your best to make sure your picture looks "okay" on a TV even when it's poorly adjusted.

I've never calibrated my own TV so I can't provide advice, but the internet is full of it. I don't keep the TV near my computer, so I never compare the images.

Whatever you do, you want to have one display whose color you trust, and then grade for that. Professional graders put a LOT of money into this, but as a hobbyist or small-time pro you just need to do your best. And pick one display to judge color from.

Back when I was lighting professionally in a largely analog world, I found it easy to tell at a glance whether a field monitor was right or wrong. Is there detail in the blacks? Should there be? Are the bright areas blowing out? Should there be detail there? These are real basic considerations that you should be able to ballpark just by eye.
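For a rough numeric version of that eyeball test, here's a hedged sketch, assuming Python with numpy and an 8-bit grayscale frame loaded however you like; it reports how much of the frame sits at or beyond the studio-swing black (16) and white (235) levels:

```python
# Hedged sketch: count pixels crushed to studio black (<=16) or blown
# past studio white (>=235) -- a numeric stand-in for "is there detail
# in the blacks / are the highlights blowing out?"
import numpy as np

def clipping_report(frame: np.ndarray) -> dict:
    total = frame.size
    return {
        "crushed_blacks": np.count_nonzero(frame <= 16) / total,
        "blown_whites": np.count_nonzero(frame >= 235) / total,
    }

# Synthetic data standing in for a real 8-bit grayscale frame.
frame = np.clip(np.random.normal(128, 60, (1080, 1920)), 0, 255).astype(np.uint8)
print(clipping_report(frame))
```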

As for VGA, it's an analog component-type signal. It can degrade over long cable runs or faulty cables. The digital HDMI signal also degrades over distance, but 1s and 0s are still 1s and 0s: digital doesn't degrade so much as it just fails.
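To make that degrade-versus-fail distinction concrete, here's a toy illustration (a sketch only, not a model of real cabling): the same noise that shifts an analog level directly is absorbed by a digital bit until it crosses the receiver's decision threshold, at which point the bit flips outright:

```python
# Toy illustration of "analog degrades, digital fails".
import random

def noisy(value, sigma):
    # Add Gaussian "cable noise" to a signal level.
    return value + random.gauss(0, sigma)

# Analog: every bit of noise lands directly in the picture level.
analog_in = 0.70                      # normalized video level
analog_out = noisy(analog_in, 0.05)

# Digital: the receiver re-decides 0-or-1 at a threshold, so the bit
# survives unchanged until noise pushes it across -- then it fails outright.
bit_in = 1.0
bit_out = 1 if noisy(bit_in, 0.05) > 0.5 else 0

print(f"analog out: {analog_out:.3f}, digital bit out: {bit_out}")
```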

Rob

<edit> Actually, I only found it easy to tell at a glance when a field monitor was wrong. This was an objective truth. Judging a field monitor to be right was mostly subjective.</edit>
JJKizak wrote on 3/22/2016, 3:13 PM
And some TVs calibrate each HDMI input separately.
JJK
MadMaverick wrote on 3/23/2016, 12:02 AM
Thanks for all the info, guys. This is probably a no-brainer, but should I go with HDMI over VGA?
john_dennis wrote on 3/23/2016, 12:56 AM
In spite of my three-word answer before, I'll try to summarize the thoughtful advice that others shared.

1) The physical-layer transport mechanism (analog on DB15; analog or digital on DVI, HDMI, DisplayPort) matters less than:

a) the quality of the display panel(s)
b) the calibration of the display panel(s)
c) where they exist, overcoming any [I]picture enhancements[/I] in consumer TVs that give you an inaccurate depiction of the video you are editing.

@JJK: "[I]And some TVs calibrate each HDMI input separately[/I]."

Mine does, and it has four HDMI inputs.
diverG wrote on 3/24/2016, 1:02 PM
Outputting through a Blackmagic Intensity Pro 4K card fed to a standard TV via HDMI does a pretty good job. The TV also acts as a second screen, fed from the onboard Intel graphics (DVI input), when not editing.

Sys 1: Gig Z370-HD3, i7 8086K @ 5.0 GHz, 16 GB RAM, 250 GB SSD, 2x 2 TB HDD, GTX 4060 8 GB, BMIP4k video out (PS 750W); Vegas 18 & 19 plus Edius 8WG, DVResolve18 Studio. Win 10 Pro (22H2) Bld 19045.2311

Sys 2: Gig Z170-HD3, i7 6700K @ 3.8 GHz, 16 GB RAM, 250 GB SSD, 2x 2 TB HDD, GTX 1060 6 GB, BMIP4k video out (PS 650W); Vegas 18 plus Edius 8WG, DVResolve18 Studio. Win 10 Pro (22H2) Bld 19045.2311

Sys 3: Laptop 'Clevo', i7 6700K @ 3.0 GHz, 16 GB RAM, 250 GB SSD + 2 TB HDD, Nvidia 940M graphics. VP17 plus Edius 8WG, Resolve18. Win 10 Pro (20H2)


farss wrote on 3/24/2016, 3:08 PM
I would expect feeding the same digital values to a monitor or TV via HDMI and DVI to produce different outcomes. One is a video interface, the other a graphics interface, so, for example, the TV may expect studio-swing levels (16-235) on HDMI but full-swing levels (0-255) on DVI. It's also quite likely that the HDMI signal is going to be scaled, which means what you're seeing is not pixel-to-pixel accurate.
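Purely as an illustration (Python; the numbers are only illustrative): mapping full-swing 0-255 code values to studio-swing 16-235 compresses the range, so if one input applies this mapping and the other doesn't, identical pixel data will look washed out on one and normal on the other:

```python
# Illustrative only: linear full-range (0-255) to studio-range (16-235)
# level mapping. If one input applies this and the other doesn't, the
# same source pixels render at visibly different brightness/contrast.
def full_to_studio(v: int) -> int:
    return round(16 + v * (235 - 16) / 255)

print(full_to_studio(0), full_to_studio(128), full_to_studio(255))
# -> 16 126 235
```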

Bob.
chet-hedden wrote on 3/24/2018, 12:07 AM

I have been producing 3D films with SVP 13 for several years using a 23-inch NVIDIA 3D Vision-compatible gaming monitor. However, I now want to fine-tune the vertical and horizontal disparities in my 3D films on a 60-inch 3D TV, because I intend for the project to be seen on a 60" TV or smaller, and editing on a smaller monitor does not produce disparity sizes that are comfortable when viewed on a large monitor or TV.

To do this, the Vegas Pro manual states the following:

"If you want to preview your project on a 3D television or monitor, you can use the Preview Device tab in the Preferences dialog to configure a 3D display for previewing your project."

"If you're using an NVIDIA graphics card that supports 3D Vision technology and a 3D Vision monitor, choose the Windows Graphics Card setting from the Device drop-down list in the Preview Device tab and choose Left and Right from the Stereoscopic 3D mode drop-down list."

"If you're using an NVIDIA graphics card that supports 3D Vision technology and a 3D-capable HDTV, choose the Windows Graphics Card setting from the Device drop-down list in the Preview Device tab and use the Stereoscopic 3D mode drop-down list to choose the method your monitor uses to display stereoscopic 3D content — typically Side by side (half) or Line Alternate. Be sure to set the 3D mode in your television's setup menu and the Vegas Pro Preview Device tab."

The third paragraph above is the relevant one to my issue. My TV is connected via a Quadro card and DVI/HDMI cabling, all the settings are as above, and the VP Preview Monitor has been dragged to the TV, but there is no 3D when viewed through the TV's own (Sony) active shutter glasses.

It has been suggested that a passive system like an LG or Vizio might work better because there is no shutter-synchronization issue. Finding good 3D TVs is difficult because they are not currently being manufactured and are only available used online, and I don't especially want to buy another TV just to see if it will work.

Does anyone know if a passive system will work any better, or if there is a way to get the active system to work?


matthias-krutz wrote on 3/24/2018, 8:21 AM

Somewhat off topic: for fine-tuning I use the red/cyan anaglyph mode. I place a transparent grid, created with Photoshop, on a separate track and move it onto the far point with the stereoscopic adjustment. Any deviations can then be checked exactly and corrected in the preview window or on a separate monitor. Image diagonals over 100" in projection are comfortable to watch; all other visual alignments have repeatedly led to deviations, small and large, and thus to eye strain and headaches.

The good thing is that this result can be achieved even with a small monitor.

Here's the grid for your own test.
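If you don't have Photoshop, here's a minimal sketch (assuming Python with the Pillow library; the 1080p resolution, line spacing, and opacity are arbitrary choices) that renders a comparable transparent grid PNG for use on a separate track:

```python
# Minimal sketch: render a transparent PNG grid overlay (white lines on
# a fully transparent 1080p canvas) to drop on a separate Vegas track.
from PIL import Image, ImageDraw

W, H, STEP = 1920, 1080, 120                    # canvas size, grid spacing
img = Image.new("RGBA", (W, H), (0, 0, 0, 0))   # fully transparent background
draw = ImageDraw.Draw(img)
for x in range(0, W + 1, STEP):                 # vertical lines
    draw.line([(x, 0), (x, H)], fill=(255, 255, 255, 180), width=2)
for y in range(0, H + 1, STEP):                 # horizontal lines
    draw.line([(0, y), (W, y)], fill=(255, 255, 255, 180), width=2)
img.save("grid_1080p.png")
```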


Desktop: Ryzen R7 2700, RAM 32 GB, X470 Aorus Ultra Gaming, Radeon RX 5700 8GB, Win10 2004

Laptop: T420, W10, i5-2520M 4GB, SSD, HD Graphics 3000

VEGAS Pro 14-18, Movie Studio 12 Platinum, Vegasaur, HOS, HitfilmPro

max8 wrote on 3/24/2018, 6:23 PM
[...]and the VP Preview Monitor has been dragged to the TV, but there is no 3D when viewed through the TV's own (Sony) active shutter glasses.[...]
or if there is a way to get the active system to work?


Hi,

That sounds like you are really using the normal preview window that is (de)activated via Alt+4, and dragging it onto a second screen of your extended desktop (or did you mean "switched"?). If so, this window belongs to the Vegas GUI and is not the external preview.

I've never edited a 3D project, but usually the Preview Device tab is used to choose and control an external video device, which may be connected through a dedicated HDMI/SDI video card or (as in this case) a second output of a normal graphics card. To enable this output, use Shift+Alt+4 or the "dual-monitor" icon near the top left corner of the (GUI) preview window.

chet-hedden wrote on 3/24/2018, 11:58 PM

Thank you, Matthias, for the suggestion and the grid. It's an interesting, straightforward method for a small screen.

Max -- I'm curious about your suggestion that I may not be interacting with the external monitor when I drag the preview window over to it and then click the "dual-monitor" icon in the upper left corner of the window. Normally that is when the display switches to stereoscopic. How is it possible to open the external display any other way?

max8 wrote on 3/25/2018, 7:35 AM

As I've said, I don't know the specifics of 3D editing, but usually you only need to click on the "dual-monitor" icon to activate the external preview function, and Vegas displays the ("external") video preview on the selected monitor on top of any other content (Explorer, Firefox, etc.). The GUI video preview may stay wherever you want it to be; it doesn't know anything about being dragged to the "external" preview monitor, because you specify that monitor in the preferences, not by dragging a window to it.

But if you have already activated the external video preview and got a full-screen display on your second monitor that is not stereoscopic, it may be a specific 3D problem.