Secondary monitor: HDTV component vs HDMI

megabit wrote on 8/3/2008, 8:08 AM
Could someone more savvy explain why the same video (from the EX1, BTW) shows much less mosquito/low-light noise on my 50" plasma when fed via the component output of my ATI HD3870 card than when fed via the secondary DVI->HDMI output?

Is it due to some sort of filtering effect, resulting from the (potentially) lower quality of analogue vs digital signal?

Or is it because the component signal is interlaced, and - during deinterlacing in the HDTV - it loses some resolution along with the noise? (BTW, the resolution drop is not visible to my naked eye...)

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

Comments

JohnnyRoy wrote on 8/3/2008, 9:30 AM
My guess would be that the Vegas secondary DVI -> HDMI output is showing you raw, unprocessed video (which is what you want to see), and the ATI HD3870's component output is using ATI's Unified Video Decoder and Avivo™ HD video and display technology to decode, deinterlace and post-process your video for the "ultimate PC entertainment experience".

It is the video equivalent of listening to audio with a SoundBlaster card. You should not trust anything that comes out of it unless you are watching a movie for your own pleasure.

~jr
megabit wrote on 8/3/2008, 11:31 AM
Johnny, thanks for the reply - I'm aware of that, but I have switched off all kinds of picture "enhancements" in the Catalyst Control Center (BTW, they are exactly the same for the DVI->HDMI and Component/HDTV outputs) - yet the difference is striking.


megabit wrote on 8/4/2008, 3:29 AM
Guys - anyone? C'mon!
With my older ATI card (non-HD, i.e. not HDCP-compliant), the secondary DVI port was the only viable option for monitoring through the HDTV's HDMI input (the component output worked, but the quality was below any standard). Now it really is good, in fact very good - you may wonder why I'm complaining at all... Well, with not even a trace of mosquito noise or chroma noise in dark areas, I'm wondering whether I really need the Flash/Nano XDR 4:2:2 option - that's how good the component HDTV output is!


megabit wrote on 8/4/2008, 3:52 AM
Oh, and one more thing: if the ATI Catalyst Control Center has exactly the same video-enhancement settings for a secondary display connected via DVI or component, why should I assume that the worse-quality video I'm getting with DVI->HDMI is the "true" quality, and not the other way around?

I'm asking because - for timeline monitoring in Vegas - it's of course very important that I am watching the true quality, without any hocus-pocus done to the signal before it reaches my plasma...


farss wrote on 8/4/2008, 3:57 AM
The component output is analogue, fed from a D->A converter followed by low-pass filtering. Differences in the quality of those stages can have an impact on image quality. The monitor reverses this process (A->D), and that too can have an impact on how the final image looks.
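To illustrate the low-pass filtering point, here is a minimal numpy sketch (my own toy illustration, not anything from the card's driver or the display) showing how a simple low-pass filter knocks down high-frequency noise such as mosquito noise, at the cost of fine detail:

```python
# Illustration only: a crude moving-average low-pass filter applied to one
# scanline of synthetic video, showing how an analogue path can "hide"
# high-frequency noise such as mosquito noise. Not real D->A behaviour.
import numpy as np

rng = np.random.default_rng(0)

line = np.full(1920, 128.0)              # flat mid-grey scanline
line += rng.normal(0, 8, size=1920)      # add high-frequency "mosquito" noise

kernel = np.ones(5) / 5.0                # simple 5-tap moving average (low-pass)
filtered = np.convolve(line, kernel, mode="same")

print("noise std before filtering: %.2f" % line.std())
print("noise std after  filtering: %.2f" % filtered.std())
# The filtered std is noticeably lower: the noise is attenuated, at the cost
# of horizontal resolution (fine pixel-to-pixel detail is smeared too).
```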

In theory the DVI output from the card goes straight into the monitor and you should get a pixel-for-pixel view of your image. But you might not, either.

Neither output might be showing you the true nature of your image. One way to check might be to see if you can get a full resolution image through either of them. Alternating black and white pixels should be displayed as exactly that. Any scaling, filtering or interpolation is likely to turn such an image into a mess.

Bob.
megabit wrote on 8/4/2008, 4:15 AM
Bob,

The low-pass filtering inherent in the component signal path seems to me the only reasonable explanation of the phenomenon. I did test both outputs with the B&W 1920x1080 pattern file (Odd-Even Stress.jpg) that you sent me last year for testing the true nature of the V1E's infamous "line twitter"; it looks OK on both the digital and the analogue outputs!

This is a real milestone in ATI's component output quality, hence my confusion. Of course, this is not to say I don't need the advantages of the XDR's 50/100 Mbps, 4:2:2 format - but the device certainly doesn't seem to be my first-priority investment now (with a good matte box, proper filters, and an HD-SDI field monitor still waiting for some free resources in my budget)...

I remember that when I first saw an EX1 clip blown up on a 50" screen through HDMI, I was a bit disappointed that some "busy pixels" were still there (particularly around contrasty edges); this was the reason I endorsed the Flash/Nano box announcement without hesitation.

If I could somehow verify that the component output is not deceiving me and shows the true quality of the EX1's image, I'd certainly be buying other add-ons before the Convergent Design recorder.

Unfortunately, even if I had access to a true broadcast-quality monitor, I'm still not certain how to definitively establish whether the DVI is spoiling my video or the component is tarting it up!


4eyes wrote on 8/4/2008, 2:19 PM
Just some observations using my ATI HD cards.

It gets confusing working out which settings to use for the ATI HD cards' de-interlacer(s), not to mention the HDTV's internal settings.
The ATI cards have hardware de-interlacing, and it also depends on the software being used - how it drives Avivo along with the de-interlacer, and whether de-interlacing is done in software or hardware.

The first thing I would check is the DVI -> HDMI cable itself.
I've had a few bad ones. I don't remember reading the length of your DVI -> HDMI cable.

farss wrote on 8/4/2008, 2:38 PM
That test pattern might not tell you the whole story, as it's only alternating lines. It would take quite a bit of filtering to low-pass whole lines away - that's why I said you'd need alternating pixels. But I can't find such a test pattern, nor figure out how to build one easily, and I'm certainly not going to do it by hand in PS; life is too short.
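For what it's worth, such a pattern can be generated in a few lines; here's a minimal sketch using numpy and Pillow (the file names are just examples):

```python
# Sketch: generate a 1920x1080 test image of alternating black and white
# pixels (single-pixel checkerboard), saved losslessly as PNG.
# Requires numpy and Pillow; file names are just examples.
import numpy as np
from PIL import Image

W, H = 1920, 1080

yy, xx = np.mgrid[0:H, 0:W]
checker = ((xx + yy) % 2) * 255          # 0/255 alternating every pixel
Image.fromarray(checker.astype(np.uint8), mode="L").save("pixel_checker_1080.png")

# Variant: alternating vertical columns only (tests horizontal resolution,
# which is what analogue low-pass filtering would soften first).
columns = (xx % 2) * 255
Image.fromarray(columns.astype(np.uint8), mode="L").save("pixel_columns_1080.png")
```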

Regarding the XDR, I think all your delivery options are 4:2:0 anyway. Outside of heavy post work I'm not certain exactly what benefit recording 4:2:2 to deliver 4:2:0 brings to the table; recording 10-bit 4:2:0 might be of more benefit than anything, but again it depends.
So far I've only tried CC on 10-bit 4:2:2 in SD, and it does seem to work better than from 8-bit 4:2:0, but again it's hard to avoid any nasties that your delivery format might create.
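As a back-of-envelope illustration of what 4:2:2 adds over 4:2:0 in pure chroma sample count (nothing about real-world codec behaviour), here's a small sketch:

```python
# Back-of-envelope comparison of chroma sample counts per 1920x1080 frame
# for 4:2:2 vs 4:2:0 (luma is identical in both). Illustration only.
W, H = 1920, 1080

luma = W * H                          # Y samples, same for both
chroma_422 = 2 * (W // 2) * H         # Cb+Cr: half horizontal resolution
chroma_420 = 2 * (W // 2) * (H // 2)  # Cb+Cr: half horizontal AND vertical

print("4:2:2 chroma samples per frame:", chroma_422)   # 2,073,600
print("4:2:0 chroma samples per frame:", chroma_420)   # 1,036,800
print("extra chroma data of 4:2:2 over 4:2:0: %.0f%%"
      % (100.0 * (chroma_422 - chroma_420) / chroma_420))  # 100%
```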

Aside from all the technical stuff though, no one watches HD with their nose on the screen :)

Bob.

megabit wrote on 8/4/2008, 11:23 PM
"It gets confusing working out which settings to use for the ATI HD cards' de-interlacer(s), not to mention the HDTV's internal settings."

OK - let me explain:
Using either interface (DVI or component), I always turn off the card's automatic deinterlacing option and move the slider to the left end, where it says "weave". This way, I'm assuming it doesn't deinterlace at all. Thus, when fed interlaced (the component input always says "1080i"), the only deinterlacing could take place in the Viera plasma itself. Now, all my video is 25p - so if the plasma is intelligent enough, it won't deinterlace it either. And that seems to be the case, as I have never seen the line twitter that results from e.g. bobbing progressive video (I can still simulate it in VLC).
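To illustrate the difference between the two modes mentioned above, here's a toy numpy sketch (my own illustration, not the ATI driver's actual implementation) of why weave is lossless for 25p material carried in 1080i while bob halves the vertical detail:

```python
# Sketch of "weave" vs "bob" deinterlacing on a 25p frame carried in an
# interlaced stream (as with 25p material output as 1080i).
# Illustration only, not the ATI driver's actual implementation.
import numpy as np

H, W = 1080, 1920
frame = np.random.default_rng(1).integers(0, 256, size=(H, W))  # original 25p frame

top_field    = frame[0::2]   # even lines
bottom_field = frame[1::2]   # odd lines

# Weave: interleave the two fields back together -> bit-exact original frame.
weave = np.empty_like(frame)
weave[0::2] = top_field
weave[1::2] = bottom_field
print("weave identical to source:", np.array_equal(weave, frame))  # True

# Bob: line-double one field -> only half the vertical detail survives, and
# alternating between the two bobbed fields is what causes line twitter.
bob = np.repeat(top_field, 2, axis=0)
print("bob identical to source:", np.array_equal(bob, frame))      # False
```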

On the other hand, DVI can be sent progressive (the plasma says "1080p") or interlaced (depending on whether I set it to 25 or 50 Hz in the CCC). Either way, I'm getting a noisier image than with the component output, so it's definitely not a result of deinterlacing (and the associated resolution loss)!

Also, my DVI -> HDMI cable is 1.5 m and (judging from its price) not of the lowest quality...


DJPadre wrote on 8/4/2008, 11:29 PM
One thing about HDMI is that it will allow you to manage ultrawhites.
Depending on the delivery display device - if you KNOW what kind of device the job will be displayed on - you can manage these levels accordingly.
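As a small illustration of the ultrawhites/levels issue (a generic sketch of the maths, not any particular driver or display setting), consider how 8-bit studio-range code values map to full PC range:

```python
# Sketch: how studio-range (16-235) 8-bit levels map to full PC range (0-255),
# and what happens to "ultrawhites" (236-255) under each choice.
# Illustration of the levels issue only, not any particular driver setting.
import numpy as np

codes = np.array([16, 100, 235, 240, 255], dtype=float)  # sample Y' code values

# Option A: expand studio range to full range -> ultrawhites are clipped.
expanded = np.clip((codes - 16.0) * 255.0 / (235.0 - 16.0), 0, 255)

# Option B: pass the levels through untouched -> ultrawhites survive, but
# black (16) and white (235) no longer hit 0 and 255 on a PC-levels display.
passthrough = codes

for c, a, b in zip(codes, expanded, passthrough):
    print(f"code {c:5.0f} -> expanded {a:6.1f} | passthrough {b:5.0f}")
```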

As for how to get an accurate image, well... all I can say is flatten everything as much as you possibly can. Or, if possible, use custom settings which you turn on/off depending on the program in use. I know NVIDIA has this option (i.e. game profiles, which change the GFX card's performance/colour settings).
Maybe ATI does as well...
megabit wrote on 8/5/2008, 12:33 AM
I just tested the direct EX1 connection to the same component input of my plasma, and - if this is any good indication - my conclusion is that the DVI output from my ATI cards (both the current and the previous one) is indeed somewhat destructive to the picture quality, introducing more chroma and mosquito noise than is present with the component connection (both from the HDD via ATI and directly from the camera).

This is an enormous surprise to me!


farss wrote on 8/5/2008, 1:22 AM
I don't see how you can reach ANY conclusion from what you are doing. You started with two unknown devices and have now added a third. The component input to the monitor could be masking defects that only the DVI inputs reveal, or the DVI inputs could be processed through sub-standard circuits in the display, creating image defects.
Based on what I've seen with monitors, that'd be the first thing I'd suspect, but it's only a gut feeling and I wouldn't put money on it. The composite video input on our old Bravias looks like crud, and yet the same signal going into our good monitors looks pristine, as does the signal from a TPG. Feed the TPG into the Bravia and it too looks like crud. Conclusion: the Bravia is rubbishing composite video.
On the other hand, feed the Bravia 720p via its HDMI input and it looks pretty good. 1080i looks worse. Yup, that Bravia is a bit suspect.

The only real way to reach any conclusion with a puzzle like this is with purpose-built test equipment.

Bob.
DJPadre wrote on 8/5/2008, 1:54 AM
I agree with Bob.

However, the only way a test like this can be of any benefit or value - or even serve as a comparison - is if your consumer monitor/TV has an option to turn off ALL image processing.
Some units call this "game mode", as this function turns off image processing within the panel (gaming consoles do their image processing internally).

Why do I mention this? Because turning OFF any third-party image processing allows one to see the source as it really is.

This applies to GFX card image processing through to display devices such as TVs and the like.
One reason I bought one of these panels offering this "feature" is that it allows a true representation of the preview:
what you see comes straight from the source itself, with no other influence whatsoever.
megabit wrote on 8/5/2008, 1:59 AM
I did add the caveat "if this is any good indication" :) Nevertheless, I tried the same thing on two completely different monitors:
- the 50" Panasonic plasma HDTV, and
- the 24" 1920x1200 Fujitsu-Siemens monitor with DVI and component inputs

- and the results are identical: component is definitely cleaner.

But you're right that it IS inconclusive as to the REASONS behind these results; I guess my next test should be feeding a single monitor that has component, DVI/HDMI, AND HD-SDI inputs: if the HD-SDI is as clean as component, we'd have some more indication. Unfortunately, I don't have access to such a monitor at the moment.


owlsroost wrote on 8/5/2008, 2:17 AM
I suspect that the display is applying clever noise reduction to the analog inputs (which, not unreasonably, it assumes are likely to be noisier than the digital inputs).

Some modern display processors have noise reduction specifically designed to suppress digital compression artifacts (e.g. mosquito and block noise). Since a lot of this sort of noise reduction is based on temporal filtering techniques, the 'static' spatial resolution isn't affected (so test patterns still look OK).
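As a toy illustration of why temporal noise reduction leaves a static test pattern looking perfect while still cleaning up frame-to-frame noise (a generic sketch, not the display's actual algorithm):

```python
# Toy temporal noise reduction: average N consecutive frames.
# Static content is unchanged (so test charts still look sharp), while
# uncorrelated frame-to-frame noise (e.g. mosquito noise) is attenuated.
# Illustration only, not the display's actual NR algorithm.
import numpy as np

rng = np.random.default_rng(2)
static = rng.integers(0, 256, size=(1080, 1920)).astype(float)  # "test chart"

N = 4
noisy_frames = [static + rng.normal(0, 10, size=static.shape) for _ in range(N)]
denoised = np.mean(noisy_frames, axis=0)

print("noise std of one frame   :", np.std(noisy_frames[0] - static))  # ~10
print("noise std after averaging:", np.std(denoised - static))         # ~10/sqrt(4) = ~5
# The static chart itself is reproduced with full spatial resolution.
```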

Also, the video card/drivers know a lot more about the characteristics of the display when using HDMI (e.g. supported colour spaces, chromaticity, audio support etc) so this *might* make a difference to the video card processing.

Tony
megabit wrote on 8/5/2008, 2:28 AM
"I suspect that the display is applying clever noise reduction to the analog inputs (which, not unreasonably, it assumes are likely to be noisier than the digital inputs)."

While I'd suspect the same of an HDTV, I don't think it applies to a computer monitor like the 24" FS mentioned above.

And even with the 50" Panasonic plasma HDTV, the colour and MPEG-compression-related artifacts (including mosquito noise) and so forth can all be managed in the "Picture" menu. I have all of this stuff turned off - for both the HDMI and component inputs. So, unless it's doing some trick regardless of these settings, this does not explain the difference.
