Could someone more savvy explain why the same video (from the EX1, BTW) shows much less mosquito/low-light noise on my 50" plasma when fed via the component output of my ATI HD3870 card than when fed via the secondary DVI->HDMI output?
Is it due to some sort of filtering effect, resulting from the (potentially) lower bandwidth of the analogue signal path compared to the digital one?
Or is it because the component signal is interlaced, and during deinterlacing in the HDTV it loses some resolution along with the noise? (BTW, any resolution drop is not visible to my naked eye...)
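
To illustrate what I mean by the filtering hypothesis: mosquito noise is mostly high-frequency ringing around edges, so any bandwidth-limited analogue stage (DAC -> cable -> TV's ADC) should attenuate it. Here's a toy numpy sketch of that idea (my own simulation of a generic low-pass stage, nothing specific to the HD3870 or the plasma's actual signal path):

import numpy as np

# Hypothetical scanline: a hard edge with broadband "mosquito" noise on top.
rng = np.random.default_rng(0)
scanline = np.repeat([16.0, 235.0], 360)   # 720-sample edge, video levels
noisy = scanline + rng.normal(0, 8, scanline.size)

# Mild 3-tap low-pass, standing in for the analogue bandwidth limit.
kernel = np.array([0.25, 0.5, 0.25])
filtered = np.convolve(noisy, kernel, mode="same")

print("noise std before:", (noisy - scanline).std())
print("noise std after: ", (filtered - scanline).std())

The residual noise clearly drops after the low-pass, at the cost of slightly softening the edge -- which would match what I'm seeing, if that's really what's going on.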