I think this all boils down to Apple treating HDMI output as something other than a "monitor" connection (a monitor being something you sit in front of and need to read text on all day). Something about the way they generate the signal is making some TVs "manipulate" the image, which mangles things at the pixel level.
I have seen references to changing the TV's sharpness and other "image enhancement" settings, which makes perfect sense. If the TV isn't "massaging" the image, it should look exactly as it comes out of the computer, which is a 1920x1080 60Hz image. Shift that size by even one pixel and it will be distorted, since the TV is no longer displaying the pixels it was sent one-for-one.
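To make that concrete, here's a toy Python sketch. It has nothing to do with Apple's or any TV's actual scaler (real scalers blend neighboring pixels, which is where the fuzziness comes from); it just resamples a 1920-pixel-wide row of alternating black and white columns to a width that is off by one pixel and counts how many columns come out wrong:

    # Toy illustration only: nearest-neighbor resample of a 1-pixel checkerboard
    # row from 1920 columns to 1919, to show how a one-pixel size mismatch
    # scrambles pixel-aligned content like text.
    SRC_W, DST_W = 1920, 1919

    src = [x % 2 for x in range(SRC_W)]                          # alternating 0/1 columns
    dst = [src[round(x * SRC_W / DST_W)] for x in range(DST_W)]  # rescaled to the panel width

    expected = [x % 2 for x in range(DST_W)]                     # what a clean pattern would look like
    wrong = sum(1 for got, want in zip(dst, expected) if got != want)
    print(f"{wrong} of {DST_W} columns end up the wrong color")

Roughly half the columns land on the wrong color, which is exactly the kind of thing that turns crisp text into mush.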
After messing with my TV settings for hours: apparently some TVs have a "dot for dot" setting that removes ALL processing of the HDMI image, and that seems to work. I was able to get mine pretty good by turning down the sharpness and turning off some other smoothing options, but it's still not acceptable to me for writing code while sitting 3 feet from the screen.
I purchased an Apple "Mini DisplayPort to VGA Adapter" (Apple part number MB572Z/A) and used the VGA input on my TV, which works almost perfectly; I can make out every pixel on the screen.
I'm not sure whether an HDMI to VGA adapter would work as well; I have not tested one.
Part of it, I'm thinking, is the overscan adjustment in Display settings. If you fiddle with that you can see the size of the image changing, which means the TV has to "process" the pixels to display them at that size, and I think the image quality issues are a side effect of that scaling.
Also, if you look at the display information under "About This Mac" > "More Info" > "Graphics/Displays" while using HDMI, there is an entry called "Television:" and it's set to Yes. This means the Mac knows it's driving a TV and may be sending out the image in a way that confuses some sets, but the other implication is that Apple could actually fix it if they wanted to spend the time adding more adjustments to increase the chances of it working with a random TV model.
It's interesting that it now knows exactly what adapter I'm using and that there is firmware on it....
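For anyone who wants to check this without clicking through System Information, the same Graphics/Displays data is available from the system_profiler command line tool. Here's a quick Python sketch; the "Television" and "Resolution" field names are just what I see in my report and may differ between macOS versions:

    import subprocess

    # Dump the same Graphics/Displays section that "About This Mac" shows.
    report = subprocess.run(
        ["system_profiler", "SPDisplaysDataType"],
        capture_output=True, text=True, check=True,
    ).stdout

    # Print the lines of interest; over HDMI mine includes "Television: Yes".
    for line in report.splitlines():
        if "Television" in line or "Resolution" in line:
            print(line.strip())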
VGA is an analog standard, and I was wondering whether the Thunderbolt spec includes analog outputs. I'm guessing it doesn't, and that the adapter contains hardware that takes the DisplayPort output and converts it to analog... but I'm not sure and haven't taken the time to look at the specs.
Having said all that, I am having some odd problems. The pixels are clear over most of the screen width but a little fuzzy in some parts; I can adjust this on my TV to make them clear in most areas, but I seem to have a cloudy zone that I can't get rid of. I can deal with that, though.
I'm also having a random screen wobble that comes and goes: everything shifts a bit left and right in a cyclical pattern for about 15 seconds, then settles out. I suspect this is some kind of variance in the analog timing of the VGA signal, possibly the adapter itself. I'm going to work on that with Apple support and see if I can swap it out and get some improvement there.
Hope this helps somebody, and hopefully Apple will warn people that HDMI may not give them "monitor" level quality, though it would be fine for displaying movies and such on a TV in pretty much all cases. Maybe that is the design goal.