You seemed to solve my issue.
I was running an iPad 2 into my Sharp 46" LCD with a Digital AV Adapter and it wouldn't register an image.
Turned the Sharpness down to its most negative setting and - BINGO!
Weird or what.
I thought I had to downgrade the 1080i to 720p, but that wasn't possible.
I've posted this in another thread, but I thought I'd add it to this one to help those out with the same issue I had.
I reckon I've found a solution to this - or at least I managed to resolve the issue you're all describing at my end: a new 2011 Mac Mini driving HDMI to a 1080p TV with shocking picture quality (text with white halos, stark contrast, etc.). This compared (unfavourably) to a Core 2 Duo Mini which had perfect picture quality at 1920x1200.
So after trying
HDMI out -> TV = poor
Mini DisplayPort -> HDMI = poor
Mini DisplayPort -> VGA = great picture quality at a low res, but after clicking 'Detect Displays', the available resolutions changed and the picture took on a very purple hue...?!
The TV didn't have a DVI connector, so I couldn't try that path.
Just before throwing in the towel, I thought I'd try coupling the Mini DisplayPort/Thunderbolt -> DVI adaptor with the 3rd-party DVI -> HDMI adaptor we had connected to the previous Mac Mini.
To my great surprise and relief, this solution works! It makes sense too - the Mini now thinks it's connected to DVI and shows the full range of resolutions it showed with the previous Mini, 1920x1200 being one of them, rather than the TV-only resolutions (1080, 720 etc.) it offers over HDMI.
SO, MiniDisplayPort to DVI, then DVI to HDMI and hopefully, champagne and caviar!
I was using my brother's 1080p monitor here in my room, a Samsung, because it was the only one with a DVI input, and the image was perfect - everything really round (the Chrome symbol, for instance).
After a week I bought my Mini DVI to HDMI cable, and I put it on the Samsung monitor.
The result is the reason I'm here: a crap image, and I had to fix it with the TV's scan settings and such...
After my brother woke up I grabbed my 720p monitor, an LG, put my Mac Mini into it over HDMI, and the result? PERFECT!
Even the colour setting reads "LG TV".
I'm not telling you guys to buy an LG, but this proves that it's a software problem and not the cable or the TV.
I guess the only way to solve it with another TV is to use the "guys above" method...
Andrey Boarao from Brasil
Just bought my Mac Mini, and I want to cry because it doesn't work with my Philips P42PFL7409.
If I plug in Thunderbolt -> DVI, then DVI -> HDMI, will the sound come out of the HDMI port?
I noticed this problem has been around for some time already.
Has anyone found a permanent fix for it, so we can use the HDMI port in the normal way without having to use the coupling technique?
Do the newer HDTVs also have the same problem?
I'm also having this problem. Although I appreciate the fixes in this forum, they are not actual solutions. So far I've heard:
- Use an adapter that converts Apple's old mini display port to VGA in the Thunderbolt port - I've already done this, and yes the display looks awesome on my monitor that only supports a VGA input. However, I am trying to get a second display to work (which is an LCD TV)
- Use an adapter from mini display to HDMI through the Thunderbolt port - Again, same issue as above. I need to utilize the HDMI port on my Mac Mini.
- Do a bunch of ridiculous adjustments to your TV to make it kiiiiind of look almost okay. - Obviously not a true solution either.
I'm just really annoyed at Apple for not even testing this. Or if they did, they chose to overlook it. And now that people are complaining about it, what do they do? Sit quiet until more people complain. I read (can't remember if it was this thread or elsewhere) that someone sat with the techs at the Apple store, and their solution was #1 I listed above. That's just not acceptable. As someone else mentioned on this thread, "I only bought this crap to write iPhone apps." Well, now I'm stuck with one monitor to write my apps, while a perfectly good PC could allow me to use both monitors and help me greatly with my production.
My only other option at this point is to shell out $50+ for an external (USB) graphics card to plug in my TV or my other spare VGA monitor. I'd be happy to do that if Apple's willing to pay for it (since I'm not doing this for gaming or anything). But we all know they won't do that... Unless practically all of their users complain (remember the "Death Grip" fiasco with the AT&T iPhone 4?). Probably at least 50% of the users of these things won't ever notice the blurriness, or they won't ever plug their machines into a display via HDMI.
I guess I'll try to hit up Apple support, but I definitely don't want to devote my whole life to troubleshooting their issue (which you know will happen). I hope this is a driver issue that will just get resolved in an update to Lion.
I forgot to mention that I think I also read one solution would be to go from HDMI to DVI and try it from there. However, neither TV that I own has DVI (only VGA). I suppose I could try HDMI to Component?
Yet another option is to get something that will convert HDMI to analog VGA. However, that again is $50+.
I think I've cracked the problem.
My Mac was running in 64-bit mode and I tried changing it to 32-bit instead.
To do this, go to Utilities > Terminal, enter "sudo systemsetup -setkernelbootarchitecture i386", press Enter, type in your password, and press Enter again. Then restart. You should notice the improvement straight away.
If this doesn't work, you can revert to 64-bit mode by going back into Terminal, typing "sudo systemsetup -setkernelbootarchitecture x86_64" and following the same steps.
Hope this works.
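For anyone skimming, the commands from the post above can be summarised as follows, plus a quick way to check which kernel you actually booted. This is a sketch for Snow Leopard/Lion-era OS X; I'm assuming `systemsetup` still accepts this flag on your build, and it may reject it on later releases:

```shell
# Switch the kernel to 32-bit on the next boot
# (Snow Leopard / Lion era; may not be honoured on later OS X releases)
sudo systemsetup -setkernelbootarchitecture i386

# After restarting, verify which architecture the kernel booted with;
# this prints "i386" for 32-bit or "x86_64" for 64-bit
uname -m

# Revert to the 64-bit kernel
sudo systemsetup -setkernelbootarchitecture x86_64
```

Note that a 32-bit kernel limits how much RAM the system can use, which is why some posters below say this route isn't acceptable for them.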
I think it was mentioned in a previous post, but renaming the source in the menu settings of the TV fixed the crappy picture for me. I have a Mid 2011 Mac Mini and 3 Samsung LCD TVs. It looked like crap on my Syncmaster P2570HD and B2430HD. I went into the menu settings for the TV and changed HDMI to "PC" under "Input Settings" / "Source List" and boom, problem solved. It got rid of the fuzziness and the "underscan" option. I also did an expert colour calibration and tweaked the sharpness.
This might be a Samsung only thing but my guess is it would work on others.
Having the same issue going from my MBP 15" to a new Lenovo L2321w monitor via Mini DisplayPort to DisplayPort. The monitor is detected as a television and its resolution is set properly to 1080p. Overall the image is okay, somewhat fuzzy, but the real downer is text. Text is terribly fuzzy and just plain lousy to look at.
I would not have a problem using the displayport to DVI adapter which seems to fix it, except the monitor only has VGA and displayport connections. I've tried going from displayport to VGA and it is much better, but still suffers a bit from the analog connection being somewhat soft, especially at that high a resolution.
So I cannot use DVI, and there's no way to boot Lion in 32 bit mode as was suggested. How do I get this new monitor to look as good as it should?
I think you're on to something, Thanzig. I noticed that flipping around through the various names for the inputs (at least on a Samsung TV) changes how it shows on the display. The display actually refreshes for the ones that have the word "PC" in them. I still believe this is Apple's issue, but hey at least we've got something.
Here's something else interesting...
I got some new RAM today and upped my Mini to 16 GB (which is why booting into 32 bit mode would not be acceptable for me). However, after doing this, it seems that the display is much clearer on the HDMI output into my TV (regardless of the name of the input). I could just be going crazy at the moment, but I swear it is so much clearer. In fact, I'm sitting in my chair at my desk which is a good 8 feet or so from my TV, and I can read this clearly as I'm typing it. I don't recall being able to do that before.
So if you have some extra RAM lying around for some reason... Throw it in and see if it makes a difference (or just upgrade it). Before, I had 4 GB.
If I have time, I'll revert back to the 4 GB and see if there is actually a visible difference back-to-back, but I really really think it's better now.
I have had 4GB, 8GB and now 16GB in my Mini and never noticed a difference in the resolution. I have the higher-end Mini with a dedicated video card, so my system doesn't steal RAM, but I've never heard of RAM affecting resolution.
I'm downloading Xcode now. I hear there are some settings that enable HiDPI display modes via Quartz Debug. I've heard it gives you some extra smoothness on text. Will post pics and results later.
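If you'd rather not download all of Xcode just for Quartz Debug, the HiDPI modes it toggles were commonly reported at the time to be controlled by a single window server preference. I haven't verified this on my own setup, so treat the preference key below as an assumption based on those community reports:

```shell
# Reportedly exposes the HiDPI display modes without Quartz Debug
# (key name per community reports for Lion; treat as unverified)
sudo defaults write /Library/Preferences/com.apple.windowserver \
    DisplayResolutionEnabled -bool true
# Log out and back in, then look for the new modes under
# System Preferences > Displays.

# To undo, delete the key again:
sudo defaults delete /Library/Preferences/com.apple.windowserver \
    DisplayResolutionEnabled
```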
I just solved my issue - I have a 2011 Mac Mini w/ HDMI to a 1080p 42" TV.
I was having issues with clarity of icons, text, etc. Things were ok but not 1080p ok.
I just went through my TV settings and noticed my sharpness level was extremely high. I turned it down and could see the clarity improve.
Worth a shot, worked for me.
I have this same problem with my MacBook Pro. I tried the Mini DisplayPort to VGA adapter and it works fine with that. I agree it looks awful, discoloured and pixelated, through HDMI. I found a problem with the VGA route though: iTunes does not allow me to play my TV shows and movies through the monitor. This is very aggravating; all it does when I play a movie or TV show is come up with a black screen and refuse to play.
I actually have this problem with my Dell monitor. Running Windows, the display runs at 1920x1080x32@60Hz (the highest resolution), but on OS X it recognizes the display as a TV and it's a little less crisp.
OS X also won't play HD content (I hate HDCP), while Windows will.
The Intel HD 3000 graphics drivers are still immature on OS X, so I think future software updates will probably fix this problem.
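Since several posts here hinge on whether OS X has decided the display is a "television" (which is what limits it to TV resolutions), you can check what the system actually detected from Terminal. The `grep` filter below is just illustrative; the exact field names in the output can vary by OS X version:

```shell
# Dump what OS X knows about attached displays; a display the system
# treats as a TV typically shows a "Television: Yes" line here
system_profiler SPDisplaysDataType

# Or filter straight to the relevant lines:
system_profiler SPDisplaysDataType | grep -E "Resolution|Television"
```

If "Television: Yes" shows up for your monitor, that matches the symptoms described above, and the DVI-adapter chain or the "PC" input-name trick are the workarounds people in this thread have had success with.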
Yes, Apple apparently knows about this. If you watch something from iTunes in QuickTime it does almost the same thing, except it says "This movie cannot be played because it is on a display that is not authorized to play protected movies. Try moving this window to a different display or disconnecting any displays that are not HDCP authorized." This is both annoying and takes away a lot of the computer's capability. Apparently it is supposed to do that so you can't record it with some sort of hardware. Even so, I think this should definitely be changed.