172 Replies Latest reply: Feb 14, 2016 9:40 AM
  • Bumbed_Jason Level 1

    I can't understand how Apple can nail it in my MacBook Pro 13" and miss it on the Mini. My new Pro is laser sharp on any of the six HDTVs I have in the house: two Samsungs, a Panasonic plasma, an Orion, a Dynex, and a Sharp. Seeing this display on a 65" plasma is unreal.


    How in the **** can't they get the graphics/video package that is in this freakishly small laptop into the Mini?!

  • turtleneck and beret Level 1

    I agree with xboxremote. I own and have worked with several Macs, and this problem always involves a Mac with a Mini DisplayPort (i.e. both Mac Minis and MacBooks) connected to an HDMI display. (Regardless of MDP->HDMI cables or MDP->DVI->HDMI adapters.)


    The fuzziness people mention in this thread is due to OS X not outputting the correct resolution. When you use underscan (or overscan, for that matter) you are scaling the output and therefore lose the 1:1 pixel ratio and, thus, the sharpness. This has nothing to do with post-processing on the display, although you can compensate for the symptoms with it.
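    To see why scaling breaks the 1:1 mapping, here's some back-of-the-envelope arithmetic (just a sketch; the 5% underscan figure is an assumption, not taken from any particular setup):

```python
# Underscan shrinks the framebuffer before the TV scales it back up,
# so desktop pixels no longer map 1:1 onto display pixels.

def underscan(width, height, percent):
    """Return the effective output size after applying underscan."""
    factor = 1 - percent / 100
    return round(width * factor), round(height * factor)

native = (1920, 1080)
w, h = underscan(*native, 5)          # a typical ~5% underscan
print(w, h)                           # 1824 1026

# Each remaining pixel must be stretched back over the full panel,
# so text gets interpolated instead of drawn pixel-for-pixel.
scale = native[0] / w
print(round(scale, 3))                # 1.053 -- no longer exactly 1.0
```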


    What I don't understand is why this issue usually appears on previously working setups/configurations.


    Sadly, I don't have a solution that hasn't been mentioned before in this thread:
    1. Set the signal type to PC instead of AV in the display's on-screen menu.

    2. Use option (alt) while clicking "Scaled" when selecting the resolution in the display settings in OS X. Sometimes this doesn't show the desired (and sometimes previously working) resolution.

    3. Just disconnect and reconnect the cable and pray to Jobs that the OS will renegotiate the connection and set the correct resolution. This used to work quite often, years ago, on a MacBook5,1, but in the last year it has never worked.


    I have also seen two cases of flickering external displays. This seems specific to certain Macs (from 2008-2010) with MDP (connected to DP or VGA/D-SUB). Once the issue appears, it seems to get worse over time. For my MB5,1 the issue started with a Dell display at work. One complete reinstall later, the issue was still there. After a couple of months I connected it to a Samsung and a BenQ I had previously used for years without this issue (the Samsung had the scaling problem mentioned above) and it too had started to flicker. Reconnecting usually works, but not always.


    I find it strange that these issues have been around for some time and there is still no official word on them, other than option-clicking the scaled resolutions...


    Good luck, everyone!

  • J. Miguel Level 1

    This one worked for me, using an HDMI-to-HDMI connection from a base 2012 Mac Mini to a Samsung TV.


    On your Samsung TV:


    1. Go to menu.

    2. Select Input

    3. Edit name

    4. Go to HDMI/DVI

    5. Select PC or DVI PC

    6. Exit Menu

    7. Go to System Preferences

    8. Select Displays

    9. Select "Best for display", or

         you can play around with "Scaled" and select the resolution that you think will work best for you. I'm using 1360 x 768.


    I think there is a need for you to edit the name of the connection used on the HDMI port. I noticed that other resolution options appeared only when I changed the name to PC or DVI PC.

  • Level 1

    This is the same solution as previously posted. You're right, though, it does solve the problem (on Samsung TVs, anyway).

  • turtleneck and beret Level 1

    Why are you using 1360x768 on a 1920x1080 display?

    After all, the problem people in this thread are having is that the correct resolution for the external monitor is unavailable in OS X.

  • J. Miguel Level 1

    I'm using a 23" Samsung TV. It's the same resolution the computer uses if I opt for the "Best for display" option. I tried playing around with the resolution, and 1360 x 768 was the same as "Best for display". I was only trying to point out that information.


    I tried to edit my post, but I was already way past the time allowed for making a correction to a post.


    I think that by default, the Samsung TV software assumes that the HDMI connection will be used by a media player (Blu-ray, DVD, etc...) and not by a computer. The Samsung TV I'm using predates the use of HDMI as a communication port on computers. But the good thing here is that they also anticipated that HDMI might be made available on computers in the future, hence they included that as an option you could choose, albeit manually.

  • turtleneck and beret Level 1

    Oh, I see.

    Yeah, it makes sense that HDMI would be preferable for media. Who would want to use HDMI for a monitor, given the problems users in this thread obviously are having?


    I wonder why OS X would fail to detect the display's native resolution (even if the HDMI port were set to the "media" type). It's not as though Blu-ray players and HDMI amps output resolutions like 1360x768 to a 720p/1080p display.

    (If they did, the quality loss would be two-fold, since the media would be resized from its original format (let's say 1080p) to 1360x768, losing roughly 50% of the picture data, and the display would then scale the ~1MP image up to the actual ~2MP of display elements, blurring the picture.)
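    For anyone who wants to check the rough numbers above, the pixel arithmetic works out like this (a quick sanity check, nothing hardware-specific):

```python
# Compare pixel counts: 1360x768 carries about half the data of 1080p.

def megapixels(w, h):
    return w * h / 1e6

full_hd = megapixels(1920, 1080)      # ~2.07 MP
scaled  = megapixels(1360, 768)       # ~1.04 MP

loss = 1 - (1360 * 768) / (1920 * 1080)
print(f"{loss:.1%}")                  # 49.6% -- roughly half the picture data
```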


    If native resolution detection ("Best for display") were working as intended, why won't Apple just say so - and explain what user error causes the phenomenon described in this and many other threads on this very forum - rather than pretend that nobody is experiencing this issue?


    Oh wait, let's pretend I'm a "Genius™"!
    Solution: If you'd bought the incredible

    Apple LED Cinema Display

    this would never have happened!
    *adjusts beret, strokes hipster mustache*

  • J. Miguel Level 1

    I think OS X can't detect the native resolution of the TV because the TV's software defines what the TV reports about itself, and OS X cannot override that.


    It's two-way communication: OS X can only act on the information provided by the connected device. It's like a mouse. By default, OS X will assume it's a right-handed mouse unless you change the setting to make it a left-handed mouse, and only then will OS X adjust accordingly.


    I would love to get the 27-inch iMac, or even just the 21.

  • turtleneck and beret Level 1

    As described previously in this thread, it happens that OS X successfully detects the native resolution and then, after some time, fails to do so. Also, detecting the make and model usually implies detecting the supported (and thus native) resolutions.


    I know, DDC has been around since 1994.

    If that were the case, other OSes and devices would have the same problem, which they don't.
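    (For the curious: the native resolution travels over DDC inside the display's EDID block, and the preferred mode sits in the first 18-byte detailed timing descriptor at offset 54. Here's a minimal decoding sketch, assuming a well-formed EDID 1.3 block laid out per the VESA spec; the sample bytes are the standard 1080p60 timing, included purely for illustration.)

```python
def preferred_mode(edid: bytes):
    """Decode the preferred resolution from the first detailed
    timing descriptor of an EDID 1.3 block (18 bytes at offset 54)."""
    dtd = edid[54:72]
    # Active pixel counts: low 8 bits plus a high nibble packed elsewhere.
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
    return h_active, v_active

# Standard DTD for a 1920x1080@60 panel (illustrative sample data):
dtd = bytes([0x02, 0x3A,              # pixel clock: 148.50 MHz
             0x80, 0x18, 0x71,        # h active/blank: 1920 / 280
             0x38, 0x2D, 0x40,        # v active/blank: 1080 / 45
             0x58, 0x2C, 0x45, 0x00,  # sync offsets/widths
             0x00, 0x00, 0x00, 0x00, 0x00, 0x1E])
edid = bytes(54) + dtd                # pad the 54 header bytes with zeros
print(preferred_mode(edid))           # (1920, 1080)
```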


    Your mouse analogy would be more fitting if OS X did not allow you to change the setting. (If you could select a resolution regardless of what supported modes OS X has detected, this wouldn't be an issue, since you would simply choose the display's native resolution instead of having to rely on OS X to detect it. But as mentioned in prior posts: 'Use option (alt) while clicking "scaled" when selecting resolution in the display settings in OS X. Sometimes this doesn't show the desired (and sometimes previously working) resolution.')


    I'm starting to think that this could be a DPCP/HDCP-related issue, using mDP-to-HDMI cables and/or converters.

  • epik151 Level 1

    I've got this fuzzy text issue on an HP monitor. I'm fairly certain it worked earlier in the year.


    I've tried different HDMI cables and different Mini DisplayPort-to-HDMI adapters.


    Very frustrating.

  • H.M.J. Level 1

    I was having ridiculous video scaling problems with XBMC. I just started using this fantastic, if hair-pullingly (is that a word? :)) user-hostile, poorly documented piece of software. My 13" late 2010 MBP was detecting my 52" Sony Bravia TV, connected with a Mini DisplayPort-to-HDMI cable, no problem, and was offering 3 possible resolutions: 1280x960 60Hz (NTSC), 1080p, and 1080p. I chose the intuitively obvious 1080p setting. Wrong choice! I could not, no matter how hard I tried, 'calibrate' the video to fit full-screen. One of the corner 'L's was always off-screen, and I could find no good documentation on how to calibrate the video screen size.


    This is the only useful thread I found specific to the Mac. After reading a few pages of it, I decided to try just accepting the very odd 1280x960 size, which is the first choice anyway, and the only one that shows the 60Hz NTSC specification. After this leap of faith, and that's all I can call it because it is 100% counter-intuitive, everything fell into place. I could easily calibrate full-screen mode and it just filled the TV screen, except for a black border, which I still have to learn how to eliminate. I played a huge (18GB) 1080p MKV and got virtually BD quality at full 1080p resolution. Breathtaking. Apparently that crazy resolution is the native resolution of the TV, so just go with it and you will have no problems.


    So thanks for starting this thread. It served as an inspiration to me and helped solve a problem that I believe shouldn't exist. XBMC developers need to work on better documentation and an easier-to-use UI. Other than that, it is a very powerful media center option.

  • turtleneck and beret Level 1

    Your Bravia is 1080p, having 1920x1080 pixels (thus 1920x1080 picture elements that make up your screen if you look very closely). This means that 1920x1080 / 1080p is the native resolution.


    Using 1280x960 output for 1920x1080 content means that the player is downscaling the content (from 1920x1080 to 1280x960), with a ~41% loss in image data, before it outputs it.

    When received by the Bravia, the remaining ~59% has to be scaled back up to cover 100% of the Bravia's pixels (otherwise the image would only occupy 59% of the screen, but at least be pixel-perfect) - causing 41% of the image composition to be interpolated by the Bravia, which makes the image less sharp. (Not pixel-perfect.)
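    Those percentages check out with simple pixel arithmetic (just a sanity check on the counts):

```python
# 1280x960 holds ~59% of 1080p's pixels; the display has to
# interpolate the remaining ~41% when it scales back up.

native = 1920 * 1080                  # 2,073,600 pixels
output = 1280 * 960                   # 1,228,800 pixels

kept = output / native
print(f"kept {kept:.0%}, interpolated {1 - kept:.0%}")
# kept 59%, interpolated 41%
```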


    How to solve this? If it's not mentioned in the previous posts, I don't know. But I'd guess that selecting the input type on the Bravia could help. If not, does it have a DVI input? Even though using VGA would feel slightly medieval, it would at least provide your Bravia with its native resolution (that is, 1080p).


    Don't blame XBMC. Blame Apple for staying silent on (what is likely to be) DPCP/HDCP issues over mDP.


    Don't settle for less!

    1080p - because you're worth it

  • H.M.J. Level 1

    You are absolutely correct. Thanks for the feedback. Here's an update:



    I had display mirroring on because it was just easier to deal with; there were so many other issues that this was a minor one. So I turned mirroring off and selected 1080p for the TV output. BTW, I mentioned that there were two 1080p options. In mirroring mode (only this mode shows the refresh rate, for reasons beyond my comprehension--Apple silliness?), you can see that the first one is 24Hz and the second one is 60Hz. I tried both, in non-mirrored mode, but only 60Hz produces jitter-free (no frame dropping) motion. OK, so now on to calibration.



    Display Mode: Full Screen #1

    Display: 1920x1080

    Video calibration:



    Here's where it gets nasty. As I mentioned, I was not able to get the bloody ugly giant ("for dummies") XBMC cursor to the lower right corner so I could adjust the overscan. The upper left was just a matter of moving the cursor as far left and up as possible and then working the cursor (arrow) keys to get the 'L' in position. This strategy did not work for the lower right. I could not get the damned cursor down there. After trying multiple resolutions, both higher and lower than 1920x1080, I was able to do the entire adjustment with lower resolutions but not higher. Then, out of frustration/anger/boredom, I decided to hit the ENTER key. ~*{LIGHTNING STRIKE!!!}*~ OMG, this toggles all the needed adjustments! Upper Left, Lower Right, Rectangle Aspect Ratio, Subtitle Position. So now that I discovered this closely guarded secret (I told you the XBMC documentation sucked, didn't I?), I was able to achieve perfect 1080p resolution.



    Now it gets us into "uncertainty" territory. Although my 13" MBP screen only supports a maximum 1280x800 resolution, and apparently OS X chose 1280x960 for the TV output because it was the closest match, I am hoping that the video card is actually outputting full 1920x1080. But wait. If I turn mirroring back on, I see that the maximum output I can set is a whopping 1600x900! BUT!!! At the bottom of the resolution list it tells me "Usable resolution 1280x800." So now I'm not sure what is really being output to the TV when I switch mirroring off. It could be the actual 1920x1080, OR it could be the lousy 1280x800 upscaled to 1920x1080--UGGH! I looked up the specs for my model and found the following:



    Supported resolutions: 1280 by 800 (native), ... and lower



    Graphics and video support:



    Dual display and video mirroring: Simultaneously supports full native resolution on the built-in display and up to 2560 by 1600 pixels on an external display, both at millions of colors



    YES!!! So I am getting true 1920x1080 output. Fantastic. Thank you Apple. So that 1280x800 limitation only applies to mirroring. Whew! Never going back to mirroring...



    OK so I started playing that 18GB movie and it looks good. I won't be able to tell for sure until tonight after dark but I think it looks a lot better now than before.



    Problems solved.



    One new annoying little problem. I closed XBMC and reopened it and now I can't get it onto the TV screen no matter what I try. I tried dragging it to the TV screen and it disappears from the MBP screen but does not show up on the TV. I opened/closed the lid, tried sleep/unsleep, toggled mirroring, all to no avail. Someone here MUST know how to do this. Please let me know. I refuse to go back to mirroring.



    One other issue. I have XBMC set up in remote control server mode so I can access it from my iPhone. Every time I start it, I get an annoying pop-up asking me to allow/deny access to the internet. I can't click this from the iPhone remote app so I have to get up and do it manually. I tried a free keyboard app but it raced the CPU at 50+%. It is not Little Snitch that is doing this. It is the system. How do I allow permanent access to this app?

  • H.M.J. Level 1

    OK, I solved one of the new problems. I had set up the Full Screen #1 video calibration in XBMC, mostly because that was all that was available in mirroring mode, and that's what I became 'used to'. I realized the error of my ways and did the calibration for Full Screen #2. That solved it. I now get the full XBMC screen on the TV.

  • H.M.J. Level 1

    I watched the huge movie file at night and it was stunning. Full screen, no black bars, extraordinary sharpness, and fluid, blur-free motion. No frame drops. It was worth the struggle, though I feel it shouldn't have been that difficult.