
OSX Mavericks does not detect external monitor

19427 Views 10 Replies Latest reply: Feb 21, 2014 2:05 PM by williema14
profputr Level 1 (0 points)
Oct 31, 2013 12:16 PM

After upgrading my MacBook Pro (15" Mid-2010, NVIDIA GeForce GT 330M 512 MB) to OS X 10.9, my Mac no longer detects my Cinema HD Display. Any help much appreciated.

MacBook Pro, OS X Mavericks (10.9), Apple Cinema HD Display
  • sterling r Community Specialists (3,220 points)
    Nov 2, 2013 1:44 PM (in response to profputr)

    Howdy ProfPutr,

     

    It sounds like your monitor was working fine before you upgraded to OS X 10.9 but now is not recognized. I recommend the troubleshooting steps from the following article:

    Apple computers: Troubleshooting issues with video on internal or external displays

    http://support.apple.com/kb/ht1573#5

    No video or no signal, image distortion, "snow," or flickering

    Check connections

    When using an external display be sure to check the following:

    1. If you're using an Apple notebook, confirm the AC power cable or adapter is securely connected to the computer and the cable providing power to the display is also secure. It is always good to have your notebook connected to AC power when an external display is in use.
    2. Confirm display adapters are fully seated in their respective connections and that they are supported models for the computer and display. Refer to Apple's adapter compatibility articles for further configuration information.
    3. Remove all display cable extenders, KVM switches, or other like devices and retest to determine if the issue is resolved.
    4. Try unplugging the video adapter or cable and then plug it back in.
    5. If more than one video adapter is in use (or "daisy-chained"), troubleshoot by using only one adapter.
      • Example: A mini DisplayPort to DVI adapter connected to a DVI to HDMI adapter is an unsupported configuration because there is a series of adapters in use.
    6. If available, try using a different display and/or adapter (or use a different connector, for example DVI instead of VGA).
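
    Not part of Apple's article, but as a quick extra check you can ask CoreGraphics which displays the system currently sees. Below is a minimal sketch, assuming the Swift command-line tools are installed; the file name and output wording are just illustrative.

    // check_displays.swift - illustrative sketch, not part of the article above.
    // Lists the displays CoreGraphics reports as online, so you can tell whether
    // OS X sees the external monitor at all before changing any settings.
    // Run with: swift check_displays.swift
    import CoreGraphics
    import Foundation

    let maxDisplays: UInt32 = 16
    var onlineIDs = [CGDirectDisplayID](repeating: 0, count: Int(maxDisplays))
    var onlineCount: UInt32 = 0

    let err = CGGetOnlineDisplayList(maxDisplays, &onlineIDs, &onlineCount)
    guard err == .success else {
        print("CGGetOnlineDisplayList failed with error \(err.rawValue)")
        exit(1)
    }

    print("CoreGraphics reports \(onlineCount) online display(s):")
    for id in onlineIDs.prefix(Int(onlineCount)) {
        let bounds = CGDisplayBounds(id)
        let kind = CGDisplayIsBuiltin(id) != 0 ? "built-in" : "external"
        print("  display \(id): \(Int(bounds.width)) x \(Int(bounds.height)) (\(kind))")
    }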

    Reset the system
    You can reset the Mac's parameter RAM and SMC.

    Reset the resolution
    Start by resetting the Mac's parameter RAM. If the display does not come up, was previously set to an unsupported resolution, and still results in no video:

    1. Start up in Safe Mode.
    2. From the Apple menu, choose System Preferences.
    3. Choose Displays from the View menu to open the preferences pane.
    4. Select any resolution and refresh rate that your display supports.
    5. Restart your computer.
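
    Not part of Apple's steps, but if you want to see which resolutions and refresh rates the display actually advertises (handy when picking a supported mode in step 4), here is a short sketch under the same Swift command-line assumptions as the earlier example.

    // list_modes.swift - illustrative sketch only.
    // Prints the display modes each active display advertises, so you can pick
    // a resolution and refresh rate the display actually supports.
    import CoreGraphics
    import Foundation

    var activeCount: UInt32 = 0
    var activeIDs = [CGDirectDisplayID](repeating: 0, count: 16)
    guard CGGetActiveDisplayList(16, &activeIDs, &activeCount) == .success else {
        print("CGGetActiveDisplayList failed")
        exit(1)
    }

    for id in activeIDs.prefix(Int(activeCount)) {
        print("Display \(id):")
        // nil options: list the standard modes for this display
        guard let modes = CGDisplayCopyAllDisplayModes(id, nil) as? [CGDisplayMode] else {
            continue
        }
        for mode in modes {
            print("  \(mode.width) x \(mode.height) @ \(mode.refreshRate) Hz")
        }
    }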

     

    Thank you for using Apple Support Communities.

    All the best,

    Sterling

  • ddoukas Level 1 (0 points)
    Nov 15, 2013 10:48 AM (in response to sterling r)

    I have done all the above on my 2011 iMac with a DVI-to-Thunderbolt/Mini DisplayPort adapter, but still have no picture - just the occasional flicker of a faint image. I also unchecked "Displays have separate Spaces" in Mission Control.

    I really need that display for educational purposes...any other possible fixes?

  • KillDash9 Level 1 (0 points)
    Nov 26, 2013 8:16 AM (in response to profputr)

    Did you try "Detect Displays" (hidden in the Display preference pane)? Go to System Preferences > Displays, then hold down the 'Option' key and note the 'Detect Displays' button that appears in the bottom right. Click it. Might not work - but worth a try.

     

    http://www.macobserver.com/tmo/article/os-x-10.8-reclaiming-detect-displays

  • henrybasset011 Level 1 (0 points)
    Dec 12, 2013 12:02 PM (in response to profputr)

    I'm having the same problems. This is really problematic for us, to the point where our CIO is considering throwing the Macs out for Linux. Frustrating.

     

    Did anyone find a resolution to this problem? We're on a 27" Cinema with a Mac Mini Server.

     

    Help!

  • BlackRockWeb Level 1 (0 points)
    Dec 17, 2013 9:02 AM (in response to profputr)

    I get this problem on my late 2009 MacBook Pro with an external Dell monitor connected with a DVI adapter...

     

    Sometimes, to get it to work when it won't see the 2nd monitor:

     

    1. I unplug the DVI cable, then shut down the laptop.
    2. I start it up, keeping the 2nd monitor unplugged.
    3. Once I am logged in, I plug in the 2nd monitor's DVI adapter, and after a few plug and unplug attempts the second monitor usually gets recognized and starts working again.

     

    This supremely frustrating and time-consuming workaround has been the only way I have gotten my 2nd monitor to work since the latest Mavericks update...
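
    If you want to confirm that OS X is at least registering the plug and unplug events while you do this, a small watcher sketch like the one below logs display add/remove events via the CoreGraphics reconfiguration callback. It assumes the Swift command-line tools; the script itself is only an illustration, not something from this thread.

    // display_watcher.swift - illustrative sketch.
    // Logs when CoreGraphics notices a display being added or removed,
    // which is handy while doing the plug/unplug workaround above.
    // Run with: swift display_watcher.swift   (stop with Ctrl-C)
    import CoreGraphics
    import Foundation

    // A capture-free closure can be passed where the C callback is expected.
    let callback: CGDisplayReconfigurationCallBack = { displayID, flags, _ in
        if flags.contains(.addFlag) {
            print("Display \(displayID) was added")
        }
        if flags.contains(.removeFlag) {
            print("Display \(displayID) was removed")
        }
    }

    guard CGDisplayRegisterReconfigurationCallback(callback, nil) == .success else {
        print("Could not register the reconfiguration callback")
        exit(1)
    }

    print("Watching for display changes... plug or unplug the monitor now.")
    RunLoop.main.run()   // keep the process alive so the callback can fire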

     

    GL HF 

  • BlackRockWeb Level 1 (0 points)
    Dec 23, 2013 7:56 AM (in response to BlackRockWeb)

    Quick amendment: unplug the DVI cable and then unplug the monitor from the DVI cable... then shut it down...

    Once you are logged in and everything is loaded, plug in the DVI cable, and then connect the monitor cable to the DVI cable...

     

    Following this process is the only way I can get it to recognize my DVI cable.

     

    Display mirroring is still not showing up in my menu bar, but everything else is g2g at least...

  • TimCAL Level 1 (0 points)
    Dec 29, 2013 4:29 AM (in response to profputr)

    I have the same problem as BlackRockWeb. The original OS X Lion that came with my MacBook Pro recognized my external monitor (AOC e2343Fk) right away. This problem started after a software update that followed a year of no problems. I have no problems with my Windows laptop recognizing the monitor.

  • scottcf4 Level 1 (0 points)
    Jan 18, 2014 7:26 AM (in response to sterling r)

    Sterling, I have had this exact problem on several versions of OS X. It was always quite easy to fix by clicking the "Detect Displays" button in Display preferences. However, this is gone in 10.9. Can I still run this command via Terminal? Do you know why it was removed, or what its equivalent is? It was such a simple fix, and I am frustrated to see it removed.

     

    Thanks.

  • scottcf4 Level 1 (0 points)
    Jan 18, 2014 7:31 AM (in response to scottcf4)

    Actually, I just found this article about the Detect Displays command. Works great and is very easy! I am using DVI to Thunderbolt also, with an older Acer monitor.

    http://www.macobserver.com/tmo/article/os-x-10.8-reclaiming-detect-displays

  • williema14 Level 1 (0 points)
    Feb 21, 2014 2:05 PM (in response to KillDash9)

    Thank you!  I have no idea how you figured that out, but it worked!
