
Poor display depth when running Linux

I am running Linux Mint and OS X 10.8.2 on my Mac Mini (Late 2012). Here is a summary of the issues I am facing:


  1. One monitor does not seem to display anything when connected to the Mac Mini with an HDMI to DVI-D cable. I occasionally see a flash of what is on the screen. The monitor is not in standby, but it behaves as if it is displaying a black screen. When I use the Apple-provided HDMI to DVI adapter and connect to the same monitor with a DVI-D cable, everything works perfectly on both OS X and Linux.
  2. When I connect another monitor using an HDMI to DVI-D cable, everything works as expected as long as I am running OS X. But when I boot into Linux and start X, I see poor display depth on that same monitor. The X server reports a 24-bit display depth. I have switched between display depths and can see a difference between 8, 16 and 24, but what I actually get looks better than 16-bit and worse than 24-bit (see the attached image; the commands after this list show how I check the reported depth).
    Using the Apple HDMI to DVI adapter makes no difference.
  3. My Linux is installed on a separate physical drive. I can reproduce the poor depth with Ubuntu 12.10 Live and Mint 14. I boot into Linux by holding the Option key at startup and selecting the EFI partition of the second drive. I do not have rEFIt or rEFInd installed.
  4. My Mac Mini model is 6,2, so the firmware update at http://support.apple.com/kb/DL1616 will not install on this computer; it is meant for the 6,1 only. From reading threads on these forums, I thought that update might help.
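
For reference, here is roughly how I check what the X server actually reports (just standard X utilities, so the exact output names and numbers will differ on other systems):

    # show the default depth and the depths the running X server supports
    xdpyinfo | grep -i depth

    # list connected outputs, their modes, and per-output properties
    xrandr --verbose

On my system the reported depth comes back as 24 either way, yet the picture on that monitor still shows obvious banding.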


I am now at my wit's end on how to get the depth fixed.

Mac mini, OS X Mountain Lion (10.8.2), Late 2012 MacMini6,2 16GB i7-3720QM

Posted on Feb 19, 2013 10:07 PM

5 replies

Feb 20, 2013 4:34 AM in response to lordloh

Maybe you are waiting for the Linux community to improve support for the HD4000 GPU. Intel has already acknowledged that there are driver issues (on the OS X side) for this GPU, and it is quite possible that Ubuntu 12.10 reports the GPU as unknown.


Read this post; it suggests that Ubuntu 13.04 will ship with better support for the HD4000. Or try Fedora 18, which in my experience boots almost twice as fast as Ubuntu 12.10 and has a more aggressive update schedule between releases. In the meantime, get to know how to use xrandr.
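
As a starting point, something like the following is typical xrandr usage (the output name HDMI1 is only a placeholder; run xrandr with no arguments first to see what your system calls the ports):

    # list connected outputs and the modes the driver advertises for each
    xrandr

    # force a specific resolution and refresh rate on one output
    xrandr --output HDMI1 --mode 1920x1080 --rate 60

Nothing here is Mac mini specific; it will at least show you which modes the Intel driver believes each monitor supports.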

Feb 20, 2013 4:29 PM in response to lordloh

The unknown GPU is not an error; it is a status shown in the Ubuntu 12.10 System Settings > Details > Overview panel. It may or may not appear for your hardware.


The xrandr command-line utility helped me add custom display support to an Ubuntu installation a couple of years ago. The following man pages will help: xrandr(1), gtf(1), cvt(1). Searching for "How to use xrandr" will turn up examples, including multi-display setups. You will need your monitor's specifications for its vertical and horizontal frequencies and supported resolutions.
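
As a rough illustration of those man pages, the usual sequence is to let cvt (or gtf) compute a modeline and then hand it to xrandr. The resolution, refresh rate, and output name below are only examples; substitute the figures from your monitor's spec sheet and the output name from your own xrandr listing:

    # compute a CVT modeline for 1920x1080 at 60 Hz
    cvt 1920 1080 60

    # register the modeline cvt printed, attach it to an output, and switch to it
    xrandr --newmode "1920x1080_60.00" 173.00 1920 2048 2248 2576 1080 1083 1088 1120 -hsync +vsync
    xrandr --addmode HDMI1 1920x1080_60.00
    xrandr --output HDMI1 --mode 1920x1080_60.00

The long string of numbers is simply what cvt prints for that resolution; paste in whatever your own cvt or gtf run produces instead.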

Feb 21, 2013 6:18 PM in response to VikingOSX

Fedora displayed low color depth on both monitors. The System Settings > Details > Overview panel showed the GPU as "Intel Ivy Bridge". xrandr is not going to solve my problem, as it does not change color depth, and I am not using dual monitors; I am using one at a time and seeing different behavior on each.
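
In case it is useful to anyone else reading, the standard way to pin the depth (which xrandr cannot do) is DefaultDepth in an xorg.conf snippet, roughly like this; the file path and the driver name are the usual ones for a stock Mint or Ubuntu install with the Intel driver, not anything specific to this machine, and I am not claiming it fixes the banding:

    # /etc/X11/xorg.conf.d/20-intel.conf
    Section "Device"
        Identifier "Intel Graphics"
        Driver     "intel"
    EndSection

    Section "Screen"
        Identifier   "Screen0"
        Device       "Intel Graphics"
        DefaultDepth 24
    EndSection

Even with the depth pinned at 24 and X restarted, the picture on that monitor still looks worse than true 24-bit.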


The fact that one monitor works perfectly under Mint leaves me unsure whether this is a hardware issue or an X.org issue.

