Sep 12, 2014 6:36 AM, in response to Semson11, by brenden dv:
Hi Semson11,
If you are only seeing an option for half the maximum resolution of your Cinema Display (2560 x 1440), you may want to verify that you are getting a full dual-link DVI connection, as outlined in this older article about a similar issue:
Apple Cinema Display (30-inch DVI) might show 1280 x 800 as maximum resolution
https://support.apple.com/kb/HT3571
Regards,
- Brenden
-
Jan 24, 2016 9:32 AM, in response to Semson11, by chunez:
I have the same 27-inch Mini DisplayPort Cinema Display on Windows 8 and the same issue. If I boot my ATI graphics card in Windows, it maxes out at 1280x800; if I put the same graphics card into my Mac Pro, I get the full 2560x1440 resolution out of the same dual-link DVI port. So I know it is not a limitation of the card or the cabling, and I figure it has to be EDID-related. I messed around with CRU (Custom Resolution Utility) for a bit to no avail, and eventually came across something strange: if I used the Mini DisplayPort connector on my graphics card and attached this Monoprice adapter before the Kanex one, it all worked!
http://www.newegg.com/Product/Product.aspx?Item=9SIA8SV3808394
This probably isn't useful to most people, since they would just connect Mini DisplayPort directly to the Cinema Display if that were an option. In my application, however, I needed to run a 100 ft dual-link DVI cable to the remote display. Surprisingly, this all worked great with no signal degradation!
There has to be some special signal (EDID or otherwise) that Mac computers send out to advertise the full 2560x1440 resolution, and that the Monoprice adapter mimics. I have scoured Google but came up empty-handed. I'm sure the solution is simple, and I will update this thread if I ever figure it out.
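For anyone who wants to dig into the EDID angle: the resolution a display advertises lives in the 18-byte detailed timing descriptors of the EDID block (the first one starts at byte 54). A minimal sketch of decoding one, assuming a raw dump from a tool like CRU or `read-edid` (the sample bytes below are illustrative, not from an actual Cinema Display):

```python
# Sketch: decode the resolution advertised in an EDID detailed timing
# descriptor (the 18-byte block starting at offset 54 of the base EDID).
# The sample bytes below are hypothetical, not a real Cinema Display dump.

def decode_detailed_timing(desc: bytes):
    """Return (h_active, v_active, pixel_clock_mhz) from an 18-byte descriptor."""
    # Bytes 0-1: pixel clock in 10 kHz units, little-endian
    pixel_clock_mhz = int.from_bytes(desc[0:2], "little") / 100
    # Horizontal active pixels: byte 2, plus the upper nibble of byte 4
    h_active = desc[2] | ((desc[4] & 0xF0) << 4)
    # Vertical active lines: byte 5, plus the upper nibble of byte 7
    v_active = desc[5] | ((desc[7] & 0xF0) << 4)
    return h_active, v_active, pixel_clock_mhz

# Hypothetical descriptor advertising 2560x1440 (~241.5 MHz pixel clock);
# only the fields decoded above are filled in, the rest are zeroed.
sample = bytes([
    0x56, 0x5E,        # pixel clock: 24150 * 10 kHz = 241.5 MHz
    0x00, 0x00, 0xA0,  # H active 0xA00 = 2560 (high nibble in byte 4)
    0xA0, 0x00, 0x50,  # V active 0x5A0 = 1440 (high nibble in byte 7)
]) + bytes(10)

h, v, clk = decode_detailed_timing(sample)
print(f"{h}x{v} @ {clk} MHz")  # → 2560x1440 @ 241.5 MHz
```

Comparing the raw EDID the card sees through each adapter chain (Kanex alone vs. Monoprice + Kanex) against a decoder like this might reveal exactly what the Monoprice adapter changes.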
