Is 27" Cinema Display an 8-bit panel?

I was under the assumption that the 27" iMac used the same panel that's in the Dell (and that the 27" CD would as well). But now I read 16.8M colors. That's an 8-bit panel, not the 10-bit one in the Dell (advertised at over 1 billion colors).

So Apple is charging nearly the same money for nowhere near the same capability, with only a third of the warranty the Dell carries? Really? Is anyone buying these things?

Does this mean we can't expect 10-bit color depth and wide-gamut support in OS X anytime soon?

Apple, remember who kept you alive when JLG and JS were at the helm. You may need us again.

MBP, Mac OS X (10.6.4)

Posted on Sep 25, 2010 12:05 PM


Sep 25, 2010 1:37 PM in response to rhymbu

If you look at a detailed image with a gradual color change on two monitors side by side, one running in 8-bit and the other at a higher bit depth, most people can see the difference. As for the original question: if your video card does not have enough video RAM to run the screen at its native resolution, it may disallow bit depths higher than 8. You may want to check your VRAM and see if you have enough.
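For what it's worth, the raw framebuffer numbers are small; here's a rough back-of-the-envelope sketch (Python, with hypothetical pixel packings, ignoring the extra buffers a real driver allocates for double buffering, compositing, textures and so on):

```python
# Rough framebuffer sizes for a 2560x1440 panel (single buffer only).
width, height = 2560, 1440
pixels = width * height

for label, bits_per_pixel in [
    ("8 bits/channel, RGBA8888 (32 bpp)", 32),
    ("10 bits/channel, R10G10B10A2 (32 bpp)", 32),
    ("16 bits/channel scanout (64 bpp)", 64),
]:
    mib = pixels * bits_per_pixel / 8 / 2**20
    print(f"{label}: {mib:.1f} MiB")

# 8 bits/channel, RGBA8888 (32 bpp): 14.1 MiB
# 10 bits/channel, R10G10B10A2 (32 bpp): 14.1 MiB
# 16 bits/channel scanout (64 bpp): 28.1 MiB
```

So even a generously padded 2D framebuffer is a tiny fraction of the VRAM on any recent card; the limits tend to be elsewhere.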

Sep 26, 2010 10:17 AM in response to We on Motion

Cheers, WoM!

Most video cards since around the ATI X1000 generation (1300, 1600, 1900) have supported 10 bits per channel. They've had the memory to push 30" panels at native res, in 2D, for a while (256MB needed, if that). It's the display hardware that's had to catch up.

It's mainly just a comment on Apple putting one over on the consumer and forgetting the real professionals again. When you look at their quarterlies and the nasty margins they're making on everything, it's pretty obvious that you aren't going to get a 'good deal' on anything Apple. The 27" CD is just another example of a pig in lipstick. The Dell is far and away a better panel. And I just wish Apple would either step up and offer something comparable or cede the market. But if they did do something in the league of the Dell, going back to their margins, they'd charge us $5000. Charging 50x their cost is the only thing they know how to do (hence the $30 molded plastic Bumpers that run about $0.20 to produce and package).

Frustrated.

Sep 26, 2010 1:40 PM in response to Integr8d

The Apple 27-inch model is a fantastic value when you factor in the built-in camera, the audio, the screen size and the overall compatibility with the Mac ecosystem. Still, it's inferior to some other displays designed specifically for image quality. If I were buying a new display I would much rather have the LaCie 324i (10-bit) for about $250 more. It can be user calibrated, and its non-glare panel is a nice touch. Just depends on the priorities of the user, I suppose.

Regards,
Dan

Message was edited by: Digital Dude

Sep 29, 2010 2:20 PM in response to Digital Dude

Sorry if this is a bit off topic. How do you produce 10-bit images? All of my digital cameras spit out jpg files that are baked in 8-bit. They do have RAW mode, and I realize RAW might contain more than 8-bits/channel.

What software do you use to produce 10-bit images? Does Lightroom or Aperture do this?

Do 10-bit displays look better even when you are looking at 8-bit images on the Internet?

Oct 2, 2010 11:10 PM in response to Integr8d

Integr8d wrote:
..The Dell is far and away a better panel..


Well, you have obviously only seen it in photos.
The U2711's display surface is way too grainy for the resolution. The display is a cheap, crappy LG IPS panel, and the light bleeding is obvious and annoying.
Then there is a "sharpness" control (on the DisplayPort input!!).
Overall, it has nothing to do with professional quality; it's cheap crap.

By comparison, for the mass market the Apple display looks MUCH better. It is possible that the reason Apple uses gloss is that light bleeding from these crappy LG panels is less visible with a glossy finish (the polarizer in the panel is obviously discarded to push down the price).

Another thing: I strongly doubt the Dell U2711 has anything to do with a 10-bit panel. "1.07 billion colors" simply refers to 12-bit internal processing, which shouldn't even be mentioned if there is no hardware calibration option (and there isn't).

Oct 9, 2010 1:04 AM in response to gnagy

gnagy wrote:
How do you produce 10-bit images? All of my digital cameras spit out jpg files that are baked in 8-bit. They do have RAW mode, and I realize RAW might contain more than 8-bits/channel.

You produce >10-bit images very, very easily and commonly. For many years, professional film scanners have had 16-bit/channel capability, so many film scans are archived that way in order to preserve as much of the original film quality as possible. Mine sure are! It took a little longer for professional digital cameras to get there, but most digital SLRs currently produce 12- to 14-bit/channel raw files, as do even the pocket cameras that shoot raw. I just Googled some medium format digital backs and they are at 16 bits per channel now.
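As an aside, the color counts people quote fall straight out of the per-channel bit depth; a quick check in plain Python (assuming three color channels and no dithering):

```python
# Levels per channel and total RGB colors for common bit depths.
for bits in (6, 8, 10, 12, 14, 16):
    levels = 2 ** bits          # distinct values per channel
    colors = levels ** 3        # combinations across R, G and B
    print(f"{bits:>2}-bit: {levels:>6} levels/channel, {colors:,} colors")

#  6-bit:     64 levels/channel, 262,144 colors
#  8-bit:    256 levels/channel, 16,777,216 colors
# 10-bit:   1024 levels/channel, 1,073,741,824 colors
# 12-bit:   4096 levels/channel, 68,719,476,736 colors
# 14-bit:  16384 levels/channel, 4,398,046,511,104 colors
# 16-bit:  65536 levels/channel, 281,474,976,710,656 colors
```

That's where the 16.8 million and 1.07 billion figures in this thread come from.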

What software do you use to produce 10-bit images? Does Lightroom or Aperture do this?


Lightroom and, I think, Aperture operate at 16 bits/channel. In Lightroom you cannot set it any other way; it's always 16-bit, even if you toss in 8-bit files. The typical workflow is: you shoot your 14-bit raw file, open it, it's worked on at 16-bit, and then you export it at whatever specs you need (e.g. 8-bit JPEG or 16-bit TIFF). I assume Aperture is the same. This is why people want more than an 8-bit monitor: nothing else in the editing chain still stoops that low.
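Here is a minimal sketch of that export step (assuming NumPy), just to show how many distinct levels survive when 16-bit working data gets written out at 8 bits:

```python
import numpy as np

# A smooth 16-bit ramp, standing in for one channel of a 16-bit working image.
ramp_16 = np.linspace(0, 65535, 4096).astype(np.uint16)

# "Export as 8-bit": rescale and round, as a JPEG/8-bit TIFF export would.
ramp_8 = np.round(ramp_16 / 257.0).astype(np.uint8)

print("distinct 16-bit levels:", np.unique(ramp_16).size)   # 4096
print("distinct 8-bit levels: ", np.unique(ramp_8).size)    # 256
```

Nothing wrong with that for final output; the point is you only want to throw those levels away once, at the very end.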

Now, if you look at Photoshop, you can produce and do limited edits on 32-bit/channel images. 32-bit is absolutely required for full-range HDR masters created by merging multiple 16-bit files shot over a range of exposures. These are prepped so that the most desired levels are properly positioned for downconverting to 16-bit for standard editing, then 8-bit for output. Photoshop uses a special trick to display the levels, because you cannot see all 32 bits' worth of levels at once with the eye or on any monitor available.
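A toy illustration of the idea (this is not Photoshop's actual HDR pipeline, just the general principle): a scene spanning about 20 stops can't fit into 16-bit integers without crushing the shadows, so you keep a float master and decide where the levels land when you downconvert.

```python
import numpy as np

# Relative scene radiance spanning 20 stops (1 to ~1,000,000).
stops = np.arange(0, 21)
radiance = (2.0 ** stops).astype(np.float32)     # the 32-bit float "master"

# Downconvert: pick an exposure, then quantize to 16-bit integers.
exposure = 65535.0 / radiance.max()              # expose to protect the highlights
master_16 = np.round(radiance * exposure).astype(np.uint16)

print(radiance.min(), radiance.max())            # 1.0 1048576.0
print(master_16[:6])                             # shadows collapse: [0 0 0 0 1 2]
```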

This is just the way digital media works. In audio, the Compact Disc standard is 16 bits at 44.1 kHz, but the mastering process for a CD commonly happens at 24 bits and 96 kHz so you have enough headroom for clean edits. If you started at the final bit depth and sample rate, you'd be truncating and dropping data all over the place as you made edits. Same idea with images: if you want the best 8-bit output, an 8-bit source is inadequate (look up sampling theory). Professionals need headroom.
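To put a number on the headroom argument, here's a toy round trip (NumPy assumed): darken an image to 10% and brighten it back, once on 8-bit data and once on a 16-bit copy. The edit itself is arbitrary; it just shows how re-quantizing at each step eats levels.

```python
import numpy as np

ramp_8 = np.arange(256, dtype=np.uint8)        # an 8-bit gradient
ramp_16 = ramp_8.astype(np.uint16) * 257       # the same gradient, promoted to 16-bit

def round_trip(img, dtype, peak):
    # Darken to 10%, then brighten back: lossless in ideal math,
    # but each step re-quantizes to the storage bit depth.
    dark = np.round(img.astype(np.float64) * 0.1).astype(dtype)
    return np.clip(np.round(dark.astype(np.float64) * 10.0), 0, peak).astype(dtype)

print(np.unique(round_trip(ramp_8, np.uint8, 255)).size)      # ~27 levels survive
print(np.unique(round_trip(ramp_16, np.uint16, 65535)).size)  # all 256 survive
```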

Do 10-bit displays look better even when you are looking at 8-bit images on the Internet?


A 10-bit monitor shouldn't display 8-bit images any better. Someone generating or viewing only 8-bit JPEGs all day long is probably wasting money on a 10-bit monitor.

People who buy 10-bit displays are usually producing images for high-end advertising or very high resolution printing for magazines like National Geographic, where the photo editor needs to see absolutely every detail in their 16- to 32-bit files, as much as is possible anyway.

Oct 11, 2010 9:03 AM in response to Integr8d

Once again: yes, the 27" Cinema Display uses an 8-bit panel (LM270WQ1).

But NO, the Dell U2711 DOES NOT have a 10-bit panel!!
It uses the LM270WQ2 panel, which is 8-bit + A-FRC: an improved, but quite similar, form of the dithering that 6-bit panels use to display 8-bit colour.
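For readers who haven't met the term: A-FRC (frame rate control) is temporal dithering. The panel flickers a pixel between the two nearest real 8-bit levels so that its average over a few frames lands on an in-between value. A toy sketch of the idea (not LG's actual algorithm, and the real thing dithers spatially as well):

```python
# Toy FRC: approximate a 10-bit level using only 8-bit levels over time.
target_10bit = 601                      # desired level on a 0..1023 scale
target_8bit = target_10bit / 4.0        # = 150.25 on the 0..255 scale

lo, hi = 150, 151                       # the two nearest real 8-bit levels
frames = 60
shown = [hi if i % 4 == 0 else lo for i in range(frames)]   # 'hi' on 1 frame in 4

print(sum(shown) / frames)              # 150.25 -> the eye averages to the target
```

The panel never shows more than 256 distinct levels per channel at any instant; the extra "colours" exist only as an average over time.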

Now, Dell is a company which hasn't yet mastered making a monitor that displays gradients without banding. Even the new U2711 shows banding in sRGB mode. If we are talking here about 10-bit colour, let me laugh.

Then, the AG coating is much too grainy for the resolution. The factory calibration is in reality useless; colours are way off. Light bleeding is obvious and annoying. I would exchange that "10-bit" for an A-TW polariser any time. But it seems that important things are given up for nice numbers in advertisements. Surprisingly, Apple here is more honest; at least it isn't BS'ing the customers with meaningless numbers.

Finally, there are almost no consumer video cards that can output 10-bit colour, so you need to buy a Quadro or similar. What's more, the only interface that can really carry 10-bit is DisplayPort, and it is absolutely not clear whether the Dell U2711's DisplayPort input is even 10-bit capable.
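On the interface point, a rough bandwidth check makes the claim plausible (back-of-the-envelope, blanking intervals ignored, so real requirements are somewhat higher):

```python
# Approximate pixel-data rate for 2560x1440 @ 60 Hz.
pixels_per_second = 2560 * 1440 * 60

for label, bpp in [("8 bits/channel (24 bpp)", 24),
                   ("10 bits/channel (30 bpp)", 30)]:
    print(f"{label}: ~{pixels_per_second * bpp / 1e9:.1f} Gbit/s")

# 8 bits/channel (24 bpp): ~5.3 Gbit/s
# 10 bits/channel (30 bpp): ~6.6 Gbit/s
# Dual-link DVI tops out around 7.9 Gbit/s of pixel data and defines no 30 bpp
# mode; a 4-lane DisplayPort 1.1 link carries roughly 8.6 Gbit/s.
```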

Oct 22, 2010 7:09 AM in response to ron App

I don't know why you're getting so worked up. A few monitors are using that panel, according to tftcentral. They're all 10-bit, capable of DISPLAYING 1+ billion colors.

"If we are talking here about 10-bit colour, let me laugh." How can you laugh, when FEW applications support 10-bit output? (dobe Photoshop is on the bleeding edge of supporting 10-bit output. Odds are, you haven't seen 10-bit displayed on anything.

"Then, AG coating is much too grainy for the resolution." Completely subjective statement. I'm looking at two u2711's, in my friend's office, next door. 'Grain' is present. But it's also present in the Eizo CG243W that I'm typing on right now. I can count, on one hand, the number of complaints I've read about u2711 grain.

As for "important things are given up for nice numbers in advertisements," I can think of a certain fruit-themed computer manufacturer that was sued for claiming 16.8M color displayable on the LCD laptop panels. And don't get me started on the built-in $5 web camera that 'adds value'.

"Surprisingly, Apple here is more honest - at least, not bs' the customers with meaningless numbers." Umm, we're computer users. There's no such thing as a meaningless number. Just ask Steve Jobs.

You can search for the ATI Radeon X1000 series of video cards, dating back to the early 00's, and see 10-bit internal processing everywhere, including 10-bit DACs. The only limiting factors were DVI and software. DisplayPort gives 10-bit over the cable. If you were logical, you would've picked software as the weak spot for 10-bit color.

Oct 24, 2010 10:32 AM in response to Integr8d

Integr8d wrote:
Alien,
Most human eyes are capable of seeing well more than 16.8M colors...

Well, Judd, Deane B. and Wyszecki, Günter, in their book Color in Business, Science and Industry, mention 10 million. Do you have any other data? Have you done your own research, perhaps? Or is all your evidence just what you think, powered by hate against Apple?

..according to tftcentral. They're all 10-bit, capable of DISPLAYING 1+ billion colors.

Had you read that same TFT Central more carefully, you would have found that the panel in the U2711 is "8-bit + A-FRC".
Now, if you don't know what 8-bit + A-FRC is (which you obviously don't): that panel is capable of displaying 16.7 million colours, but through Frame Rate Control (temporal dithering) it simulates over 1 billion colours.

How can you laugh, when FEW applications support 10-bit output? Adobe Photoshop is on the bleeding edge of supporting 10-bit output.

How can you not laugh reading something like that? What kind of "bleeding edge" would that be, exactly?
If the OS does its job, there is no difference for an application between outputting in 8, 10 or 12-bit. But if the OS is a mess, the APIs don't work as they should and the drivers are buggy, then Adobe programmers can proudly say in the forum that "it is not as simple as you may think". Hardly anything is bleeding edge there, even though bypassing the OS is indeed not an easy task.

..'Grain' is present. But it's also present in the Eizo CG243W that I'm typing on right now. I can count, on one hand, the number of complaints I've read about u2711 grain.

Wow, if it's in an Eizo... dirty whites are now a "professional feature".
But wait, it IS the same cheap, crappy "8-bit + A-FRC" LG panel Dell uses.
And isn't it the same Eizo that was using Samsung PVA panels with terrible hue shift in its professional monitors?

I can think of a certain fruit-themed computer manufacturer that was sued for claiming 16.8M color displayable on the LCD laptop panels.

If you could actually think, you would have read the case materials before posting here.
And didn't it occur to you that maybe, precisely because of those idiots who brought that case, Apple might not be willing to use FRC panels, so as not to get involved in a similar case again? Or would you like to go and sue Dell this time?
Or maybe you do not even understand that 6-bit + FRC is the same kind of thing as 8-bit + A-FRC?

This thread has been closed by the system or the community team.

Is 27" Cinema Display an 8-bit panel?

Welcome to Apple Support Community
A forum where Apple customers help each other with their products. Get started with your Apple Account.