32-bit color

Why does "About This Mac" say that my screen has "32-bit color"? As far as I know, 32-bit means that there are 8 bits per color channel as well as an extra 8 bits for the "alpha channel". Is that right? Also, my understanding is that the alpha channel represents transparency, which is handled by the video card and never sent to the monitor. What exactly is meant by the 32-bit spec?

Posted on Jan 31, 2010 7:10 PM

Reply
7 replies

Jan 31, 2010 7:35 PM in response to JeremyLangford

I found this from another forum.

Color Depth - Most consumer panels are TN types with 6-bit color. 3 dots x 6 bits = 18-bit color. Look at your machine's display settings... if the number is 24 or bigger, the difference between 18 and that number is "dithered". Note 32-bit color is not really 32-bit; it's 24-bit color with 8 bits of "non-color data".

This leads me to believe that the 32-bit color referred to in "About This Mac" is either 24-bit color plus 8 bits of "non-color data" (which seems unnecessary) or dithered color.
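To make the dithering idea in that quote concrete, here is a rough sketch (the function name and scheme are my own, not anything from "About This Mac") of how a 6-bit panel could fake an 8-bit value by alternating between the two nearest levels it can actually show:

```python
# Hypothetical sketch: an 8-bit value shown on a 6-bit panel by flipping
# between the two nearest 6-bit levels over time (temporal dithering).

def six_bit_levels(value8):
    """Return the two nearest 6-bit panel levels (scaled back to the
    8-bit range) that bracket an 8-bit input value (0..255)."""
    lo = value8 >> 2          # truncate to 6 bits (0..63)
    hi = min(lo + 1, 63)
    return lo << 2, hi << 2   # scale back up to the 8-bit range

# An 8-bit value of 130 falls between the representable levels 128 and 132;
# alternating a pixel between them averages out to roughly 130 to the eye.
print(six_bit_levels(130))
```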

Feb 1, 2010 10:10 PM in response to Jeremy Langford

Now I get your point. Yeah, that does seem like a misnomer in a way.

In the video memory, 32 bits are set aside for a pixel's attributes, so the bit depth could be thought of as 32 bits per pixel from the perspective of the memory map. But, yeah, I agree with you that no more than 24 bits of RGB data get sent to the display.

On the other hand, those 8 remaining bits do influence the RGB data that gets sent, so it is a 32-bit structure in that sense. It all depends on how you look at it I guess.

Feb 1, 2010 10:53 PM in response to BSteely

Do any "real" 32-bit color systems actually exist? And, if so, which colour do they drop a bit from to get to 32? (3 x 11 = 33, 3 x 10 = 30) I see that according to the wiki, even those that aren't 3 x 8 + 8 bits of extras are usually 3 x 10 + 2 bits of padding:

"32-bit color"

"32-bit color" is generally a misnomer in regard to display color depth. While actual 32-bit color at ten to eleven bits per channel produces over 4.2 billion distinct colors, the term "32-bit color" is most often a misuse referring to 24-bit color images with an additional eight bits of non-color data (i.e., alpha, Z or bump data), or sometimes even to plain 24-bit data.

Systems using more than 24 bits in a 32-bit pixel for actual color data exist, but most of them opt for a 30-bit implementation with two bits of padding so that they can have an even 10 bits of color for each channel, similar to many HiColor systems.

(from http://en.wikipedia.org/wiki/Color_depth)
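To make that 30-bit-in-32 layout concrete, here's a rough sketch (function names are mine): 2 bits of padding, then 10 bits each of red, green, and blue:

```python
# Rough sketch of a 30-bit pixel in a 32-bit word:
# 2 bits of padding, then 10 bits each for R, G, B.

def pack_x2r10g10b10(r, g, b):
    """Pack three 10-bit channels (0..1023) into one 32-bit word."""
    return (r << 20) | (g << 10) | b

def unpack_x2r10g10b10(pixel):
    """Split a 32-bit word back into its three 10-bit channels."""
    return (pixel >> 20) & 0x3FF, (pixel >> 10) & 0x3FF, pixel & 0x3FF

pixel = pack_x2r10g10b10(1023, 512, 0)
print(unpack_x2r10g10b10(pixel))   # (1023, 512, 0)
print(2 ** 30)                     # 1073741824 distinct colors at 10 bits/channel
```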

Cheers

Rod

This thread has been closed by the system or the community team.

