After watching the same movie on a Sony VAIO laptop running Windows 7 and on my 15" rMBP, I recently noticed that the gamma value on the Mac is a bit higher. Because of this, movies are always a bit darker on the Mac (with any movie playback software), and in dark scenes especially I can't see much detail. I use MplayerX for playback, and I can adjust the gamma in its video tuner option; after doing that, the video is less dark and I can see detail in the dark scenes, which I like. However, when I watch trailers on Apple's website, QuickTime X has no control for changing the gamma. I'm interested in hearing how other people change the gamma value when they watch movies on their Macs. Also, can anything be done with QuickTime X so that movies become a bit less dark on the Mac?
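(For anyone curious what that gamma slider is doing numerically, here's a rough sketch, assuming the player simply applies a power curve to each normalized pixel value. `adjust_gamma` is my own illustrative name, not an actual MplayerX function.)

```python
# Rough sketch of a player's gamma control, assuming it applies a
# simple power curve per channel. adjust_gamma() is a hypothetical
# name for illustration, not a real MplayerX API.

def adjust_gamma(value, gamma):
    """Map a normalized pixel value (0.0-1.0) through a power curve.

    gamma > 1.0 lifts shadows and midtones; gamma < 1.0 darkens them.
    """
    return value ** (1.0 / gamma)

# A deep shadow tone gets noticeably brighter with a modest gamma boost:
shadow = 0.05
print(adjust_gamma(shadow, 1.0))   # unchanged
print(adjust_gamma(shadow, 1.2))   # lifted above 0.05
```

This is why a small gamma tweak reveals so much shadow detail: the power curve moves the darkest tones proportionally much more than the highlights.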
Could you comment on the choice of gamma = 1.8?
The gamma of 1.8 Apple used for many years was actually based on the printing density of the original Apple LaserWriter.
Just a suggestion here, since this is really a personal choice. Daylight white is really anything between 3000K (sunset/sunrise) and 9300K (a very bluish white). It's all a matter of which white-point temperature looks the most pleasing and natural to you.
Somehow, somewhere, it was decided that a 6500K white point and a 2.2 gamma constituted "normal" human viewing conditions. I'm sure pretty much everyone has noticed that 6500K is a very bluish white. In very few places on the globe do you see light like this (the Andes, for example). X-Rite has stated that the average, and also most commonly measured, white point worldwide is 5300K. So how did 6500K become the default? Who knows. One claim I heard in a broadcast was that it makes monitors stand out in the store, which is essentially the same thing flat-screen TV makers do. The defaults are ridiculously garish colors with overly bright screens; they do this to make the competitors' TVs look inferior and draw you to their product. Best Buy even has a service to come out to your house and set up your new TV for correct color viewing. What do they do? They back off the brightness, tone down the color, balance the gray to neutral (away from 6500K), and reduce the contrast so you can actually see shadow detail instead of it being crushed to black.
There is a point to that previous paragraph. The default in the printing industry is a 5000K white point and a 1.8 gamma. These values weren't picked out of the air: 5000K far more closely represents the measured white point of typical publication paper, and a 1.8 gamma keeps the inks in a range which can be printed without difficulty. If you really push it, you can get a density of 2.0 to print on some papers, but 2.2 cannot be done without putting down so much ink that it won't dry. Or it will eventually dry, but the paper gets so wet that it ripples. Newer papers do have a higher measured white point, but nothing looks even close to an illuminated 6500K screen, and paper can still only hold so much ink. Just for good measure, let's also throw in that no paper-and-ink combination can perfectly reproduce what you see on screen. Some colors, saturations, and luminosities simply cannot be printed.
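To make the 1.8-vs-2.2 difference concrete, here's a small sketch of my own, using the standard power-law display model (relative luminance = signal ** gamma). The same 50% signal comes out noticeably darker, and demands more optical density, at 2.2:

```python
import math

def displayed_luminance(signal, gamma):
    """Power-law display model: relative luminance of a 0.0-1.0 signal."""
    return signal ** gamma

mid = 0.5
lum_18 = displayed_luminance(mid, 1.8)   # roughly 0.29
lum_22 = displayed_luminance(mid, 2.2)   # roughly 0.22, visibly darker

# Expressed as optical density (-log10 of relative luminance), the 2.2
# curve asks for more midtone density than the 1.8 curve does:
print(-math.log10(lum_18), -math.log10(lum_22))
```

This is only an illustration of the tone curves themselves; the press-side limits the paragraph describes (maximum printable density, ink drying) are physical constraints on top of this.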
I use 5000K and a 1.8 gamma because 99% of the work I do is for the printing industry in CMYK color mode. For photographers, the suggested setting is 5500K (almost the same as the 5300K worldwide average) and a 2.2 gamma; that gamma is closer to what we can see, not what you can actually print. Expect printed output to be less dense than the screen looks, no matter what type of printer you're using.
Experiment a bit with what your prints look like compared to the screen. You may find a gamma of something like 2.0 gives you the best match between the two.
For a profile that actually means anything, hardware is a must. Here's why:
All Apple monitors deliberately lack RGB controls, or any presets for specific white points or gamma choices. The only thing you can control is brightness. Why do they do this? So that when you launch OS X's built-in calibration program, it can reset the monitor to a known brightness (most likely 120 cd/m²), and the video card's LUT (Look Up Table) to a 6500K white point and a 2.2 gamma (1.8 before Snow Leopard, when Apple switched to 2.2 as the default gamma). By doing that, the calibrator has at least an idea of what the monitor looks like without being able to actually measure what the screen is really displaying. It knows if you're pushing the color balance to the green side, applying a heavy midtone balance, etc. But it's doing nothing more than "guessing" at the results from a known starting point.
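The LUT being rewritten is essentially a per-channel table of output values, one entry per input code. Here's a simplified sketch of how such a table could be built for a target gamma (my own illustration of the concept, not Apple's actual calibration code):

```python
def build_gamma_lut(gamma, size=256):
    """Build a per-channel lookup table applying a 1/gamma power curve.

    With gamma = 1.0 this is an identity table; larger gammas lift the
    curve. This is (very roughly) the kind of table a calibrator loads
    into the video card's LUT to reshape the display response.
    """
    max_code = size - 1
    return [round(max_code * (i / max_code) ** (1.0 / gamma))
            for i in range(size)]

identity = build_gamma_lut(1.0)
lifted = build_gamma_lut(2.2)
print(identity[128], lifted[128])  # the 2.2 table lifts the midpoint
```

The key limitation the next paragraphs describe follows directly from this: the table only reshapes the signal the card sends out. It knows nothing about what the panel actually does with that signal.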
There are two really big problems with this.
1) All monitors drift. LCDs tend to drift to the pink side as they age. The colorants become weaker, so they lose saturation and brightness. Software solutions such as Apple's calibrator cannot account for any of this. They always assume you are starting at a perfect, predetermined brightness level, a 6500K white point, and the default gamma for your version of OS X. As your monitor drifts pink, you have to push your gray balance to the green side in order to achieve a visually neutral gray. The software doesn't know anything about the pink cast, so it concludes that you like your gray balance with a green hue to it, and your prints will come out that way.
2) Now throw in third-party monitors, with all kinds of buttons of their own which let you change things before you even get to the software. If you set the monitor itself to a 5500K white point, the software doesn't know this. When you then change the white point in the software to 5000K, it thinks you're starting at 6500K, so it subtracts another 1500K from where the monitor actually is, landing you somewhere around 4000K. Watch it go really orange.
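The compounding error in that scenario works out like this. (A toy calculation: real white-point adjustment is not a straight Kelvin subtraction, but it shows the direction and rough size of the mistake.)

```python
# Toy illustration of compounding white-point offsets. Real calibration
# math is not simple Kelvin arithmetic; this only shows how the software's
# wrong assumption about the starting point doubles up with your request.

monitor_actual = 5500    # white point set with the monitor's own buttons
software_assumes = 6500  # what software-only calibration assumes
user_requests = 5000     # white point chosen in the calibration software

offset = user_requests - software_assumes  # -1500K shift, applied blindly
result = monitor_actual + offset           # lands around 4000K: very orange
print(result)
```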
Software-only calibration is useless, and it gets worse the older the monitor gets. If you're serious about controlling the color you see on your monitor, a hardware/software solution is one of the cheapest investments you can make in color management. The i1 Display Pro is an excellent choice, as is the Spyder 4. You'll see there are three versions of the Spyder 4; the hardware is identical in each one, and what changes is what the software can do. Unless you're matching multiple monitors in a room to the same specs, the Elite version is overkill. I haven't looked at the specs very closely, but I would bet the Express version is too simple, as in it will likely restrict you to very basic white point and gamma choices. If you decide to go with the Spyder, I'd recommend the Pro version.