vidguy7 wrote:
pberk wrote:
How would you expect the camera to know when you are taking an indoor photo?
That's what auto white balance does. The same mechanism is in all video cameras and most still cameras. It measures the color temperature of the lighting where you are and adjusts accordingly. If it didn't know the color temperature, your outdoor pix would look just as bad.
The fact that it can take pix outdoors with good white balance shows the camera is capable of reasonably accurate colors.
That in fact is NOT how auto white balance works - what any camera needs to accurately calculate an auto white balance setting is a true white and/or a true black to use as a neutral reference, which not all scenes have; the camera then uses whatever it finds to adjust relative to a native white point that has been pre-defined. The native white point is what determines how warm or cool the whites in the photo will be rendered.
That is a separate function from exposure metering, the value readings that gauge how light or dark a scene is and determine whether the auto flash will fire.
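For anyone curious how that neutral-reference trick plays out in practice, here is a rough Python sketch of a generic auto white balance routine - not Apple's actual algorithm (which isn't published), just an illustration of the "find something neutral and scale it toward a chosen white point" idea. The function name, the white-patch/gray-world assumptions, and the percentile cutoff are all my own choices for the example:

```python
import numpy as np

def simple_auto_white_balance(img, method="white_patch"):
    """Illustrative auto white balance for an RGB image with values in [0, 1].

    The camera cannot measure the light source directly; it guesses a
    neutral reference from the scene itself:
      - "white_patch": assume the brightest pixels ought to be white.
      - "gray_world":  assume the scene's average color ought to be gray.
    A scene with no true white or black (or with a dominant color cast,
    like a light-yellow room) breaks both assumptions, which is why
    auto white balance can be fooled.
    """
    img = np.asarray(img, dtype=np.float64)
    pixels = img.reshape(-1, 3)

    if method == "white_patch":
        # Treat the brightest ~1% of each channel as the "white" reference.
        reference = np.percentile(pixels, 99, axis=0)
    else:
        # Gray-world: treat the per-channel mean as the neutral estimate.
        reference = pixels.mean(axis=0)

    # Scale each channel so the estimated neutral lands on the target white
    # point. Here the target is plain equal R=G=B; a real camera scales
    # toward a pre-defined "native" white point, which is what decides how
    # warm or cool the finished whites look.
    gains = reference.mean() / np.maximum(reference, 1e-6)
    return np.clip(img * gains, 0.0, 1.0)
```

The key point is the last step: whatever the routine estimates, the channels get pulled toward a target the manufacturer has baked in, and a scene with nothing genuinely neutral in it gives the estimate nothing reliable to anchor to.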
Outdoors, you are generally working with a color rendering index (CRI) of 100 - meaning colors across the visible spectrum are rendered essentially as they would be under ideal natural light. No artificial lighting (which is what I assume you are referring to when you say "indoors") has a CRI of 100. Artificial sources skew towards one end of the spectrum or the other. Incandescent lighting is a warmer light and will accentuate the yellows, oranges, reds, etc. Fluorescent lighting is a cooler source and will accentuate blues and greens. LED is a very cool, harsh light which tends to flatten out colors and also produces a photo that is more sensitive to ambient reflected light - it is also the only type of light source that draws little enough power to be fired repeatedly as a phone camera flash without totally draining the battery.
I have gotten excellent indoor photos (of my white bulldog) with the iPhone, both with and without the flash, with a very true white. On the other hand, I shot some photos with the phone at an event in a large light-yellow room, and except for the subject itself - which was lit directly by the LED flash and captured with fairly accurate color, if somewhat overly vivid blues - the rest of the scene had a decidedly yellow cast: the color reflecting off the smooth walls (which are many times more reflective than a surface with texture) was picked up with far greater sensitivity than you would notice with the naked eye.
All this to say that what the camera "sees" is not what the naked eye perceives - in some ways the camera is much more sensitive than the naked eye. The lighting conditions in a scene, the light reflectance values of the other colors present, and the type of surfaces reflecting light into the camera lens all play a role in how any camera "sees" color. This is a lot for even the most sophisticated cameras to handle; that's why, for example, we bounce flash off a white ceiling, or diffuse it, rather than aiming it directly at a subject.
I suspect that the native white point on the rear iPhone camera was warmed up considerably to compensate for the very cool LED flash, and that most of the indoor shots that come out yellow are being taken under incandescent lighting rather than the cooler, more energy-efficient CFLs and the like.
With simple, inexpensive cameras that operate only in automatic modes, certain decisions have to be made by the manufacturer, and they will not suit every lighting situation. That doesn't mean you can't take great pictures with such a camera; it means that in order to do so you have to learn to see what the camera sees and compensate for it when composing your shot - or, if you can't, use one of the many free or inexpensive post-production apps to fix afterwards what you couldn't correct at the time.