
bad pixels in the camera sensor

I've got an iPhone 4S, 64 GB (Verizon) and noticed a few bad pixels during video playback on our 42" TV (played back via the HDMI adapter). It's something that most people would not notice, including my wife as she watched the video with me, but I'm a long-time video/photo enthusiast, and imperfections like that stand out to me like a pimple on an otherwise perfect-looking face.


After several test videos and even photos, I am able to reproduce this pixel anomaly each and every time. There are three pixels, just slightly to the left of center, that show up incorrectly in a kind of obtuse triangular formation. Two of them favor white; the third is dark. It's not noticeable on the iPhone 4S screen, because not all of the pixels are shown there, but on a monitor that can render *all* pixels of 1080 HD video playback, they show up.


I'm certain it's not dust, as I've inspected and gently cleaned the camera window; not to mention, even a small speck of dust would not show up as a single pixel (or, in this case, what appears to be two pixels right on top of each other for each of the problem pixels). The anomaly shows up when lighter colors/gradients are present in that region of the frame; darker colors do not reveal the bad pixels. Just to be clear, it's NOT a screen display issue: the pixels are noticeable on our 42" TV during playback, and also on my Mac monitor and iPad screen (only when zoomed in on my iPad, which is possible during video playback using the stock photo app). The bad pixels jump around a bit because of the digital image stabilization (that should make sense to the more hard-core video enthusiasts), which further supports that it is a capture issue, not a video coding problem or anything like that. The issue is either with the image sensor itself or with the hardware's processing thereof.


My question: has anyone else observed similar behavior? If so, is there anything you're aware of that can be done to correct it? This is my second iPhone 4S; the first was a perfectly functioning 32 GB model (which did not have this issue) that I returned in favor of a 64 GB model because I was burning through space too quickly.


I'm probably going to have to go back and try to explain this to the good folks at the local Apple Store and try to get a replacement unit, but I'm not looking forward to having to show video samples and point out three teeny little dots that most people won't even notice but that drive me crazy (especially after dropping $400 plus tax and a two-year commitment). A huge selling point of the iPhone 4S was, of course, its superior photo and video capabilities. I understand they're not pro quality, but there shouldn't be any consistent pixel issues like this going on.


Sorry for the long post, but I'd appreciate some feedback, suggestions, input. Thanks.

iPhone 4S (64 GB, Verizon), iOS 5

Posted on Oct 30, 2011 11:02 AM

6 replies

Oct 30, 2011 11:43 AM in response to ascii-T

A few bad pixels are nothing new; they're found even on top-end DSLRs costing 8 to 10 times what your phone is worth, never mind the cost of the camera portion of your phone. What can be done with those cameras is that the bad pixels can be mapped out by the manufacturer's repair department. What you have to realize is that this is a phone with a camera in it, not a camera with a phone, and the likelihood of Apple mapping out the bad pixels would seem slim to none. You didn't buy a $700 point-and-shoot camera. However, you can always try.


BTW, displaying your images on an HDTV isn't going to show all the pixels the camera captured, as the image is downsized to 1920 x 1080 (not much different from a 2 megapixel camera), so I'm not so sure what you are seeing are bad pixels; what you think you see may be something else. If you were truly viewing all the pixels in the image, you'd only be seeing a small section of what was captured displayed on your TV.
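
To put rough numbers on that (assuming the commonly quoted 3264 x 2448 still size for the 4S; the video pipeline actually crops and scales the sensor rather than using the full still area), the math looks something like this:

    # Rough pixel-count comparison; 3264 x 2448 is an assumption about the 4S still size.
    video_pixels = 1920 * 1080      # 2,073,600 -> roughly 2.1 megapixels per 1080p frame
    still_pixels = 3264 * 2448      # 7,990,272 -> roughly 8 megapixels for stills
    print(video_pixels / 1e6, still_pixels / 1e6)

So a 1080p frame carries only about a quarter of the pixels a full-resolution still does.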

Oct 30, 2011 12:29 PM in response to pogster

Thanks for the feedback, pogster. I appreciate the well-informed perspective.


What I meant by displaying all pixels on the 1080p TV and computer monitor was that the video output, which is 1080 resolution, is displayed in a way that reveals all of the pixels that exist in the capture frames and output file. I understand that the 1080 video capture is not pixel-for-pixel what the image sensor is capable of capturing; it is essentially an on-the-fly downsampling of the image sensor that has also been cropped for digital image stabilization purposes. On that note, the video capture is far from pixel perfect in terms of image quality, nor did I expect it to be. I don't have unreasonable expectations from an all-in-one device (meaning I'm not expecting DSLR quality), but the persistent bad pixels are an eyesore for me that should not be present, and they were not present in my previous 32 GB model (which I now regret ever returning for more space; everything worked, and I should have left well enough alone!).


I have inspected, frame by frame, video taken with my previous 32 GB iPhone 4S, looking for similar anomalies. I found a lot of seemingly random white pixels that were not consistent with the image itself and appear as little more than random graininess during normal playback, something that is expected from devices like this, and likely from the best of the best, even if in less visible quantities (and something I am *not* concerned about). Again, my issue is with the consistent pixels that are visibly noticeable during playback when that area of the frame has lighter gradients displaying on it.


I know this is different from the "dead pixel" or "stuck pixel" issues sometimes present on display screens, but I've heard of tricks that can correct those (or at least the stuck pixels) and was hoping there might be some solution to do the same for the image sensor pickup. I don't want to turn into "that guy" by going back to the Apple Store and expressing my disappointment over things that 99% of people would never notice.


I've owned many video cameras (both stand-alone and multi-function devices) over the past couple of decades and have only ever come across this issue with one other camera of mine, a Hi8 video camera from 1997. It developed a bad pixel (on the CCD sensor or the electronics that process the image capture) after several years of frequent use. Nothing has ever produced pixel-perfect performance, and I don't expect that from my iPhone 4S. I'd be very happy with the video capture quality of the phone I returned, and that is what I'm looking for in the phone I'll be using heavily over the next two years.


Again, any tips, tricks, or advice on how to remedy this issue would be greatly appreciated. I'm not sure such a trick exists, but I wanted to pick the talented minds of fellow iPhone users before having to seek help in the form of a replacement from the local Apple Store.

Oct 30, 2011 2:19 PM in response to ascii-T

The problem with this type of communication is that what you are describing and what I think you are describing can sometimes be different, so my apologies in that regard. 🙂


Are the "bad" pixels (for lack of a better word) only present on video frames, or on stills as well? I'm not referring to the display screen but to the pixels on the sensor; I'm just trying to understand where the bad pixels are coming from. If they're only in video, then I'm wondering if it's software, either in-camera or post.


From a DSLR still-photography standpoint, hot (or dead) pixels can be mapped out by the manufacturer and/or corrected in post with software. If you have a few and want to retouch for an enlargement, editing software is a fairly simple way to do it. But when you've got hundreds of frames from a video, that's a little time consuming.
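
If you ever do retouch a single still yourself, it can even be scripted. Here's a rough sketch of the idea (my own, not anything Apple or these apps actually do), assuming the image is already loaded as a NumPy array and you've found the bad pixel's coordinates by inspection; the coordinates below are made up:

    import numpy as np

    def patch_pixel(img, y, x, radius=1):
        # img: H x W x 3 uint8 array; (y, x): coordinates of a known bad pixel.
        y0, y1 = max(0, y - radius), min(img.shape[0], y + radius + 1)
        x0, x1 = max(0, x - radius), min(img.shape[1], x + radius + 1)
        window = img[y0:y1, x0:x1].reshape(-1, img.shape[2]).astype(np.float32)
        # Median of the small neighbourhood, so the one bad value can't dominate.
        img[y, x] = np.median(window, axis=0).astype(img.dtype)
        return img

    # e.g. patch_pixel(img, y=540, x=880)   # example coordinates only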


I don't know if I have an answer for you, as I can't see the problem, only imagine it. As you are likely aware, a less dense (fewer megapixels) but higher-quality imaging sensor would produce better images, but more is better from a marketing standpoint. All things being equal, most people think 8 megapixels is better than 5 when it comes to cameras.

Oct 30, 2011 3:32 PM in response to pogster

pogster wrote:


The problem with this type of communication is that what you are describing and what I think you are describing can sometimes be different, so my apologies in that regard.

No worries. I agree; text-only communication can be a rather clunky way to convey complex issues without any visual or two-way live verbal means. I was a bit worried about being too wordy in an already wordy post and feared it might just be ignored if it was too long. Again, I appreciate your chiming in. You seem very knowledgeable in digital photography, which is exactly what I was hoping to find. 🙂


My description of "bad pixels" is a bit sloppy for what is really going on, but I think you understand what I mean by those words.


To answer your question, the offending pixels, or very, very small dots of imperfection, are present in both video playback and in still photos. After shooting several test videos against neutral and textured backgrounds to see how the bad pixels would appear, I also took several stills in burst mode using an app called Camera- against a wood-grain background, as the wood grain offered the most visible and consistent view of the blemishes. In motion video, the dots (bad pixels) are pretty easy to see on a large display, or when zoomed in during playback on my iPad 2's screen. In stills, it was trickier to identify them, because without the persistent motion of the background, the bad pixels almost blended in as random image noise. I was, however, able to identify with certainty the same bad pixels in most of the stills. Additionally, at the higher resolution of the stills, the offending pixels almost don't even need masking or removal with photo editing software. Like I said, without the constant visible background motion to make them stand out as irregularities, or the ability to do a frame-by-frame analysis (which I can do more easily with the video file using an app called VideoPix), the bad pixels are more difficult to identify.
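
For what it's worth, the consistency check I'm describing could be scripted rather than eyeballed. A rough sketch of the idea (my own; it assumes the frames have been exported to a grayscale NumPy stack, and the function name and thresholds are just illustrative): a pixel that deviates strongly from its 3x3 neighbourhood in most frames is a sensor-defect candidate, while random noise only trips the threshold in a few frames.

    import numpy as np

    def find_consistent_outliers(frames, threshold=30.0, min_fraction=0.8):
        # frames: N x H x W uint8 array of grayscale frames from the same clip.
        f = frames.astype(np.float32)
        # 3x3 box average of each frame = the locally "expected" value per pixel.
        pad = np.pad(f, ((0, 0), (1, 1), (1, 1)), mode="edge")
        local = np.zeros_like(f)
        for dy in range(3):
            for dx in range(3):
                local += pad[:, dy:dy + f.shape[1], dx:dx + f.shape[2]]
        local /= 9.0
        outlier = np.abs(f - local) > threshold   # per-frame, per-pixel deviation test
        fraction = outlier.mean(axis=0)           # how often each pixel trips the test
        ys, xs = np.where(fraction >= min_fraction)
        return list(zip(ys.tolist(), xs.tolist()))  # (row, col) of consistent offenders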


The importance of identifying the bad pixels in stills, even though they are practically unnoticeable there, is that it can now be determined that the bad pixels are not the result of some kind of video coding glitch, but rather are present in the hardware of the device itself: either a defective image sensor, or defective hardware that translates the data from the image sensor into the pixels used by the software for photos and video.


As part of the tests I ran, I captured video with an app called KingCamera, and it, too, produced the same results. KingCamera also has an option for locking the auto white balance and brightness, loosely similar to setting those manually on a higher-end, dedicated camera. Even with the white balance and brightness locked, the bad pixels were present, so the possibility that the issue was caused by an anomaly in the software's auto-settings management was put to rest.


In regard to being able to correct the bad pixels in stills: yep, easily done. BUT I shouldn't need to. In terms of correcting them during video playback, that *is* actually something I could probably work out a method of doing. I used to dabble in production videos that involved complex imaging filters, multiple composite video layers, and automated ways of identifying specific colors, contrast points, and patterns within images/video streams and altering them (such as cleaning up sloppy image mattes or locking onto visual patterns for the application of various video effects). Again, this is something I *shouldn't* need to do.
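
If it ever came to that, the video cleanup wouldn't need a full compositing setup either. A rough sketch of what I have in mind, using OpenCV as one example (the file names and coordinates are just placeholders, and the re-encode does cost some quality):

    import cv2
    import numpy as np

    def clean_video(src_path, dst_path, bad_pixels):
        # bad_pixels: list of (y, x) coordinates to repair in every frame.
        cap = cv2.VideoCapture(src_path)
        fps = cap.get(cv2.CAP_PROP_FPS)
        w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
        h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
        out = cv2.VideoWriter(dst_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))
        mask = np.zeros((h, w), dtype=np.uint8)
        for y, x in bad_pixels:
            mask[y, x] = 255                     # same mask reused for every frame
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            # Fill the masked pixels from their surroundings, then write the frame out.
            out.write(cv2.inpaint(frame, mask, 3, cv2.INPAINT_TELEA))
        cap.release()
        out.release()

    # clean_video("IMG_0042.MOV", "IMG_0042_clean.mp4", [(540, 880), (538, 884), (545, 882)])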


The still photo and video capabilities of the iPhone 4S are outstanding compared with what other similar devices offer (arguably better than, or almost as good as, the best competitor offerings, depending), but the quality is not DSLR, and like I said, I'm not expecting it to be. I do expect, however (and I don't think this is unreasonable), image rendering without persistent "bad pixels" showing, even if I am among the 1% of people likely to notice such image defects. They are, in fact, defects, and not "normal operation for a device of this caliber."


I was hoping there might be some trick to correct them, but I kind of figured there would not be. It seemed smart to at least reach out and see what other people might know that I'm missing. Again, I regret terribly my poor decision to return my perfectly functioning 32 GB iPhone 4S for the 64 GB model. Knowing a little about the very intricate technologies that make these things seem "magical," I understand that there are dozens or even hundreds of very complex little components, mostly stamped out by automated fabrication lines and then assembled, at least in part, by humans in a factory who are required to pump out x number of these things in short periods of time. I worried about the possibility of trading in my perfectly functional device for something that might be just a tad less perfect, purely because of the odds. My bad for not following my natural instincts on this one. I just didn't want to have to constantly manage space on my device. I failed to think through my initial choice of the 32 GB model given my trigger-happy photo/video nature; the cumulative files are much larger on this device than I was initially expecting (because I was just too excited to get my hands on one on October 14 and failed to do enough research and calculations). Days into owning my 32 GB 4S, I was already spending time, almost daily, re-rendering videos at 720 and 580 through iMovie, somewhat irritated about having to replace the original videos with reduced-resolution versions just to keep the videos live on my iPhone (sometimes archiving the originals on my Mac first). Anyway, enough of my rambling.


Thank you again for your input, pogster. If you have any other ideas, please share. I'm pretty sure I know what I have to do to feel happy with my investment, though. I'm just not looking forward to complaining about three bad pixels to a Genius Bar employee who's probably going to look at me like I'm one of those "never happy with..." people, three bad pixels that I won't even be able to show ON the actual iPhone itself; instead I'll have to bring my iPad 2 and show zoomed-in playback of the video footage.



