
iPhone 12 Pro Max lens flare

Love the cameras. But horrible lens flares.

And it seems it’s defective:

On the right, above the window: green flares from the tube lights to the left.


In the middle of the picture:



On the TV screen:



Will try it out during the day and post an update.

Posted on Nov 13, 2020 11:18 PM

451 replies

Feb 10, 2021 12:51 PM in response to merethe99

No one cares about that one little blue dot; that's normal when you shoot directly into the sun. The flare from the Max shows up even under street lights in town, and Snapseed shouldn't be needed. Apple needs to rethink this for new models: they push picture quality and low-light quality without checking how the flare behaves or trying to fix it, and they definitely have the technology to deal with it!! (Mate 10 picture)

Feb 10, 2021 1:04 PM in response to gtx279

Correct. If it were just the one dot caused by a bright subject, it would be manageable, as it is on DSLRs. However, the lens flare is horrible and unpleasant in almost any situation with lights in the frame, and it renders Night mode useless. Also, post-processing as an answer is unacceptable, especially since the same issues are present in video.

Feb 10, 2021 1:25 PM in response to scorproy

Back and forth, back and forth. People are landing here because they're experiencing a problem they haven't seen before, and they are not first-time iPhone users, so they have a reasonable expectation, based on experience, not to have this problem. Call it whatever you like and offer any kind of fiddling, angling, and post-editing of images as a solution, but there is no solution for videos, and the reason people are coming here still stands. They see an outstanding, obtrusive, distracting green artifact, and it is shocking and worrisome on their new, expensive iPhone. They think it is a problem, rightfully so. They aren't looking for an argument; they're trying to determine whether they have a defective new iPhone, and then they are extremely disappointed to learn that it is a new "normal" problem. Sure, it's a flare, but from our perspective it shouldn't be there. It might as well be green rubber ducks littering our images and videos. Let's call it that. We've got green rubber ducks in our pictures, and across all the iPhones we've had in the past, with a trove of images and videos in vast iPhoto libraries, we've rarely if ever seen green rubber ducks, nor had to do anything special to avoid capturing them.

Feb 10, 2021 1:36 PM in response to IdrisSeabright

Other phones do have flares, BUT normal, expected flares... the beautiful ones... not 300 dots on a photo and terrible reflections.

The phone is powerful, fantastic, and I love it. The lenses are also great, BUT they restrict me. I cannot shoot into light, I cannot shoot in the house because I get ghosting, and honestly I have not yet tried to shoot video at night... I think I will be crying tears there!

Keep in mind that I've been using iPhones since the iPhone 3G and have never switched since.

Feb 10, 2021 1:46 PM in response to merethe99

Yeah. Someone said earlier that they stand out more because of higher-grade glass lenses. Maybe. I don't know. I just know I don't see them in my library's images or videos from past phones. And with my new phone, it's so pronounced it's outright silly, and it doesn't occur only in Night mode, or at night, or in the dark. It occurs whenever there are light sources. I've shot the same scene in my living room with older iPhones, taking videos and pictures in all the different modes. My old phones were fine; the new one is like watching a green spaceship fly around. (Sorry, I mean green rubber ducks. I'm going to start calling them that now. They're as undesirable as any kind of artifact, but so obtrusive they make images and videos as ridiculous as any image or video littered with green rubber ducks.)

Feb 10, 2021 3:02 PM in response to scorproy

Well, Dogcow-Moof has convinced me. Lens quality is significantly higher, and this is why everyone is noticing the artifacts now. Older iPhone lenses just wash out the quality, so the flares are either not there or less pronounced. I'm not yet convinced that Apple cannot solve this with some math and live processing. They know all of the metrics on the lenses, they know the pitch and movement of the camera, and for indoor shoots they can probably even get ranges. They have all of the factors they need and the computational power to do it. Also, the color of these green flares is extremely consistent. The simplest transformation is to adjust those pixels. More complex computations could calculate the probable region of flares, look for outlier pixels, and normalize them. I think if that were an option in the camera app, everyone would be a lot happier, and it would truly push this phone's picture and video quality to its real potential and set it apart from the competitors that, as everyone points out, have similar artifacts.

Feb 10, 2021 4:16 PM in response to Michael Prescott

While it's true Apple uses some computational processing in their photography wizardry, that's more about taking huge amounts of data in low light and piecing together photos with clarity unheard of until very recently. But for Apple to decide what should or shouldn't be in a picture, and to remove data the lens sends to the sensor, would not, in my opinion, be a good option. What if the camera sees things and removes them, yet you didn't want them removed? And that CAN include flares in some situations. Personally, I don't want the camera making such decisions for me.


What they need to figure out is how to let enough light onto the sensor while redesigning either the lenses or the lens surrounds to minimize artifacts, and that's not an easy challenge given the tiny size of the lenses in a phone. On a DSLR, you would use a lens hood to minimize artifacts from very bright subjects. But you would NEVER use a lens hood at night, as it would reduce the amount of light hitting the sensor, and to be honest, DSLR cameras don't have the magical ability to do what an iPhone can do at night unless the camera is on a tripod.


We're dealing with limitations here. And I do get that many here feel let down by some of the results they're getting. But blaming Apple or the camera is not the answer. We have to learn how to use the photographic tools we have in our hands to take the best images we can and accept that some situations are going to result in recorded artifacts we don't want.



Feb 10, 2021 4:38 PM in response to bobneedham

bobneedham wrote:

It cannot be tied to Apple using a better lens. I shoot with m.Zuiko lenses, and I can assure you that the quality of iPhone lenses isn't in the same world as theirs. I don't have this problem with them. If it were about the quality of the lens, the flares should be far worse with my m.Zuiko lenses.

So do you use m.Zuiko lenses that are the same size as the lenses on an iPhone? Of course you don't. And I would expect those lenses to be appreciably better; they can cost more than an iPhone. And I imagine if you have a camera with dedicated lenses, you likely use filters on those lenses. For the money, you not only get a camera system, but you get a phone that makes and accepts calls, runs all sorts of apps, accesses the internet, and can do far more tasks than a dedicated camera system. You need to compare apples to apples, not apples to pears.

Feb 10, 2021 4:50 PM in response to lobsterghost1

I respectfully disagree with the computational-processing assertion. I think it can be improved to significantly and noticeably reduce, and even eliminate, the problem, and it doesn't have to be an always-on feature unless the user chooses. Most of the green spaceships / green rubber ducks / green flares that I see are transparent, consistently colored, and appear at consistent angles from light sources. Even without the angle, accelerometer, and light-source factors, the green pixels alone are often enough to identify the problem area, and for small points and diffusion areas, the surrounding pixels let you automatically calculate the best color replacements. Sure, the huge flares become a problem, but I've noticed in all of my images, and many of the ones posted here, that these points are consistent and small, and could be corrected even better than a person could in post-editing. Mixing in more intense processing that includes light-source locations, camera pitch, and other angles would indeed be intensive and shorten battery life, but again, it doesn't have to be always on, and it could be used to identify probable regions of artifacts, reducing how much of the image to process and therefore speeding things up in live preview. IF it were an option, I'd wager most consumers would leave it always on, given how pronounced these artifacts are. Prosumers could disable it and work their magic with angles and aesthetics.

I guess I'm thinking along these lines because I've recently been tinkering with machine learning code, which can easily identify objects in my scenes. Identifying these green flares should be trivial in comparison.
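To make the idea concrete, here is a minimal sketch of the "adjust those pixels" approach described above, written in Python with NumPy. This is purely illustrative, not anything Apple actually does: the green-dominance threshold and the 5x5 neighborhood window are made-up values, and `suppress_green_flares` is a hypothetical name. It flags strongly green-dominant pixels and replaces each one with the average of its non-flagged neighbors.

```python
import numpy as np

def suppress_green_flares(img, green_ratio=1.6, window=5):
    """Replace strongly green-dominant pixels with the mean of their
    local, non-flagged neighbors. img is an HxWx3 uint8 RGB array."""
    imgf = img.astype(np.float32)
    r, g, b = imgf[..., 0], imgf[..., 1], imgf[..., 2]
    # Flag pixels where green clearly dominates both red and blue
    # (+1 avoids a zero-division-style degenerate threshold on black pixels).
    mask = (g > green_ratio * (r + 1)) & (g > green_ratio * (b + 1))
    out = imgf.copy()
    h, w = mask.shape
    half = window // 2
    for y, x in zip(*np.nonzero(mask)):
        y0, y1 = max(0, y - half), min(h, y + half + 1)
        x0, x1 = max(0, x - half), min(w, x + half + 1)
        patch = imgf[y0:y1, x0:x1]
        keep = ~mask[y0:y1, x0:x1]          # only average clean neighbors
        if keep.any():
            out[y, x] = patch[keep].mean(axis=0)
    return out.astype(np.uint8)
```

On a real photo you would want a hue-based mask and smarter inpainting, but even this crude version suggests why the poster's idea is computationally cheap for small, consistently colored dots.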

Feb 10, 2021 5:10 PM in response to scorproy

Ooooo! I just made a discovery about switching which camera the phone is using: the green flares do NOT occur on the Ultra Wide (0.5x) lens! So, green flares on two of the cameras, at 1x or 2.5x, but not on the 0.5x lens. Now I'm back to wondering if this is a defect. So frustrating. I need to put it down for a while.

