iOS 26.0.1 cameras are blurry on iPhone 17 Pro Max and older iPhones.

I have been in contact with Apple senior support for over 3 days about the camera quality. I take photos of a book page (not macro, normal 1x), and only the area where the camera focuses is good quality; the rest of the text, the edges of the photo, and the content of the book are blurry/smudgy. I have only owned my iPhone 17 Pro Max for 5 days, and my six-year-old iPhone 11 Pro Max on iOS 18.5 outperforms today's iPhone, with much sharper photos and clarity across all sides and edges.

Not even senior support could tell whether it was a hardware or a software issue. They booked me in at an official Apple reseller, and even there they "couldn't see a problem" with the iPhone 17 Pro Max, yet it was pretty obvious. Then I tried the cameras on the demo iPhones at both the official Apple reseller and a retail store, and they all produced the same kind of photos I got. Even the employees' older iPhones (15 Pro Max and 13 Pro Max) had the same issue once they had the iOS 26 upgrade. So either please fix the issue as soon as possible, or recall the devices and refund them.


The Apple senior support team also tried to tweak some camera settings while I shared my screen on my iPhone 17 Pro Max, and they told me to factory reset my iPhone. Still, absolutely nothing changed.


I can provide photo evidence of how blurry the images are at the edges and how smudgy the text looks, and I would really like confirmation of whether the camera on the new 2025 device is supposed to be this bad, so that I can return it, as it didn't meet my expectations at all.

Posted on Oct 6, 2025 8:41 AM


Oct 6, 2025 2:18 PM in response to PlsFixMyProblem

Hi, this is a very simple issue of a concept called Depth of Field (DoF). DoF can be expressed as a mathematical formula. Rather than post a lot of math, you can see the formula and how it works here:


https://en.wikipedia.org/wiki/Depth_of_field


The concept, though, is fairly simple. There is a zone of acceptable sharpness/focus that extends both in front of and behind the exact point you focus upon. The further objects are from that point of focus, the blurrier they become.


You’re attempting to photograph a flat page in a book, but it’s not perfectly flat. Do you see the dip in the center of the book, along the spine? You’ll achieve better results using a ½” or ⅝” polished plate glass to hold the book flatter. Not perfect, but better than what you’re currently doing.


Why did the DoF change between older cameras and newer cameras? Apple made a design decision to improve the camera and the resulting images for the average photographer. One of the improvements was a larger sensor and a different lens design for the 24mm (1x) lens. The changes resulted in different parameters, and the DoF became narrower (fewer objects in focus both in front of and behind the point of focus). This is why the newer models will now produce images with less DoF when focused close. The newer models do produce better/sharper images with the 24mm lens for the kinds of shots the average photographer takes, such as portraits, seascapes, sunsets, sports, etc.
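If you want to see the narrowing in numbers, here is a small sketch in Swift using the standard near/far-limit formulas from the Wikipedia article above. The key point is that at the same framing and f-number, a larger sensor needs a longer actual focal length, so the zone of acceptable sharpness shrinks. The focal lengths, aperture, and circle-of-confusion values below are illustrative guesses on my part, not Apple's published specifications, so treat the output as a comparison of magnitudes only.

import Foundation

// Standard depth-of-field formulas (the same ones on the Wikipedia page above).
// All distances are in millimetres. The camera parameters passed in further
// down are illustrative assumptions, not official Apple specifications.
func depthOfField(focalLength f: Double,        // actual, not 35mm-equivalent
                  fNumber n: Double,
                  circleOfConfusion c: Double,  // scales with sensor size
                  subjectDistance s: Double) -> (near: Double, far: Double) {
    let h = (f * f) / (n * c) + f               // hyperfocal distance
    let near = s * (h - f) / (h + s - 2 * f)    // nearest acceptably sharp point
    let far  = s * (h - f) / (h - s)            // farthest acceptably sharp point
    return (near, far)
}

let bookDistance = 300.0  // roughly 30 cm from the page

// Older, smaller-sensor 1x camera (assumed values).
let older = depthOfField(focalLength: 4.3, fNumber: 1.8,
                         circleOfConfusion: 0.005, subjectDistance: bookDistance)

// Newer, larger-sensor 1x camera at the same framing (assumed values).
let newer = depthOfField(focalLength: 6.9, fNumber: 1.8,
                         circleOfConfusion: 0.0086, subjectDistance: bookDistance)

print("Older: sharp from about \(Int(older.near)) mm to \(Int(older.far)) mm")
print("Newer: sharp from about \(Int(newer.near)) mm to \(Int(newer.far)) mm")

With those assumed numbers, the older configuration keeps roughly 85-90 mm around the page acceptably sharp, while the newer one keeps only about 55-60 mm, so the sag along the spine falls outside the sharp zone sooner.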


Your options are to return the iPhone and purchase an older model, purchase a camera better suited for flat reproduction images, or modify your current technique and equipment for better results.



Oct 6, 2025 4:18 PM in response to PlsFixMyProblem

Your two sample photos illustrate part of the principle of DoF. You photographed two flat objects, and the flat objects are consistently sharp from edge to center.


Your original two objects were not flat and illustrated how DoF affects objects not in the plane of focus.


Your last two images do not show a DoF issue. If focused under identical conditions, using a tripod to eliminate camera motion, they illustrate that one lens is sharper than the other. It’s an issue of lens design, sensor design, and choice of materials. At the given distance, the iPhone 11 Pro Max is sharper than the 17 Pro Max.


But software can’t significantly alter the sharpness of a lens. You would need to put a different lens on the 17 Pro Max to achieve more resolution and sharper images at that distance.


So again, your options are to return the iPhone 17 PM and purchase an older model, purchase a camera better suited for flat reproduction images, or modify your current technique and equipment for better results. Unfortunately, improving technique will not achieve the results you’re hoping for.



Oct 12, 2025 11:12 AM in response to PlsFixMyProblem

I have to correct my previous post. Though I’m still not convinced by the quality of the camera chip (at least under low-light conditions), I found a simple and reasonable explanation for the blur. If you look at the file size of a 12 MP picture, it is around 1.2-1.4 MB, which is about 30-40% more compression than the typical 2.0-2.1 MB under previous iOS versions. Higher compression means more condensed data and more interpolation between data points, which causes the blur effect. Play with the settings, select different RAW modes, and you’ll see the difference. Also, keep complaining so they reduce the compression. ;)
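To put a number on that: going from roughly 2.05 MB to roughly 1.3 MB for the same 12 MP frame is 1.3 / 2.05 ≈ 0.63, i.e. about 35-40% fewer bytes per photo, which is where the 30-40% figure comes from. Bear in mind Apple doesn't publish its HEIC encoder settings, so this is an inference from file sizes, not a measurement of the actual compression parameters.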

Oct 13, 2025 8:39 AM in response to captainkirk24

iPhone 16 Pros were the same. There’s a 30+ page thread about blurry closeups buried in the community. No fix in over a year, because Apple doesn’t view it as an issue. Apple changed a lot when they moved from the iPhone 15 to the 16 models.


Apple could have made changes, but evidently chose not to. If you do a lot of closeup photography, provide feedback to Apple.


Product Feedback - Apple


Oct 24, 2025 11:06 PM in response to PlsFixMyProblem

Hi all, I'm new to this forum after looking for a thread about iPhone 17 Pro camera focus problems.


I take thousands of photos of buildings to use in Reality Scan to generate 3D models. I switched to using the iPhone 11 for this mostly because all the photos would be in focus, not something guaranteed with a traditional camera.


I recently upgraded from the iPhone 14 Pro to the iPhone 17 Pro for the three 48mp cameras and am now having severe focus issues with up to 30% of the photos out of focus.


After extensive testing (and more to be done) I think Jeff Donald may have the right answer but for all the wrong reasons!


If you look at this photo on Flickr:


https://www.flickr.com/gp/padraiccollins/0d9yu37473


This image has a severe depth-of-field effect (but absolutely nothing tack sharp) - this is completely software generated. I didn't ask it to do this, and in a series of photos taken by walking around the building, some are completely in focus and some are completely out of focus. Indeed, the centre of the "false" depth of field can be anywhere on the focal plane, to the point it sometimes looks like a rolling-shutter effect: half the image in focus and the rest out of focus, unrelated to the depth of the object.


This is computational photography at its very worst... software bug or software feature?


I believe at the heart of this is the "Fusion Focus" system interfering with my photography. I think this is what PlsFixMyProblem is also seeing. I get the feeling the software is using all the sensors (motion, LiDAR, all three camera sensors, etc.). This might be great, but it is currently out of control.


Today, under the Camera app settings, I switched the "Fusion Camera" option to "24mm only", and I am hoping this solves some of the issues, but an "off" option might be even better.


It would be useful to have a technical explanation of exactly what is going on "under the hood" for this and other features (Prioritize Faster Shooting?) so we can take back control of the cameras. The camera focus on the iPhone used to be sharp and reliable in conditions beyond those of a traditional camera.
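For what it's worth, third-party capture apps can take back some of that control through AVFoundation. The sketch below is only a guess at a workaround under the assumption that the automatic lens switching/focus fusion is the culprit; it is not a description of what Apple's Camera app does internally. It binds the session to the single physical wide (24mm-equivalent) module rather than the virtual triple camera that switches lenses on its own, and it locks the focus position.

import AVFoundation

// Sketch: use only the physical wide camera and lock focus, so neither
// lens switching nor continuous autofocus can change the image mid-shoot.
func configureFixedWideCamera(session: AVCaptureSession) throws {
    // .builtInTripleCamera is a "virtual" device that may switch lenses on
    // its own; .builtInWideAngleCamera is the single 1x module.
    guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                               for: .video,
                                               position: .back) else { return }
    let input = try AVCaptureDeviceInput(device: device)

    session.beginConfiguration()
    if session.canAddInput(input) { session.addInput(input) }
    session.commitConfiguration()

    try device.lockForConfiguration()
    if device.isFocusModeSupported(.locked) {
        // lensPosition runs from 0.0 (closest focus) to 1.0 (farthest);
        // 0.9 here is an arbitrary "buildings at a distance" guess.
        device.setFocusModeLocked(lensPosition: 0.9, completionHandler: nil)
    }
    device.unlockForConfiguration()
}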


For Jeff to suggest the depth of field is hard-wired and one should go back to an older iPhone is strange. The iPhone's depth of field is completely computational and should present little challenge to give the user some control. Melding multiple images from different sensors when one just wants a sharp image is not the way to go!





Oct 26, 2025 6:17 AM in response to Jeff Donald

Jeff, I'm sort of lost for words.


"Depth of Field will be governed by Laws of Physics and not Apple Engineers."


As Google will tell you, due to the iPhone's smaller sensor size, its effective depth of field is much larger than that of an equivalent large-sensor camera. The depth of field shown in the image I posted is impossible under the laws of physics on an iPhone. It is completely generated by software engineers.
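To put a rough number on that: for the same framing and print size, depth of field tracks the "equivalent aperture", i.e. the f-number multiplied by the crop factor. A phone main camera at around f/1.8 with a crop factor somewhere in the 3.5-6 range (illustrative; it varies by model) behaves, DoF-wise, like roughly f/6 to f/11 on a full-frame camera, so an optical rendering of a building ten feet away should be sharp essentially everywhere. Blur that heavy has to come from missed focus or from software, not from the optics.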


I've taken tens of thousands of photos with various iPhones. My focusing issues are not related to inexperienced use of the camera. Using the same process and techniques on an iPhone 14 Pro produces 99% sharp images, and this has dropped to under 70% on the iPhone 17 Pro, depending on the colour of the subject.


This isn't about holding the camera steady. Describing a studio process for taking sharp images does not reflect a real-world situation, where there are a number of changing variables.


The thing is, I don't want shallow depth of field. I want a sharp image across the whole photo. If I'm ten feet from a building, why is the iPhone presenting the image as if it were a product shot from a studio?


It's not about debating the merits of computational photography; something is "broken" in the iPhone 17 Pro focus system and how it then computes the image. The thousands of photographs I've taken in the last two days suggest the fusion focus system is getting confused, particularly if the subject matter is predominantly red.


Nothing to do with my old and shaky grip...


Oct 27, 2025 4:25 PM in response to PlsFixMyProblem

For the past month, I’ve taken photos indoors and outdoors with my 17 Pro Max and the image quality is worse than what I was getting on my 15 Pro Max before iOS 26 was released. I have no idea what Apple has done from a software standpoint that has taken an incredibly capable piece of hardware on paper and turned it into a digital camera from the mid-2000s. This isn’t a sudden inability to take a quality photo or user error; this seems to be over-aggressive processing by the software to “fix” images after they’re captured.


There are too many people reporting the same concerns for this to be down to a handful of phones with bad cameras. I'm hoping for a software update that dials down the processing of images and restores the better quality present on older phones with "inferior" sensors and lenses.

Oct 27, 2025 4:55 PM in response to Senator48

As someone who's been involved with iPhones since 2007 and photography in general for almost 50 years, don't expect much change from Apple. Every year a small segment of the iPhone camera community voices its concerns about the over-processing of images. I've witnessed this since about the iPhone 7.


The solution is simple, and it won't come from Apple. Instead, look to third-party camera apps and their developers. Apps such as Halide, Reeflex, Leica LUX, and ProCamera will do minimally processed RAW files. No computational photography. The ones I mentioned above all charge a subscription fee or a high lifetime purchase fee. However, there is one free app, FotoGear, that does RAW files with little to no processing. But if Apple hasn't changed in almost 10 years, I wouldn't hold out for them starting with the iPhone 17 Pro models.
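If anyone is curious what "minimally processed" means in practice, here is a bare-bones sketch of the capture path apps like those use: ask AVCapturePhotoOutput for a Bayer RAW (DNG) frame instead of a processed HEIC. It assumes you already have a running AVCaptureSession with an AVCapturePhotoOutput attached and a delegate object; the function name is mine, not from any of those apps.

import AVFoundation

// Sketch: request an unprocessed RAW (DNG) capture instead of the default
// processed HEIC. Assumes `photoOutput` is attached to a running session
// and `delegate` implements AVCapturePhotoCaptureDelegate.
func captureRaw(with photoOutput: AVCapturePhotoOutput,
                delegate: AVCapturePhotoCaptureDelegate) {
    guard let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes.first else {
        return  // this device/session combination does not offer RAW
    }
    let settings = AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}

In the delegate callback, photo.fileDataRepresentation() hands back the DNG bytes, which you can write to disk and develop in any raw converter, bypassing most of the in-camera processing.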


If you want to switch, you might consider some of the Chinese models or the Google Pixel 10 models. The Pixel does computational photography, but most photographers feel it does less than Apple or Samsung. Overall, the consensus is that Samsung does the most computational photography, and it's notorious for digital hallucinations.

Nov 25, 2025 3:53 AM in response to freddy2013

I’m experiencing exactly the same issue: the main 24mm camera suffers from severe “shutter shock”, resulting in blurry images at around 1/60 s. It seems the new sensor-shift OIS is conflicting with the autofocus mechanism, or perhaps there’s some resonance between the natural frequency of the image stabilizer and the autofocus lens, causing blur at that shutter speed.


This exposure setting is very common under indoor lighting, which makes it incredibly frustrating.

While depth of field naturally blurs the edges in macro shots, this “shutter shock” affects the entire image.
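If the 1/60 s theory is right, one experiment a third-party capture app makes possible is pinning the exposure duration away from that shutter speed and seeing whether the blur disappears. This is purely a diagnostic sketch under that assumption, not a fix, and it only applies inside an app you control, not to the stock Camera app:

import AVFoundation

// Sketch: lock the exposure duration to ~1/120 s (clamped to what the current
// format supports) while keeping the current ISO, to test whether the blur
// is tied to the ~1/60 s shutter speed.
func pinShutter(on device: AVCaptureDevice, seconds: Double = 1.0 / 120.0) throws {
    try device.lockForConfiguration()
    let requested = CMTime(seconds: seconds, preferredTimescale: 1_000_000)
    let clamped = max(device.activeFormat.minExposureDuration,
                      min(requested, device.activeFormat.maxExposureDuration))
    device.setExposureModeCustom(duration: clamped,
                                 iso: AVCaptureDevice.currentISO,
                                 completionHandler: nil)
    device.unlockForConfiguration()
}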


Interestingly, not all iPhone 17 Pro Max units have this problem. I tested some demo units in the Apple Store, and they seemed fine. However, others - including mine - did exhibit the issue. Even after switching to a refurbished 17 Pro Max, the problem persisted, so it may be a batch-related defect.


I deeply regret trading in for the 17 Pro Max. It’s quite disappointing.

Nov 28, 2025 8:35 PM in response to PlsFixMyProblem

I pre-ordered the iPhone 17 Pro Max with such hype and interest, but the camera isn't working to my expectations and I am disappointed. Screenshots are blurry, and I have tried several times to figure out why, but to no avail. I was booked into the Bromley Apple Store a fortnight out, but nothing has changed. I don't want my new iPhone to go in for repairs and would prefer to return it for a refund.

Dec 2, 2025 7:44 AM in response to PlsFixMyProblem

After a full reset/reinstall and full system diagnostics performed by an Apple Genius Bar tech, the scans returned nothing wrong; only the actual picture taking showed the problem. They offered to replace the camera module, since they too were able to replicate the blurry photos in the back room. I declined having a brand-new 1700 phone cracked open. I called Apple customer service, and they are sending me a replacement phone; I will come back here and update. I recently went on a trip to Disney and 90% of the pictures are blurry.
