
iPhone 14 Camera is Blurry

My brand new iPhone 14 Pro Max's camera is blurry whenever I attempt to take close-up pictures in third-party apps like Amazon, Target, and Walmart (i.e., barcode scans). The phone is brand new out of the box and updated to the latest iOS (16.0.2), and I have reinstalled the apps.


Anyone else with this issue?


[Re-Titled by Moderator]

iPhone 14 Pro Max

Posted on Sep 24, 2022 9:03 AM


162 replies
Question marked as Top-ranking reply

Jan 7, 2023 12:46 PM in response to Sisyphean_task

All I can say is I've had iPhones for over 10 years, and I've never had issues with the camera, in both native and third-party apps, like I'm having now with the new iPhone 14 Pro. Look at the picture attached. I took it this morning using the native Camera app; unfortunately, due to the focal distance, the iPhone could not figure out which camera to use, kept switching back and forth, and ended up with a blurry picture. That is an issue with iOS that did not exist before this version of the iPhone. I am seriously considering going back to an iPhone 13, as these issues did not exist with that phone.

Nov 19, 2022 2:42 PM in response to Krid63

You are correct: there is usually no hardware issue, as the camera works as designed; third parties need to update their apps, as stated.


There are several third-party apps that work perfectly scanning bar codes on the 14 Pro Max, as does the QR scanning ability that is now part of the Camera app.


Scan a QR code with your iPhone, iPad, or iPod touch - Apple Support


The Code Scanner that Apple provides in Control Center also works perfectly. The key is not to try to force the code to fill the square, but rather to back the phone up until the QR code is in sharp focus; the scanner will then read it instantly.


This obviously won't work, but backing the phone up a few inches (several cm) until the code is in sharp focus will, every time.



If it doesn’t work for you, then there may be something wrong with your device; I recently visited an Apple Store and it worked with every iPhone 14 they had on display, too.


The interesting thing is that the Camera app will switch to the macro lens to focus more closely, whereas the Code Scanner instead requires you to hold the phone farther away so it can focus.

Dec 26, 2022 2:18 AM in response to EtaoinShrdlu

EtaoinShrdlu wrote:

Nope.
The very relevant point is: “If you are using [the] camera to read them […]”
If you have to use a third-party app or Safari it does not work reliably.
I repeat: If you have to use a third-party app or Safari it does not work reliably.
I repeat again: If you have to use a third-party app or Safari it does not work reliably.


  • Camera reads every QR code I've tried.
  • I'm unaware of how you would use Safari to read a QR code.
  • Many third-party apps read QR codes without issue; those that don't need to be fixed.


As an example, download the free app Scandit - I've yet to see it not read a QR code.




Oct 7, 2022 4:35 AM in response to QuadrupleThat

It's not my responsibility to read Apple disclosures on their APIs and upcoming new features to determine what may or may not work on future revisions of hardware and in future versions of iOS.


Further, even if I could link to the information, it may be information requiring that you sign up with a developer account to read.


But as an example, new to iOS 16 is the maxPhotoDimensions property, which must be one of the values defined by the supportedMaxPhotoDimensions property:


The dimensions you set must match one returned by supportedMaxPhotoDimensions for the current active format.

Apple Developer Documentation


These settings are new to iOS 16, and the previous method of checking whether "high resolution" capture is supported has been deprecated as of iOS 16:


isHighResolutionCaptureEnabled

A Boolean value that specifies whether to configure the capture pipeline for high resolution still image capture.

Apple Developer Documentation
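
For anyone following along in code, a minimal Swift sketch of using those two properties together might look like this (the function name and setup are illustrative assumptions, not from any particular app):

import AVFoundation

// Sketch: pick the largest photo dimensions the current active format supports (iOS 16+).
// Assumes `device` is the capture session's video device and `photoOutput` is an
// AVCapturePhotoOutput already added to that session.
func applyMaxPhotoDimensions(device: AVCaptureDevice, photoOutput: AVCapturePhotoOutput) {
    let supported = device.activeFormat.supportedMaxPhotoDimensions
    // maxPhotoDimensions must be one of the values the active format reports.
    if let largest = supported.max(by: { $0.width * $0.height < $1.width * $1.height }) {
        photoOutput.maxPhotoDimensions = largest
    }
}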


Oct 24, 2022 9:29 AM in response to JDM76

ProCam, for one, allows you to manually select which lens you want to use, and it focuses just fine when you select the ultra-wide lens, allowing you to fill as much as half the screen with a barcode that measures 1 5/8" x 3/4" on paper. So it is possible for third-party apps to request the proper lens based upon what they need in terms of resolution and size.


The issue is apps are apparently just asking for access to the camera rather than properly following the specified API to ask the phone which cameras are available and selecting the one with the shortest possible minimum focus distance.


That is a technique that was recommended with the release of iOS 15, but apparently app authors just got away with not doing it on iPhone 13 and it's biting them now.


What’s new in camera capture - WWDC21 - Videos - Apple Developer
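
A rough Swift sketch of that approach (names are illustrative; assumes iOS 15 or later so minimumFocusDistance is available):

import AVFoundation

// Sketch: ask the phone which rear cameras are available and pick the one
// that reports the shortest minimum focus distance.
func closestFocusingRearCamera() -> AVCaptureDevice? {
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.builtInTripleCamera, .builtInDualWideCamera,
                      .builtInUltraWideCamera, .builtInWideAngleCamera],
        mediaType: .video,
        position: .back)
    // minimumFocusDistance is in millimeters; -1 means the value is unknown.
    return discovery.devices
        .filter { $0.minimumFocusDistance > 0 }
        .min(by: { $0.minimumFocusDistance < $1.minimumFocusDistance })
}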



Nov 15, 2022 3:01 AM in response to southernboyuk

Yes, and that's explained here:


About the Camera features on your iPhone - Apple Support


Again, for the 14 Pro and Pro Max, the normal 1x "wide" lens cannot focus on anything closer than about 8" away.


To focus on something closer, the "ultra wide" macro lens must be used.


If you are taking a photo yourself, the Camera app should auto-switch as described at the link above.


In a third-party app, the app is responsible for either switching lenses or reading barcodes from farther away using the 1x "wide" lens - the Amazon and Walmart apps, to name two, already can, as can multiple other third-party apps.


It's not a defect; it's by design and due to the limitations of optics and lenses.



Nov 16, 2022 4:29 AM in response to southernboyuk

Which is funny, because the Scandit app in the App Store can read driver's license codes, so that puts the blame squarely in Adobe's court.


I don't know if apps or the API are supposed to switch lenses, but for example the Amazon app can read every barcode I've pointed my 14 Pro Max at, seemingly without switching lenses, and it may be that the barcode API can return a favorable result even if the barcode is 8" away from the wide (1x) lens. Likewise, many apps that need to read QR codes, like Qrafter and the Walmart app, also work fine, reading whatever I point them at without any difficulty.

Nov 16, 2022 11:14 AM in response to stevenbad

Maybe this will help some developers; I've found the issue, at least on my end.


TL;DR: My previous hunch was correct. If you're not developing using native Swift/Obj-C, you might be looking at needing an entire framework change to accommodate the Triple Camera.


Here's a quick fix if your app is a native iOS app:

Using AVCaptureDeviceDiscoverySession, list out all of your cameras. AVCaptureDeviceTypeBuiltInTripleCamera should be one of them if you're on an iPhone 14 Pro; usually, this is index 7. You need to select this rather than the "rear camera" (index 0) when starting up your capture session in order to use the Triple Camera functionality. It should auto-handle focusing and swapping between the lenses (you can actually select only the Macro Lens / ultra-wide lens as well, but that removes the ability to let iOS's backend handle the lens swapping).
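
In Swift terms, that quick fix might look roughly like the sketch below (illustrative only, not the poster's actual code):

import AVFoundation

// Sketch: request the Triple Camera virtual device instead of the default rear camera.
// On an iPhone 14 Pro this virtual device handles macro/lens switching on its own.
func makeTripleCameraInput() throws -> AVCaptureDeviceInput? {
    guard let triple = AVCaptureDevice.default(.builtInTripleCamera,
                                               for: .video,
                                               position: .back) else {
        return nil  // Device has no triple camera; fall back to the wide camera.
    }
    return try AVCaptureDeviceInput(device: triple)
}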


Here's why it's bad if you're not developing for native iOS (e.g., Flutter/Qt/Xamarin/React Native):

I'll use Qt as an example, since that's what is giving me trouble. The LTS version of Qt 5 has enumerations for only the "front" and "rear" cameras. When you use Qt to access the available cameras, it only lists Apple's designators for those two cams (indices 0 and 1) and will only let you choose from those two lenses. If I try to force my camera object to use a device ID like the Triple Camera's, it falls back into the default "rear camera" state, presumably because Qt 5 doesn't support controls for it. This means I simply cannot use the Triple Camera through Qt. It seems like I would have to create an entirely new interface that pulls the camera directly from Apple's native camera APIs, or change to another framework. There are a lot of reasons one could infer for why neither solution is desirable for a business with an established codebase.


Check if your framework supports using Macro Lens/ Triple Cam device ids.


What could Apple do to mitigate this?

I'll be honest, as a dev myself it is frustrating that the Triple Camera isn't just interfaced to the "rear camera" device ID by default on the iPhone 14 Pro. I cannot imagine a scenario where an app that uses the camera in a one-dimensional way (i.e., an app that isn't used to gain extreme control over the camera) would not just want to use the Triple Cam like the default one. Releasing a new phone with an incredibly long minimum focus distance that requires special control to use up close is, in my opinion, bad foresight considering most APIs use the "general-purpose" rear camera accessor.


You may be asking "Why did no one care about this with the iPhone 13 Pro?" and that's because the minimum focus distance for the iPhone 13 Pro's normal camera was short enough that it didn't break document scanning apps. No one noticed it then, so no one thought to try and swap to the Triple Cam in their apps. But now that the 14 Pro increased that distance so much, it seems like the solution should be "let people who try to use the general purpose rear camera on the iPhone 14 Pro have their API calls redirected to the Triple Camera."



Nov 16, 2022 4:02 PM in response to pechman146

What you cite is the issue, as you shouldn't be choosing a camera at all.


Instead you should tell the device what you need in terms of resolution and allow the device to choose.


I outline the basics here:


The issue is apps are apparently just asking for access to the camera rather than properly following the specified API to ask the phone which cameras are available and selecting the one with the shortest possible minimum focus distance.

That is a technique that was recommended with the release of iOS 15, but apparently app authors just got away with not doing it on iPhone 13 and it's biting them now.

iPhone 14 Camera is Blurry - Apple Community


Nov 16, 2022 4:05 PM in response to southernboyuk

southernboyuk wrote:

Above my head, I just want it to work as previous models did. Let’s hope Apple are all over this!


They are; they told developers how they should be using the APIs, and some just didn't follow the rules properly.


southernboyuk wrote:

Yes I'm doing that, however, when you need to use 3rd party apps (as previous) the 14 Pro Max has a major design problem when taking close-up photos of, e.g., driving licences, which must fit into a certain-sized app frame.


The device doesn't have a problem; the way third-party apps are using the device is incorrect.


If you don't believe me, download the free Scandit app I mentioned earlier and you will see it has no issues scanning things.

Nov 16, 2022 9:43 PM in response to abjacobs

Then make an appointment to have your phone examined:


Contact - Official Apple Support


I don’t have some mythical device that is the only one that works somehow.


I’ve explained how and why things work and why many apps work but some don’t.


If, however, you are expecting your iPhone 14 Pro/Max's 1x lens to focus on something 3" away, that's simply not going to happen, except in Camera if Auto Macro is enabled, or if you select the Ultra Wide (0.5x) lens yourself when it's not.


You may not like my answers, but Apple can’t fix something that is normal.

Nov 17, 2022 7:23 AM in response to Dogcow-Moof

To be clear, I am using the "available cameras" API you describe. It offers me a choice of the cameras that are available via AVCaptureDeviceDiscoverySession, like "Ultra Wide," "Macro," and "Triple Cam" (which is, even if it's hard to believe, one of the cameras offered on the phone as a "camera"), and I'm trying to choose the one with the shortest minimum focus distance (which I can check using the properties of the device). I just keep getting forced back to the normal rear camera. I'm following Apple's directions for choosing a camera, and my choice only sticks when I use native iOS development tools in Objective-C, not my framework. With this API you can't choose by resolution; the parameters offered to the developer are "Camera Type," "Media Type" (e.g., image capture, audio capture, video capture), and "Position" (front, back).


I'm almost 100% certain I've narrowed it down to the exact problem: my framework doesn't support multi-camera control, so I need to update the framework. But others with up-to-date frameworks should just be able to select the Triple Cam if it's supported.


I'm a little confused: in this reply you said "you shouldn't be choosing a camera at all," but in the reply you reference you say I should "select [from the available cameras] the one with the shortest possible minimum focus distance," which is exactly what I'm doing and am able to do in native Xcode but not in Qt 5. Is there a difference between "choosing" and "selecting" here?


Nov 17, 2022 12:24 PM in response to Dogcow-Moof

By "not choosing cameras" I meant you shouldn't select based upon what you (as the developer, not as your app) know about the available cameras, but rather based upon characteristics reported by the API. That way, whether the device has one camera or twelve, you will get the most appropriate one.


This is different than the old approach and may be why frameworks have yet to catch up, and why native iOS development is always best but obviously isn’t always possible.


To quote the video (this is all getting too technical for a general non-developer thread):


That's where the new minimumFocusDistance property of AVCaptureDevice comes in. It's new in iOS 15. Given the camera's horizontal field of view, the minimum barcode size you'd like to scan -- here I've set it to 20 millimeters -- and the width of the camera preview window as a percentage, we can do a little math to calculate the minimum subject distance needed to fill that preview width. Then, using the new minimumFocusDistance property of the camera, we can detect when our camera can't focus that close and calculate a zoom factor large enough to guide the user to back away. And finally, we apply it to the camera by locking it for configuration, setting the zoom factor, and then unlocking it. After recompiling our demo app, the UI now automatically applies the correct zoom amount.

As the app launches, it's already zoomed to the correct space. No more blurry barcodes!
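
For readers who want to see it in code, here is a rough Swift sketch of the calculation the session describes; the 20 mm code size and the preview fill fraction are example values, and the names are mine, not Apple's sample code:

import Foundation
import AVFoundation

// Sketch: if the camera can't focus close enough to fill the preview with a barcode,
// zoom in so the user is guided to hold the phone farther away.
func applyRecommendedZoom(for device: AVCaptureDevice,
                          minimumCodeSizeMM: Double = 20,
                          previewFillFraction: Double = 0.6) throws {
    let minimumFocusDistanceMM = Double(device.minimumFocusDistance)
    guard minimumFocusDistanceMM > 0 else { return }  // -1 means unknown

    // Scene width needed for the code to fill the chosen fraction of the preview.
    let filledCodeSizeMM = minimumCodeSizeMM / previewFillFraction
    let halfFOVRadians = Double(device.activeFormat.videoFieldOfView) / 2 * .pi / 180
    // Closest distance at which that scene width spans the horizontal field of view.
    let minimumSubjectDistanceMM = filledCodeSizeMM / (2 * tan(halfFOVRadians))

    if minimumSubjectDistanceMM < minimumFocusDistanceMM {
        let zoomFactor = minimumFocusDistanceMM / minimumSubjectDistanceMM
        try device.lockForConfiguration()
        device.videoZoomFactor = min(CGFloat(zoomFactor), device.maxAvailableVideoZoomFactor)
        device.unlockForConfiguration()
    }
}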

Nov 19, 2022 6:48 PM in response to JDM76

Apple went so far as to have a complete session on it at WWDC 2021; that’s pretty much the Apple version of buying billboards.


What’s new in camera capture - WWDC21 - Videos - Apple Developer


complete with sample code:


Apple Developer Documentation


I’m not trolling you, I am telling you actual facts; Apple can’t help it if developers ignore the changes they go out of their way to tell them about.


Apple’s apps can do it, of course, but so can apps from Amazon, Walmart and other third parties.


You can prove it to yourself by following the instructions I gave above.



