Here is a quick and easily reproducible test case I propose to everyone, including those who keep replying "my phone is fine" to literally every comment here, denying the reality that affects us:
- keep your phone steady (tripod, rest it on something, etc.; just make sure you can touch the screen without moving the phone). Point it at a non-moving scene with lots of detail, such as a newspaper or a colourful bed sheet with lots of drawings. A sitting person is also fine, as long as they can relax, keep their face/head still, and keep their eyes looking in the same direction. Let it autofocus on the same spot, or make sure you refocus on the same exact spot each time.
- connect the phone to a Mac with a cable
- reset all camera settings to defaults, so we exclude any difference in settings
- open camera app
- configure AssistiveTouch to take a screenshot (the iPhone screenshot-without-buttons trick), then take a screenshot of the camera viewfinder
- take a regular picture
- take a burst, then select and keep only one of the shots. I've had exposure issues on iPhones in the past with the first 1-2 frames of a burst, so pick one from the middle just in case.
- import a photo using the Mac
You will notice that:
- the viewfinder screenshot shows what the sensor sees: NO effect.
- the image taken by the Camera app SHOWS the Smart HDR / Deep Fusion oil-painting/watercolor/cartoon-face over-processing effect.
- the burst image does NOT show the effect.
- the image captured at the request of the Mac does NOT show the effect.
- the burst, the screenshot, and the Mac import all show a soft/blurry lens image (lacking fine detail), but a natural image overall compared to the one taken by the Camera app.
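To make the "lack of detail" comparison less subjective, you can score the same crop from each capture with a simple sharpness metric. Here is a minimal sketch (my own illustration, assuming Python with NumPy; the filenames in the usage comment are placeholders) using the variance of the Laplacian, a common focus/detail measure. Higher generally means more fine detail; smoothed "watercolor" regions score lower. Only compare identical crops of the same scene.

```python
import numpy as np

def laplacian_variance(gray):
    """Variance of the Laplacian of a 2-D grayscale array.

    Higher values generally mean more fine detail; heavily
    noise-reduced ("watercolor") areas score lower.
    """
    g = np.asarray(gray, dtype=float)
    h, w = g.shape
    # 3x3 Laplacian kernel applied via shifted slices (no SciPy needed)
    k = np.array([[0, 1, 0], [1, -4, 1], [0, 1, 0]], dtype=float)
    out = np.zeros((h - 2, w - 2))
    for dy in range(3):
        for dx in range(3):
            out += k[dy, dx] * g[dy:dy + h - 2, dx:dx + w - 2]
    return out.var()

# usage (hypothetical filenames; load with Pillow, convert to grayscale):
#   from PIL import Image
#   for name in ["viewfinder.png", "camera.png", "burst.png", "mac_import.png"]:
#       g = np.asarray(Image.open(name).convert("L"))
#       print(name, laplacian_variance(g))
```

Interpret the numbers with care: aggressive sharpening halos can inflate the score at edges, so a crop of a flat textured area (newspaper print) is the most telling.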
I am curious whether ANYONE reproducing my simple test case above obtains a Camera app image without the blamed effect!
My personal conclusion is that this is not a mistake but a forced choice: Apple *knew* the 13 models' front camera hardware produces lousy images (see the softness in **ALL** the other images: the viewfinder screenshot, the Mac import, the burst), observed that something was wrong (my guess: camera components malfunctioning due to supply-shortage issues in Vietnam, to be followed by a recall program, I wish/hope 🙏), and tried to correct the resulting pixels in software. This is the only logical explanation I can think of. I had my camera replaced at an Apple Authorised Service Provider for nothing: the replacement behaves identically.
Attaching my own results (100% lossless crops of the relevant areas; these are NOT digital zooms) as well as a link to the full original files.
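For anyone preparing similar attachments, here is a minimal sketch of how such 1:1 crops can be made without any resampling (my own illustration, assuming Python with Pillow; the function name, crop box, and filenames are hypothetical, and HEIC input would additionally need a plugin such as pillow-heif):

```python
from PIL import Image

def lossless_crop(src, dst, box):
    """Cut the same pixel box from a capture and save it as PNG.

    Image.crop copies pixels 1:1 (no scaling), and PNG encodes
    without loss, so the crop shows exactly the pixels the camera
    produced -- a true 100% crop, not a digital zoom.
    """
    Image.open(src).crop(box).save(dst, format="PNG")

# usage (hypothetical filenames; box is left, upper, right, lower in pixels):
#   lossless_crop("camera.png", "camera_crop.png", (900, 600, 1300, 1000))
```

Using the same `box` for every capture keeps the comparison fair, since all four crops then cover the same sensor region.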
viewfinder
import from iPhone on mac
burst
camera