But, in my opinion, there is something going on on the software side that is interesting enough to justify an attempt at quantifying it, in order to better understand it, maybe make better use of it, or even work around it when you'd rather avoid it. Since all digital cameras have to use some kind of software to deal with the inevitable noise, the quality of the image depends to a non-negligible degree on the quality and nature of that software. Different manufacturers use different algorithms, and Apple apparently uses a different algorithm, or several different algorithms, on the new camera.
Back to my issue, which I feel is shared by many: Looking at the grain in the images, I can't help but notice that there is obviously a different algorithm at work than in previous iPhones. When you zoom into an image, you notice a blob-like graininess that to my eye is less pleasing than the graininess produced by the software in previous iPhones. Interestingly enough, the algorithm seems to pick up on the predominant structure in the image: grass-like blobs where there are grass leaves, round blobs where there are round blossoms.
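For what it's worth, that structure-echoing grain is what patch-based denoisers tend to produce: they rebuild each pixel from similar patches nearby, so the leftover texture mimics whatever structure dominates locally. Here is a minimal sketch of the effect using non-local means from scikit-image - purely an illustration, not a claim about Apple's actual pipeline; the sample image, parameters, and library choice are all my own assumptions:

```python
# Illustration only: a patch-based denoiser (non-local means) applied
# aggressively tends to leave structure-shaped "blobs" instead of
# pixel-level grain. This is not Apple's code; it just mimics the effect.
import numpy as np
from skimage import data, img_as_float
from skimage.restoration import denoise_nl_means, estimate_sigma

img = img_as_float(data.camera())                        # sample grayscale image
noisy = np.clip(img + 0.08 * np.random.standard_normal(img.shape), 0, 1)

sigma_est = estimate_sigma(noisy)                        # rough noise estimate
denoised = denoise_nl_means(
    noisy,
    patch_size=7,        # size of the patches being compared
    patch_distance=9,    # search radius for similar patches
    h=0.8 * sigma_est,   # filtering strength; higher = smoother and blobbier
    fast_mode=True,
)
residual = noisy - denoised   # the removed "grain" echoes the local structure
```

If you display `denoised` and `residual` side by side, the smoothed-over areas take on the dominant local texture rather than showing uniform pixel noise, which is roughly what I see when I zoom into the iPhone 11 Pro shots.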
Also (it might be the same "problem", though), I've noticed that on parts of the image with fine detail, the algorithm seems to apply additional edge enhancement. The effect is almost psychedelic - it looks awesome on the iPhone screen, but you know it wasn't THAT sharp originally. An unaltered image that I sent to my daughter caused her to exclaim, "Wow! Is this photoshopped? So beautiful!" while I was writhing in pain over the exaggerated blossom structure in the image. The same image taken with my iPhone 6s was much less "dynamic", but also less "awe-inspiring". However, it was closer to the truth than the iPhone 11 Pro's. Which is enough for me to want to find out what's going on there.
The general behavior would hint at an edge-enhancement filter kicking in over a certain spatial-frequency range. Where the image has no detail at those spatial frequencies, the effect of that filter is a general softening with blob-like grain.
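To make that hypothesis concrete, here is a rough sketch of what such a band-limited sharpening step could look like: an unsharp mask confined to a difference-of-Gaussians band. The sigma values and gain are arbitrary assumptions of mine, and this is in no way Apple's implementation - it's just the kind of filter whose side effects would match what I'm seeing:

```python
# A hedged sketch of band-limited edge enhancement (unsharp masking confined
# to a difference-of-Gaussians band). Not Apple's algorithm - just the kind
# of filter that could explain the behavior described above.
import numpy as np
from scipy.ndimage import gaussian_filter

def band_limited_sharpen(img, sigma_fine=1.0, sigma_coarse=3.0, gain=1.5):
    """Amplify detail that lives between two Gaussian blur scales."""
    img = img.astype(np.float64)
    fine = gaussian_filter(img, sigma_fine)      # removes only the finest detail
    coarse = gaussian_filter(img, sigma_coarse)  # removes the whole band of interest
    band = fine - coarse                         # detail within that frequency band
    # Where real detail exists in the band, edges get crisper; where it doesn't,
    # the band is mostly noise, and boosting it yields soft, blobby texture.
    return np.clip(img + gain * band, 0, 255)
```

That dual behavior - crisper-than-real edges where the band has content, mushy blobs where it doesn't - is exactly the combination I'm describing.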
Also, if you zoom in, you notice that a detail that "seems" to be there actually isn't. This has been addressed in this thread as well. It feels like "see - not see", depending on whether you're zooming in or just looking at the image in the photo app.
Therefore a second question might be: Is there something going on in the photo app simply to enhance the image's appeal while it is on display there? I read something about all images being optimized for the full dynamic range of the iPhone's display.
In general I think it is a masterful process to make a potentially unfocused image appear focused on the iPhone screen (kudos to Apple) - but what if you don't want that assist? Is there a way around it? That might be where the ProCamera and Moment apps come in, to get the desired result (or non-result).