So I finally got the iPhone 11 Pro. Overall, I love this phone. And for the most part, the camera is awesome!
Most of the pics I've taken, using all 3 lenses, have turned out really good. And night mode is incredible. The video...wow! And that's just at 1080p. I'm sure the OLED screen plays a big part in that. But it still looks pretty amazing on my graphics monitor.
That being said, I do understand the original poster's concerns. There have been some pics, taken with the normal (wide) lens in very well-lit indoor lighting, where the images came out less than adequate. They didn't look photographic. The face of the subject looked like it had been run through a paint or drawing filter, though with a lot of definition (for a filtered photo). There was considerable noise, and lines weren't crisp. Some photos taken in a decently lit room actually looked a little better than the ones taken in a well-lit room. But pictures taken outside looked great! None of the issues I described above. I also took a pic of a co-worker at work (less light than the "filtered" photo), and it looks really good. You can see the stubble on his face. I will post some examples when I get home.
After using the iPhone 11 Pro for a couple of weeks now, I'm going to presume that it's not the camera itself, but rather the software used to generate the pics, i.e., the Apple Camera app. Sometimes it renders great, sometimes it doesn't. Quality is not consistent, even in the best conditions. My old iPhone X pics were really good, and they were consistent. But that was under iOS 12.x.x. When I updated the X to 13.x.x, I wasn't paying much mind to the pics I was taking, and I didn't take many before I got the 11 Pro. But I suspect that changes in 13.x.x (when it comes to the Camera app) messed up the rendering for some reason. I do hope that when Apple finally releases Deep Fusion, great pics will be the consistent result moving forward. Btw, anyone know when it's supposed to be released?