Hi jvrdlc,
The pixel density on a camera sensor and the way pixels are laid out on a device display are two completely different creatures. The writer of the article St. Clair linked was trying to make that point, though with a pretty flawed method: just down-resing an image in Photoshop. In the end, the quality of an image is determined by the quality of the sensor, not by resampling in software or a bigger MP number.

The processing the camera uses to build the image, the image format (e.g. RAW, JPEG), and other factors are key ingredients in the final quality of an image. A high-megapixel camera may record a lot of pixels, but it's also about how it records the information in those pixels, along with elements such as lens glass quality and signal processing. An image from a 20MP sensor can turn out worse than one from a 5MP sensor depending on the camera's settings, how the internal software processes the image, and many other variables.

Fortunately, Apple puts care into all those small pieces that have a major impact instead of just chasing a big MP number that looks good in marketing material. The new 1.5 micron pixels on the sensor are going to deliver crazy detail, and the f/2.2 aperture will push the quality that much further. Pro photographers are using the iPhone 5 camera to capture images that run in full-size magazine spreads as well as very large promotional posters, so don't let a 'lower-sounding' MP number make you think you're sacrificing image quality.
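If it helps to see the pixel-size math, here's a rough back-of-the-envelope sketch in Python. The ~1/3-inch sensor width (about 4.8 mm) and 4:3 aspect ratio are my own illustrative assumptions for a typical phone sensor, not official Apple specs:

```python
# Rough pixel-pitch math: same physical sensor, different megapixel counts.
# Assumed numbers (illustrative, not Apple specs): ~1/3" sensor, 4.8 mm wide,
# 4:3 aspect ratio, square pixels.

def pixel_pitch_um(sensor_width_mm: float, megapixels: float, aspect=(4, 3)) -> float:
    """Approximate pixel pitch in microns for a square-pixel sensor."""
    ax, ay = aspect
    # width_px * height_px = MP * 1e6 and width_px / height_px = ax / ay,
    # so width_px = sqrt(MP * 1e6 * ax / ay).
    width_px = (megapixels * 1e6 * ax / ay) ** 0.5
    return sensor_width_mm * 1000 / width_px

for mp in (5, 8, 20):
    pitch = pixel_pitch_um(4.8, mp)
    # Light gathered per pixel scales roughly with area, i.e. pitch squared.
    print(f"{mp:>2} MP -> pitch ~{pitch:.2f} um, pixel area ~{pitch**2:.2f} um^2")
```

The 8MP row comes out right around that 1.5 micron figure, while cramming 20MP onto the same size sensor shrinks the pitch to roughly 0.9 microns, which means each pixel catches well under half the light. That's exactly where noise and mushy low-light shots come from, and it's why a bigger MP number on a small sensor can actually hurt.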