Hi Jim,
Many thanks for taking the time to respond to my question, but I regard your English lesson as a filibuster.
If you wish to consult the Oxford English Dictionary, it will provide you with the technical interpretations of the words Precision and Accuracy in addition to their common usage.
And, I quote, Precision:
1.2 • technical Refinement in a measurement, calculation, or specification, especially as represented by the number of digits given: "a technique which examines and identifies each character with the highest level of precision"; [count noun] "a precision of six decimal figures".
And, I quote, Accuracy:
1.1 • technical The degree to which the result of a measurement, calculation, or specification conforms to the correct value or a standard: "the accuracy of radiocarbon dating"; [count noun] "accuracies of 50-70 per cent".
Let us take the standard meter rule as an example. Let's assume it is subdivided into one thousand equal graduations called millimeters. We cannot make a measurement more precise than one millimeter with this rule. However, the accuracy of the measurement is a different matter, as the rule may indicate different lengths for the same graduations at different temperatures and pressures, and depending on other physical properties of the rule material and its ambient surroundings.
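To put the same distinction into a worked example (a rough sketch with made-up numbers, not a claim about any real instrument): the graduation of the rule sets the precision of a reading, while any systematic error in the rule sets its accuracy.

```python
# Rough illustration with invented numbers: the graduation of the rule
# limits precision; a systematic error limits accuracy.

true_length_mm = 1234.5678   # the "correct value or standard"
graduation_mm = 1.0          # smallest division on the rule, i.e. the precision

# A reading can only be reported to the nearest graduation.
reading_mm = round(true_length_mm / graduation_mm) * graduation_mm

# Suppose the rule also carries a systematic error of 0.1 per cent
# (thermal expansion, say); that governs accuracy, not precision.
indicated_mm = reading_mm * 1.001

print(f"precision (graduation)  : {graduation_mm} mm")
print(f"reading on the rule     : {reading_mm} mm")
print(f"error vs the true value : {indicated_mm - true_length_mm:+.4f} mm")
```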
In your response, you refer to a presumption that the GPS hardware always returns the same number of digits. I would love to debate the difference in English between an assumption and a presumption too, but that would be more filibuster! So I will ask you: upon what do you base that belief? Regardless of the hardware, the readings presented to users pass through many levels of software that may, or may not, affect the precision presented to users (i.e. the number of digits).
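To illustrate what I mean by software affecting the precision presented (a minimal sketch of my own, not Apple's or anyone else's actual code): the same underlying value can be handed to the user with any number of digits, purely as a formatting decision made in some layer of software.

```python
# One underlying coordinate value, presented by hypothetical software
# layers that each format it differently. The value never changes;
# only the number of digits shown to the user does.
latitude = 51.5007292195313   # an arbitrary example value

for decimals in (1, 4, 6, 13):
    print(f"formatted to {decimals:2d} decimal places: {latitude:.{decimals}f}")
```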
For example, if I use a common internet mapping application, I find that the precision of the Long/Lat readings displayed depends on the zoom level. This makes sense, as the individual pixels on the map can only resolve to a certain precision depending on the map scale. So, although my iPhone hardware may be telling me my location to N decimal places, the application only displays it to M decimal places depending on the map scale that I am viewing.
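As a back-of-the-envelope check (a sketch using the published Web Mercator ground-resolution figure, not the internals of any particular mapping application), one can estimate how many decimal places of a degree a single map pixel can actually resolve at a given zoom level:

```python
import math

def useful_decimal_places(zoom, latitude_deg=0.0):
    """Roughly how many decimal places of a degree one Web Mercator
    pixel (256-pixel tiles) can resolve at a given zoom level. A sketch,
    not the logic of any real mapping application."""
    # Standard Web Mercator ground resolution at this latitude and zoom.
    metres_per_pixel = 156543.03392 * math.cos(math.radians(latitude_deg)) / 2 ** zoom
    # One degree of latitude is roughly 111,320 metres.
    degrees_per_pixel = metres_per_pixel / 111_320
    # Decimal places needed before the last digit is finer than a pixel.
    return max(0, math.ceil(-math.log10(degrees_per_pixel)))

for zoom in (3, 10, 15, 20):
    print(f"zoom {zoom:2d}: about {useful_decimal_places(zoom)} useful decimal places")
```

Which matches what I observe: the coarser the zoom, the fewer digits the application bothers to show.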
My real-world problem is that I have a collection of photographs taken over a wide geographic area, and I am attempting to position those photographs in a geographic information system. The longitude and latitude values in different photographs have different levels of precision, ranging from one decimal place to thirteen decimal places. I seek to know why the precision of these readings varies. I read the iPhone documentation and queried it. The issue may have nothing to do with the iPhone hardware and may (more likely) be introduced by software that processes the information.
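For what it is worth, my current working theory (sketched below with made-up numbers; I have not confirmed this is what the iPhone or any particular application actually does) is that it comes down to how each piece of software writes and converts the EXIF GPS values. EXIF stores latitude and longitude as three rational numbers (degrees, minutes and seconds, each a numerator/denominator pair), so the count of decimal digits that survives into a decimal-degree string depends on the denominators the writing software chose and on any rounding applied downstream.

```python
from fractions import Fraction

def to_decimal_degrees(dms, ref="N"):
    """Convert EXIF-style (degrees, minutes, seconds) rationals, each
    given as a (numerator, denominator) pair, into decimal degrees."""
    deg, minutes, seconds = (Fraction(n, d) for n, d in dms)
    value = float(deg + minutes / 60 + seconds / 3600)
    return -value if ref in ("S", "W") else value

# Hypothetical GPSLatitude entries as two different pieces of software
# might have written them (made-up values, not real photo metadata).
photo_a = [(51, 1), (30, 1), (0, 1)]           # coarse: exactly 51 deg 30 min
photo_b = [(51, 1), (30, 1), (123456, 10000)]  # fine-grained seconds value

for name, dms in (("A", photo_a), ("B", photo_b)):
    print(f"photo {name}: {to_decimal_degrees(dms)}")

# Photo A comes out as 51.5 (one decimal place); photo B comes out with
# a long string of digits, even though both passed through exactly the
# same conversion.
```

If that theory holds, the varying digit counts tell me more about the software that wrote and converted the metadata than about the quality of the GPS fix itself, which is really what I am trying to establish.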
Any insight you might have would be gratefully received.