Hi, Ian
Making computer chips is an expensive proposition.
They are not made one by one; instead they are (for lack of a better word) "printed" onto silicon "wafers". Many chips are printed on a single wafer, and each one then has to be tested.
The circuits on these chips are so tiny that we talk about nanometers (1 nanometer is one millionth of a millimeter, or 10^(-9) meters). You may have heard that Apple's M1 is built on a 5-nanometer process.
Now this means that some features in these circuits are just a few dozen atoms wide. It is mind-blowing if you stop and consider that (at least it is for me).
Not all of the chips pass the required testing with flying colors.
Take, for example, an Intel CPU. Maybe a given chip only runs reliably at a lower clock speed, or with hyperthreading disabled.
Instead of being thrown out (an expensive waste), it may be sold as an i3 or i5 instead of an i7.
Something like that may happen with Apple chips as well - and indeed I'd expect this to be even more important, since we are not dealing with just a CPU but a whole SoC (system on a chip): CPU, GPU and more share a single die, with the RAM mounted right next to it in the same package.
So maybe some of these chips do not pass the stringent testing if all the GPU cores are enabled, but do pass with 7 instead of 8.
Rather than throw them out and eat the full cost, it makes sense to sell them for a little less. They are still perfectly usable, and still much faster than just about anything else at even a remotely comparable price.
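The idea above can be sketched in a few lines of code. This is just an illustration: the defect chance, core counts, and tier names below are made-up numbers, not anything Apple or Intel has published.

```python
import random

def test_gpu_cores(total_cores=8, defect_chance=0.05):
    """Simulate testing a chip: return how many GPU cores pass.

    defect_chance is a hypothetical per-core failure rate."""
    return sum(1 for _ in range(total_cores) if random.random() > defect_chance)

def bin_chip(working_cores):
    """Assign a tested chip to a sales tier ('binning')."""
    if working_cores == 8:
        return "8-core SKU"   # fully working: sold at full price
    elif working_cores == 7:
        return "7-core SKU"   # one bad core disabled, sold a bit cheaper
    else:
        return "scrap"        # too many defects to sell at all

random.seed(1)  # make the illustration repeatable
chips = [bin_chip(test_gpu_cores()) for _ in range(1000)]
for tier in ("8-core SKU", "7-core SKU", "scrap"):
    print(tier, chips.count(tier))
```

Running this shows why binning pays off: most "defective" wafers still yield chips that land in the 7-core bin rather than the scrap pile.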