This was definitely a very interesting thread. I could see myself switching sides a couple of times over the first two pages :-)
People on this thread are far more educated than I am.
That said, while everyone here is correct, I feel a small addition could make sense. Apologies in advance if I am wrong.
For a battery to charge, you need a charging voltage higher than the battery's own. Example: a car's alternator puts out about 14.5 V to charge a 12 V battery. However, as you switch on your headlights, stereo, etc., the alternator's output sags to about 13 V.
How is this even relevant?
The rating given on a USB power source usually lists the voltage, current, AC/DC, and polarity.
E.g.: 5 V, 2.1 A DC
What we can deduce from this is that the source will put out a steady 5 V as long as the total closed-loop current drawn by the connected device does not exceed 2.1 A. In practice, though, because the source is not perfectly regulated, its output 'could' (my assumption) sit around 5.2 V when the closed-loop current is 0.5 A, versus about 4.8 V at the full 2.1 A rating.
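If it helps, here is a rough Python sketch of that guess. The 5.2 V at 0.5 A and 4.8 V at 2.1 A points are just my assumptions from above, not measurements, and the straight-line droop between them is purely illustrative:

# Assumed points: 5.2 V at 0.5 A and 4.8 V at the rated 2.1 A,
# with a straight-line droop in between (not a real charger model).
V_AT_LIGHT_LOAD = 5.2   # volts at 0.5 A (my guess from above)
V_AT_FULL_LOAD = 4.8    # volts at the full 2.1 A rating (also a guess)
I_LIGHT = 0.5           # amps
I_RATED = 2.1           # amps

def output_voltage(load_current):
    """Linear interpolation of output voltage vs. load current."""
    slope = (V_AT_FULL_LOAD - V_AT_LIGHT_LOAD) / (I_RATED - I_LIGHT)
    return V_AT_LIGHT_LOAD + slope * (load_current - I_LIGHT)

print(output_voltage(0.5))   # 5.2 V at a light load
print(output_voltage(1.0))   # ~5.08 V at a typical phone draw
print(output_voltage(2.1))   # 4.8 V at the full rating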
Now let's not worry about the current for a bit. I think supplying a specific voltage is what actually charges a device.
Now, when you connect an iPad to that 2.1 A charger, the voltage actually delivered might be around 4.9 V, while with a phone it could be a little higher, around 5.2 V.
If you were to connect a phone to a 1 A charger instead, the delivered voltage would sit around 4.9 V. But if you connect an iPad to that same charger, the voltage could fall much further, say down to 3 V (just an example), causing the device not to charge.
I think this is why a phone charges slowly or quickly depending on the charger. That being said, I would rather use a charger that just puts out the right voltage for my application (load dependent).
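Continuing that rough sketch, this is how I picture the charge/no-charge outcome. The 4.5 V minimum, the 1 A phone draw, and the 2.1 A iPad draw are made-up numbers, just like the 3 V collapse above:

# Crude model of an overloaded charger: hold roughly 4.9 V while the
# device draws within the rating, collapse when it asks for more.
MIN_CHARGING_VOLTAGE = 4.5   # assumed threshold below which the device won't charge

def delivered_voltage(charger_rating_a, device_draw_a):
    """Return a guessed output voltage for a given charger rating and device draw."""
    return 4.9 if device_draw_a <= charger_rating_a else 3.0  # the "3 V, just an example" case

for device, draw in [("phone", 1.0), ("iPad", 2.1)]:
    v = delivered_voltage(1.0, draw)   # the 1 A charger from above
    status = "charges" if v >= MIN_CHARGING_VOLTAGE else "does not charge"
    print(f"{device} on a 1 A charger: {v} V -> {status}")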
Hope I did not make a fool of myself.