342 Replies. Latest reply: Sep 10, 2013 3:00 PM by ronhenderson2
• Level 1

So, it sounds like if one were to use the 2 A charger on the iPhone while watching a Netflix movie, there would be a benefit: the higher-capacity charger could provide enough current to power the movie and charge the battery at the same time.

• Level 1

Correct.  More power means you can do more while still charging!
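A rough way to see this is as a current budget. The numbers below are illustrative assumptions, not measured iPhone figures; the point is just that a higher-rated brick leaves more headroom for the battery after the device's own draw:

```python
# Illustrative current-budget sketch (all numbers are assumptions,
# not measured iPhone figures): a charger rated for more current
# can run the device and still have headroom to charge the battery.

def charge_headroom_ma(charger_limit_ma, device_draw_ma):
    """Current left over for the battery after the device's own draw."""
    return max(charger_limit_ma - device_draw_ma, 0)

# Hypothetical draw while streaming video: 700 mA
print(charge_headroom_ma(1000, 700))   # 1 A brick: 300 mA left for the battery
print(charge_headroom_ma(2100, 700))   # 2.1 A brick: 1400 mA left for the battery
```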

I just want to say this has been a very entertaining thread to read and I appreciate all the expert advice provided. I now know I can purchase a higher amperage USB charger (brick) and not damage my iPhone, short of any defective components affecting me.

Lastly, while I do think Apple generally makes quality products, you guys need to remember they do make mistakes, occasionally in hardware. Does anyone remember the iPhone 4 antenna problem? They are not infallible, so keep that in mind. Thanks, all.

• Level 1

This was definitely a very interesting thread. I could see myself switching sides a couple of times on the first two pages :-)

People on this thread are far more educated than I am.

That said, everyone here is correct, but I feel a small addition could make sense. Apologies if I am wrong.

For a battery to charge you need a voltage. Example: a car's alternator puts out about 14.5 V to charge a 12 V battery. However, as you power on your headlights, stereo, etc., the alternator's voltage goes down to about 13 V.

How is this even relevant?

The rating given on a USB power source usually lists the voltage, current, AC/DC, and polarity.

E.g.: 5 V, 2.1 A DC

What we can deduce from this is that the source will put out a steady 5 V as long as the total closed-loop current with my device connected does not exceed 2.1 A. But, due to the slightly non-linear behavior of the source, the source voltage could (my assumption) be 5.2 V when the closed-loop current is 0.5 A, versus 4.8 V at the full rating of 2.1 A.
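That kind of sag can be approximated with a simple internal-resistance model. The open-circuit voltage and resistance below are hypothetical values chosen to roughly match the 5.2 V / 4.8 V figures in the post, not specs of any real charger:

```python
# Simple source model: open-circuit voltage minus a drop across an
# assumed internal resistance (V = Voc - I*R). Values are illustrative.

V_OPEN_CIRCUIT = 5.2   # volts, hypothetical no-load output
R_INTERNAL = 0.19      # ohms, hypothetical source resistance

def output_voltage(load_current_a):
    """Terminal voltage of the source under a given load."""
    return V_OPEN_CIRCUIT - load_current_a * R_INTERNAL

print(round(output_voltage(0.5), 2))  # light load: close to 5.1 V
print(round(output_voltage(2.1), 2))  # full rated load: close to 4.8 V
```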

Now let's not worry about the current for a bit. I think supplying a specific voltage is what charges a device.

Now, when you connect an iPad, the voltage being put through might be around 4.9 V. But in the case of a phone it could be a little higher, around 5.2 V.

If you were to connect a phone to the 1 A charger, the voltage going through would be around 4.9 V. But if you connect an iPad to the same charger, the voltage will fall further, down to 3 V (just an example), causing the device not to charge.

I think this is the reason the phone charges slowly or quickly depending on the charger. That being said, I would rather use a charger that just puts out the right voltage for my application (load-dependent).

Hope I did not make a fool of myself.

• Level 8
Mac OS X

The USB spec requires that the voltage be 5 V ± 0.5 V, i.e., 4.5 to 5.5 volts, so it will never fall to 3 V. It is impossible to draw too much from the power source, because the source is "smart" and will limit the current to what it can supply while keeping the voltage in the spec'd range. The charger circuit in the phone also works over that voltage range. You can't ignore current, because it is current that charges the battery, not voltage. The battery in an iPhone 5 is 1440 mAh at 3.8 V. In general, as long as the power source voltage is higher than the battery voltage, the actual voltage is not relevant to charging time.
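The point that current, not source voltage, sets charging time can be sketched numerically. The 1440 mAh capacity is from the post above; the charge currents are hypothetical, and the model ignores taper and losses:

```python
# Back-of-envelope charge-time estimate. The 1440 mAh figure comes
# from the post above; charge currents are assumed, and the model
# idealizes charging as a constant current into an empty battery.

BATTERY_CAPACITY_MAH = 1440  # iPhone 5 battery, per the post above

def hours_to_charge(charge_current_ma):
    """Idealized time to fill the battery at a constant current."""
    return BATTERY_CAPACITY_MAH / charge_current_ma

print(hours_to_charge(1000))   # 1 A of charge current -> 1.44 h
print(hours_to_charge(500))    # 0.5 A of charge current -> 2.88 h
```

Note that the USB voltage (4.5 V vs 5.5 V) does not appear anywhere in the estimate; only the current the phone's charger circuit draws does.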

The car analogy is flawed, because in a car the car's voltage regulator is the charger. For iDevices, laptops, Androids, etc., the power source is just a power source, and the charger is actually inside the device being charged. Also, the alternator voltage does not drop under load; it is always 14.5 V, but the resistance of the wiring under load reduces the voltage at the point of measurement, with the actual measured voltage determined by where you measure it. If you measured right at the alternator it would still be 14.5 V. But this is also irrelevant for iDevices, because the resistance of the wires is insignificant at the currents being supplied. Car headlights, for example, draw about 10 amps, and the starter 300 amps.
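The wiring-drop point is just Ohm's law. The wire resistance below is a hypothetical round number for a car harness, not a measurement; the alternator voltage and load currents are from the post above:

```python
# Voltage measured at the far end of a wire drops under load even
# though the source (alternator) stays at 14.5 V. R_WIRE is an
# assumed harness resistance, chosen only for illustration.

ALTERNATOR_V = 14.5
R_WIRE = 0.01  # ohms, hypothetical wiring resistance

def voltage_at_load(current_a):
    """Ohm's law: the wire drops I*R volts before the load."""
    return ALTERNATOR_V - current_a * R_WIRE

print(round(voltage_at_load(10), 2))    # headlights (~10 A): small drop
print(round(voltage_at_load(300), 2))   # starter (~300 A): large drop
```

At USB currents of an amp or two, the same I*R drop across short, thick wires is negligible, which is the poster's point.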

• Level 1

Thank you, Finch!

So I see that you do agree that the USB spec is 4.5 to 5.5 volts. And you also said that the phone would take the same charging time irrespective of whether the output voltage is at 4.5 V or 5.5 V.

" The battery in an iPhone 5 is 1440 MaH at 3.8 v. In general, as long as the power source voltage is higher than the battery voltage the actual voltage is not relevant to charging time."

Does this mean that I can use a 12 V DC supply to charge my phone?

• Level 8
Mac OS X

As long as its output meets the USB spec of 4.5 to 5.5 volts. If it doesn't, the phone will refuse to connect.

• Level 1

1. Do you agree that the phone will charge sooner at 5.5 V than at 4.5 V?

If you do not agree with 1:

2. Do you agree that if someone tweaked the hardware and applied 12 V DC to the battery to charge it, the battery might 'explode'?

• Level 8
Mac OS X

aju2032 wrote:

1. Do you agree that the phone will charge sooner at 5.5 V than at 4.5 V?

No. The voltage is not as important as the current, which is regulated by the charging circuit.

If you do not agree with 1:

2. Do you agree that if someone tweaked the hardware and applied 12 V DC to the battery to charge it, the battery might 'explode'?

It will probably fry the charger chip rather than the battery, because the charger chip is designed for a nominal 5 volts. If you're lucky, 12 volts may not be enough to damage anything, but you will get a message saying the connected device is not compatible, and it won't charge.

• Level 6
iPhone

1. No!  The phone has a charging circuit that controls charging rate and voltage applied to the battery, regardless of the charger voltage, as long as it's within USB specs. This has been explained repeatedly in this thread.  It helps to read the thread you're posting in.

2.  No!  It would fry the charging circuit chip (and possibly others), killing all charging capability. The battery itself probably would not see any voltage/current at all.

• Level 1

>>1. No!  The phone has a charging circuit that controls charging rate and voltage applied to the battery, regardless of the charger voltage, as long as it's within USB specs. This has been explained repeatedly in this thread.  It helps to read the thread you're posting in.

The point is the spec. The spec says the source voltage (from the wall adapter) can be between 4.5 and 5.5 V. Now please don't tell me that the charger circuit makes sure the applied voltage is fixed at, say, 5 V as long as the input is within the spec.

The fact is, the charger circuit is functional from 4.5 V through 5.5 V. It puts out 4.5 V if that is all that is available; it does not magically increase it to 5 V (wouldn't that be awesome?).

>>2.  No!  It would fry the charging circuit chip (and possibly others), killing all charging capability. The battery itself probably would not see any voltage/current at all.

Forget about the charging circuit for a minute. This is a theoretical question (I am not asking you to perform it): if I pull the battery out of the phone and apply 12 V to it, it will most likely kill the battery.

One of my friends said: if a wire is a pipe, voltage is the water and current is the pressure that pushes the water through the pipe/wire.

Sorry again; I do not want to argue here, but I am trying to learn something and get my understanding right. "Everyone is a noob at some point."

I feel the charger circuit passes the source voltage (within the spec) through as-is (it is not a stabilizer). Now, if 4.5 V is seen at the battery, I would think it would charge more slowly than if 5.0 V were seen at the battery.

This is a discussion, not a fight. There is no winning or losing; at the end of the day everyone learns new stuff (in this case it could be me).

• Level 1

> No. The voltage is not as important as the current, which is regulated by the charging circuit.

Could you tell me what changes between trickle charge and fast charge?

• Level 8
Mac OS X

The phone's built-in charger limits the current to the battery to a lower value once the battery has reached about 80% charge, to avoid overcharging. It then shuts off completely when the battery reaches full charge.
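A minimal sketch of that behavior, with entirely illustrative thresholds and currents (Apple does not publish the actual charge controller parameters):

```python
def controlled_current_ma(state_of_charge):
    """Crude model of the phone's charge controller: full current up
    to ~80% charge, a reduced taper current after that, and zero at
    100%. All thresholds and currents are illustrative assumptions."""
    if state_of_charge >= 1.0:
        return 0          # full: charger shuts off completely
    if state_of_charge >= 0.8:
        return 300        # taper/trickle phase: reduced current
    return 1000           # fast-charge phase: full current

print(controlled_current_ma(0.5))   # 1000
print(controlled_current_ma(0.9))   # 300
print(controlled_current_ma(1.0))   # 0
```

This is why the last 20% of a charge takes disproportionately long: the controller deliberately slows down, regardless of which brick is plugged in.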

• Level 1

aju2032 wrote:

I feel the charger circuit passes the source voltage (within the spec) through as-is (it is not a stabilizer). Now, if 4.5 V is seen at the battery, I would think it would charge more slowly than if 5.0 V were seen at the battery.

With all due respect, your assumption here is not true. In fact, the charger circuit does act like a stabilizer. It regulates both voltage and current from the power supply.
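A toy model of that regulation, to make the "stabilizer" point concrete. The target voltage and current limit are assumptions (real chargers program these per battery chemistry and state), but the shape of the behavior matches what was described above:

```python
def regulated_output(v_in, i_requested_ma, v_target=4.2, i_limit_ma=1000):
    """Toy charger-circuit model: it accepts any in-spec input voltage
    but always presents the battery a controlled voltage and a current
    capped at a programmed limit. v_target and i_limit_ma are assumed
    illustrative values, not actual iPhone parameters."""
    if not 4.5 <= v_in <= 5.5:
        return None  # outside USB spec: refuse to charge at all
    return (v_target, min(i_requested_ma, i_limit_ma))

print(regulated_output(4.5, 1500))   # same battery-side output...
print(regulated_output(5.5, 1500))   # ...whether the brick sags or not
print(regulated_output(12.0, 500))   # None: out-of-spec input rejected
```

The battery-side voltage and current are identical at 4.5 V and 5.5 V input, which is why the input voltage (within spec) does not change the charging time.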

• Level 1

Thank you!

• Level 8
Mac OS X

Battery management technology has become highly sophisticated, so much so that it's hard to grasp how advanced it is if all of our experience is with charging car batteries or small batteries for flashlights and portable radios.

As an example, I have a 10-year-old Prius with 95,000 miles, still on its original battery, which still has its original capacity, and I know other owners with over 200,000 miles on the original battery. The battery is nothing special: a 300 V nickel-metal-hydride cluster of cells similar to the "D" cells you could buy at Radio Shack (the original Prius in Japan actually DID use D cells). What makes it last seemingly forever is the charging management computer.