

Can I use an iPad charger to charge an iPhone?

Hi, I have both an iPad and an iPhone 4. Can I use the iPad charger for the iPhone, and vice versa?

By the way, why is my iPad not charging when connected to my computer via USB?

Why doesn't the iPhone show the battery percentage in the status bar?

Sorry, I am very new to these two gadgets.

Message was edited by: emfung

iPhone 4 and iPad, iOS 4

Posted on Sep 14, 2010 3:05 AM

342 replies

Sep 8, 2013 9:55 PM in response to RF9

RF9 wrote:


It may be able to deliver 1.79A depending on how you test it, but I could not get more than 1.05 amps from it when charging an iPad. That is, the iPhone and iPad will not pull more than 1.05A from the 1A charger regardless of its electrical capability.


That's because the iPad/iPhone detects that the charger is rated for 1A and limits its current draw to 1A. The iPhone charger (1A) puts ~2.7V on the D- pin and ~2.0 on the D+ pin. The iPad charger is the reverse, with ~2.0V on the D- and ~2.7V on the D+.
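For illustration, here is a minimal Python sketch of how a device might map those divider voltages to a current budget. The ~2.0V/~2.7V figures come from this post; the 0.3V tolerance window and the 500mA signature are assumptions, not something stated here.

```python
# Illustrative sketch only: classify an Apple-style dedicated charger from the
# resistor-divider voltages it places on the USB data lines. The ~2.0 V / ~2.7 V
# figures are from the post above; the tolerance window is an assumption.

def classify_charger(d_plus_v: float, d_minus_v: float) -> float:
    """Return the advertised current budget in amps, or 0.0 if unrecognized."""
    def near(value, target, tol=0.3):
        return abs(value - target) <= tol

    if near(d_plus_v, 2.0) and near(d_minus_v, 2.7):
        return 1.0   # 5 W iPhone-style adapter: draw no more than ~1 A
    if near(d_plus_v, 2.7) and near(d_minus_v, 2.0):
        return 2.1   # 10 W iPad-style adapter: up to ~2.1 A allowed
    if near(d_plus_v, 2.0) and near(d_minus_v, 2.0):
        return 0.5   # generic 500 mA signature (assumption, for completeness)
    return 0.0       # unknown: fall back to default USB behavior

print(classify_charger(2.0, 2.7))  # -> 1.0 (iPhone adapter)
print(classify_charger(2.7, 2.0))  # -> 2.1 (iPad adapter)
```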


I suspect that the very small differences you are seeing are attributable to the limited performance of the USB power meter. It's not exactly a high-end test and measurement instrument.

Sep 9, 2013 11:13 AM in response to F. Maxwell

F. Maxwell wrote:


That's because the iPad/iPhone detects that the charger is rated for 1A and limits its current draw to 1A. The iPhone charger (1A) puts ~2.7V on the D- pin and ~2.0 on the D+ pin. The iPad charger is the reverse, with ~2.0V on the D- and ~2.7V on the D+.


I suspect that the very small differences you are seeing are attributable to the limited performance of the USB power meter. It's not exactly a high-end test and measurement instrument.


I ran the exact same tests by cutting the 5V line and running it through an ammeter (digital multimeter) and got nearly identical results. That's not to say my relatively inexpensive digital multimeter doesn't also have the same limitations as this little USB test meter. Impedance or current draw of the ammeter itself could be enough to affect the test. I no longer have access to more sophisticated equipment (I used to in my former job). But it's entirely possible that it's the test method.

It just seemed very logical that the iPhone will limit draw to 1 amp even when fully powered, rather than greatly exceed it when connected to a 2 amp line. I just don't believe that the phone will draw more power to cover operating current in order to preserve full current for charging. I think it pulls some (100mA, give or take) and the rest comes at the expense of charging the battery. Especially since I've never seen any testing or evidence to the contrary. In other words, using a 2 amp charger won't allow for slightly faster charging over a 1 amp charger with the phone powered up to the maximum.

Just from my own observations: I run Waze, podcasts streaming over Bluetooth, screen on (everything on, total power suck), and I don't observe any faster charging on my 3.1 amp iPad car charger vs. a 1 amp. Charging is very, very fast in both cases. Then again, these are not genuine Apple charging adapters, so it's possible the iPhone isn't treating the 3.1 amp as anything other than a 1 amp. But my "tests" show identical behavior as with a 10W iPad 2 charging adapter (using my portable USB power meter).

So I'll agree that this is inconclusive, but again I've never seen or heard any test results that refute this.


I totally agree (that's because it detects that it's a 1 amp charger). Just for the sake of the topic, I was trying to point out that regardless of the charging adapter's capability, you won't get more than 1 amp out of it with an iOS device, so for all intents and purposes, isn't it irrelevant? Maybe I missed another point that was being made.


There is also a secondary reading that will limit current draw from a USB charging adapter. This isn't relevant to the topic except to add to what you said about detecting that it's a 1 amp charger. The phone has another way to deal with detecting the limit.

This site describes it better than I can:

http://voltaicsystems.com/blog/choosing-usb-pin-voltages-for-iphones-and-ipads/

Note: A general observation with the Apple products is that they will attempt to draw the maximum current if and only if the voltage level on Pin 1 with respect to the current draw remains constant between 5 and 5.25 volts. Often, if a charger is unable to supply the proper amount of current the voltage output will drop. For instance, a 5.25V 1A power supply may only output to 4.5V when under a 2A load. For Apple products, when a device is presented with any one of the voltage configurations (500mA, 1A, or 2A) and then presented with a range of Pin 1 voltage levels from 4.5V to 5.25V, the actual current drawn by the device varies. In our tests, we found that when the voltage sent to the iPad 2 was 4.5V, the iPad 2 drew only about 1A, but steadily scaled up the current to 2A as the voltage was incrementally increased to 5V. Additionally, we noted Apple devices will not accept charge from power supplies with Pin 1 voltages of less than 4.5V or greater than 5.5V.
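As a rough model of the behavior described in that note: the 4.5-5.5V window and the roughly 1A-at-4.5V up to 2A-at-5V scaling are taken from the quote, while the exactly linear ramp is an assumption for illustration.

```python
# Rough model of the behavior described in the Voltaic note above.
# Assumptions: current scales linearly between 4.5 V and 5.0 V on VBUS (pin 1),
# and no charging occurs outside roughly 4.5-5.5 V.

def allowed_current(vbus: float, advertised_amps: float = 2.0) -> float:
    """Approximate current (A) an iPad-class device will draw at a given VBUS."""
    if vbus < 4.5 or vbus > 5.5:
        return 0.0                      # out of range: no charging at all
    if vbus >= 5.0:
        return advertised_amps          # healthy supply: draw the full budget
    # Between 4.5 V and 5.0 V, back off roughly linearly (~1 A at 4.5 V up to 2 A at 5.0 V)
    fraction = (vbus - 4.5) / 0.5
    return 1.0 + (advertised_amps - 1.0) * fraction

for v in (4.4, 4.5, 4.75, 5.0, 5.25, 5.6):
    print(f"{v:.2f} V -> {allowed_current(v):.2f} A")
```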

Sep 9, 2013 12:57 PM in response to RF9

RF9 wrote:


I totally agree (that's because it detects that it's a 1 amp charger). Just for the sake of the topic, I was trying to point out that regardless of the charging adapter's capability, you won't get more than 1 amp out of it with an iOS device, so for all intents and purposes, isn't it irrelevant? Maybe I missed another point that was being made.



I just wanted to make sure that everyone understood that the current limit was set by the iPhone and was not due to the iPhone power adapter being pushed to its limits.


RF9 wrote:


There is also a secondary reading that will limit current draw from a USB charging adapter. This isn't relevant to the topic except to add to what you said about detecting that it's a 1 amp charger. The phone has another way to deal with detecting the limit.

This site describes it better than I can:

http://voltaicsystems.com/blog/choosing-usb-pin-voltages-for-iphones-and-ipads/

Note: A general observation with the Apple products is that they will attempt to draw the maximum current if and only if the voltage level on Pin 1 with respect to the current draw remains constant between 5 and 5.25 volts. Often, if a charger is unable to supply the proper amount of current the voltage output will drop. For instance, a 5.25V 1A power supply may only output to 4.5V when under a 2A load. For Apple products, when a device is presented with any one of the voltage configurations (500mA, 1A, or 2A) and then presented with a range of Pin 1 voltage levels from 4.5V to 5.25V, the actual current drawn by the device varies. In our tests, we found that when the voltage sent to the iPad 2 was 4.5V, the iPad 2 drew only about 1A, but steadily scaled up the current to 2A as the voltage was incrementally increased to 5V. Additionally, we noted Apple devices will not accept charge from power supplies with Pin 1 voltages of less than 4.5V or greater than 5.5V.


That's a way of detecting whether the power adapter "lied" about its capabilities or was being shared (e.g., a two-port charger capable of 2.0A that was already charging another iPhone). The voltage sag is how the iPhone sees that the charger is being pushed beyond its capabilities.

Sep 10, 2013 11:49 AM in response to emfung

I've been reading SOME of the replies to what Merlin1128 said. He is accurate to a degree. And so are many other comments. And this reply of mine may have been made somewhere in this discussion; I don't have time to go through hundreds of comments to check and see.


Please don't take my capitalization as SCREAMING. It is ONLY for emphasis on certain points, just like you would place emphasis on certain parts of a sentence in normal conversation (not screaming to do it).


HERE IS SOMETHING TO ADDRESS. The devices do have internal regulators. And the voltage of pin 5 is a part of that regulation system. Apple says the iPad charger is compatible with . . . (a multitude of devices), and that is true.


HOWEVER, when you go to buy a charger for a 12 volt battery, WHY would you want to buy one with a "TRICKLE" CHARGE??? The charger that DOES NOT HAVE TRICKLE CHARGE will work just fine. BOTH ARE "COMPATIBLE" but BOTH are NOT equal in the "QUALITY" of the charge to the batteries.


HOWEVER (again) the situation is because of the "NATURE" (I'm not as educated as many of you) of BATTERIES! EVEN the lithium batteries!


ANY battery charged with the PROPER (COMPATIBLE) rating for the battery will charge the battery. BUT a TRICKLE charge will be BETTER FOR THE BATTERY BEING CHARGED!


Thus the battery of the iPhone will suffer (to some extent) when charged with the FASTER charge. AND the iPad battery MAY actually last LONGER when charged with the iPhone charger (TRICKLE CHARGE). Yes, it will take longer to charge the iPad battery, but without the extra heat and thus damage (maybe not really damage), and NOT with the shortened life span that charging the iPhone with the iPad charger will cause. So, you may want to call it damage anyway, depending on how each individual wants to express the shortened life.


You can call my comment uneducated (unlettered and ordinary), but I'm 60 years old and have worked around batteries for about 50 of those years. I may not know WHY batteries respond this way to a charge, but having worked in the HVAC industry for about the last 18 years, I KNOW THAT "HEAT" is the "WORST" enemy of any electrical component (short of water damage), whether it be a relay, a wire, a coil, or a BATTERY (ANY KIND). IF heat is removed from any OPERATING component, IT WILL LAST LONGER. If I'm not mistaken, that's why they RATE electrical components to ONLY be allowed to operate at a LIMITED amount of current (amp rating). If you exceed this, it will shorten the life of (or FRY) the component, whatever it is. If someone could build a "REGULATOR" that would absolutely not allow more than a certain amount of current to pass through it (such as the pin 5 of the regulating system of the chargers in discussion), regardless of what excess current is applied to it (WITHOUT HEAT AND DAMAGE), they would be a millionaire. Then you could apply 600 volts, and whatever current it might carry, to charge ANY small device (iPad, iPod, iPhone)!


Again, please don't take my capitalization as SCREAMING. It is ONLY for emphasis on certain points, just like you would place emphasis on certain parts of a sentence in normal conversation (not screaming to do it).


With all due respect,

ronhenderson2

Sep 10, 2013 12:46 PM in response to deggie

I'm sorry. Correction: I should have said 5 volts with 600 amps. Sarcastically: The 600 amps won't hurt the iPad or iPhone, because it is REGULATED inside the device to ONLY the current it needs. Would the 600 amps not heat up (FRY) the regulator in the device?


Most meters aren't scaled finely enough to show the DIFFERENCE in the charge rate between the 2 devices being discussed. But the mathematical equation I read somewhere in this thread DOES. The meters I use in the HVAC industry have to measure extremely low readings, because the way a furnace tells whether the burner is lit or not is by the current flowing through the ions in the flame of the burner. I'm only using that as an example. The comments I've seen in this thread so far only take the charge down to one tenth of an amp. When you're dealing with voltages this low, you need a meter that will actually read into the hundredths or thousandths of an amp.


Just because a person is an engineer doesn't mean they are exactly accurate in every judgement they make, even by observation. That's the reason I highlighted the difference between a trickle charger and one without trickle charge, the trickle charger having the regulator built in to REGULATE the current to a trickle. When you get into such smaller voltages, THE SAME PRINCIPLES APPLY (laws of physics). You just may not be measuring them. Charging different devices with different chargers, unless in a scientific laboratory setting designed to measure such differences (making sure both devices are absolutely pulling the same amount of current, measured down into the hundredths or thousandths), you WILL NOT AND CANNOT make an accurate statement about this discussion. BUT if you DEMONSTRATE it with a MATHEMATICAL equation, then you can make accurate statements. I'm so curious WHY an engineer would ARGUE with a mathematical equation. It may not be as EXACT in the field as the equation, BUT the equation is EXACT in principle (laws of physics). Regardless of the hundreds of comments in this thread by both highly educated and less educated individuals that ignore the EXACT laws of physics.


Some people will argue their suppositions without going into the detail needed to accurately comment on the discussion. And of course, some people just like to argue, or make inaccurate or unsubstantiated comments.


But the truth is the truth, regardless of how minute (mīnūt) it might be. And I may be stretching a little, but as far as minute goes, I really doubt that the batteries in discussion have been manufactured to such minute details. Obviously, they are not created with the same detailed accuracy that NASA would need (snicker).

Sep 10, 2013 1:54 PM in response to ronhenderson2

ronhenderson2 wrote:


Just because a person is an engineer doesn't mean they are exactly accurate in every judgement they make, even by observation. That's the reason I highlighted the difference between a trickle charger and one without trickle charge, the trickle charger having the regulator built in to REGULATE the current to a trickle. When you get into such smaller voltages, THE SAME PRINCIPLES APPLY (laws of physics). You just may not be measuring them. Charging different devices with different chargers, unless in a scientific laboratory setting designed to measure such differences (making sure both devices are absolutely pulling the same amount of current, measured down into the hundredths or thousandths), you WILL NOT AND CANNOT make an accurate statement about this discussion. BUT if you DEMONSTRATE it with a MATHEMATICAL equation, then you can make accurate statements. I'm so curious WHY an engineer would ARGUE with a mathematical equation. It may not be as EXACT in the field as the equation, BUT the equation is EXACT in principle (laws of physics). Regardless of the hundreds of comments in this thread by both highly educated and less educated individuals that ignore the EXACT laws of physics.

Ron, with all due respect, measuring below 100 mA is irrelevant to this entire discussion. You can go on about the sensitivity of equipment, testing methods, etc., but answering the simple, stupid question of how an iPhone charges from a 10 watt vs. a 5 watt USB charging adapter doesn't need such sensitive measurements.

All you need is a very, very rough measurement. Plugging the AC adapter into a "Watt's Up" and looking at the wattage would probably be good enough, and it can only measure in increments of full watts. I have no idea what you're referring to when it comes to mathematical calculations, and I don't really care.


It's plain and simple. An iPhone will pull ABOUT 1 amp from a 10 watt charger and 1 amp from a 5 watt charger. No more. The iPhone treats both the same within a margin of error that's not significant. We don't care about <2% performance, which you'll get just from the length of cable you use anyway.


So please, stop distracting the discussion with technicalities that have no real relevance except to call into question everyone's qualifications to make rudimentary tests that require little expertise.


And trickle charging vs. rapid charging is not what this conversation is about. Yes, slower charging is always more gentle, but no one is disputing that.

Sep 10, 2013 1:54 PM in response to ronhenderson2

You are blowing smoke. Suppose your house has 150 amp service. By your reasoning, the instant you turn on a 60 watt lightbulb (~0.5 amps), it should explode when hit by that 150 amps.
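To put rough numbers on that (the 120 V mains figure is an assumption; it isn't stated above):

```python
# A load draws only the current it needs; the supply's rating is just a ceiling.
# Assumes 120 V household mains (typical US value, not stated in the post).
bulb_watts = 60.0
mains_volts = 120.0
bulb_amps = bulb_watts / mains_volts   # I = P / V
print(bulb_amps)                       # 0.5 A, regardless of the 150 A service capacity
```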


You are absolutely clueless about electricity, and the difference between voltage, current supply capacity, and current supply demand. I'd never let you near the HVAC in my house.


The Apple "charger" is not a charger. It is a 5 volt power source, capable of supplying 1.79 amps (for the iPhone power source), or 2.1 amps (for the iPad power source). The charger is in the phone. The MOST the iPhone ever draws from these power sources is 1.1 amps, and most of the time it is 1 amp. AS MEASURED, not as speculated.


The charger in the phone, BTW, IS A TRICKLE CHARGER, as explained in my previous posts. The initial current that it draws is 1 amp, and it starts to decrease at 60% charge, dropping linearly to ZERO at full charge. As measured, not as speculated. As posted.
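A minimal sketch of that profile: the 1 amp starting current and the 60% breakpoint are from this post; treating the taper as exactly linear is an assumption for illustration.

```python
# Minimal model of the reported charging profile: constant ~1 A up to 60% state
# of charge, then tapering linearly to 0 A at 100%. Breakpoints per the post above.

def charge_current(state_of_charge: float, max_amps: float = 1.0) -> float:
    """Approximate charge current (A) for a given state of charge (0.0-1.0)."""
    if state_of_charge < 0.6:
        return max_amps
    if state_of_charge >= 1.0:
        return 0.0
    return max_amps * (1.0 - state_of_charge) / 0.4   # linear taper from 60% to 100%

for soc in (0.2, 0.6, 0.8, 0.95, 1.0):
    print(f"{int(soc * 100)}% -> {charge_current(soc):.2f} A")
```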


Oh, the meter I am using measures down to 0.01 amp. I don't know where you got 0.1 amp from.


The reason to argue with a mathematical equation, BTW, is because that equation is the WRONG EQUATION for this model.


If you doubt the reports from the 4 people in this thread who have actually gone and measured to reach their conclusions, why don't you do the same?


While you are thinking about it, please watch this video:

http://www.youtube.com/watch?v=DC4gPxc89Wg


Then explain how, with 5 watts going into the iPad power source when connected to the iPhone, you can get more than 5 watts out of it?
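Put as simple arithmetic (ideal, lossless case; real conversion losses only lower the ceiling):

```python
# Conservation of energy at the adapter: output power cannot exceed input power.
# With roughly 5 W measured going in (per the video above) and a 5 V output,
# the ceiling on output current is:
input_watts = 5.0
output_volts = 5.0
max_output_amps = input_watts / output_volts   # ideal, lossless case
print(max_output_amps)                         # 1.0 A at most; in practice a bit less
```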
