342 Replies · Latest reply: Sep 10, 2013 3:00 PM by ronhenderson2
• Level 8
Mac OS X

You are blowing smoke. Suppose your house has 150 amp service. By your reasoning the instant that you turn on a 60 watt lightbulb (~0.5 amps) it should explode when hit by that 150 amps.
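To make the point concrete: the bulb's power rating and the mains voltage determine how much current it draws; the service panel's 150 amp rating is only a ceiling. A minimal sketch (values are illustrative: 120 V US mains, a 60 W bulb; the function name is mine, not from any post in this thread):

```python
MAINS_VOLTAGE = 120.0      # volts (US household circuit)
SERVICE_CAPACITY = 150.0   # amps the panel can supply, at most

def current_drawn(load_watts, volts=MAINS_VOLTAGE):
    """Current a resistive load actually draws: I = P / V."""
    return load_watts / volts

bulb_amps = current_drawn(60.0)
print(f"A 60 W bulb draws {bulb_amps:.2f} A of the {SERVICE_CAPACITY:.0f} A available")
# The bulb draws ~0.5 A; the remaining ~149.5 A of capacity simply goes unused.
```

The supply never "pushes" its full rated current into the load; the load's resistance sets the draw.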

You are absolutely clueless about electricity, and the difference between voltage, current supply capacity, and current supply demand. I'd never let you near the HVAC in my house.

The Apple "charger" is not a charger. It is a 5 volt power source, capable of supplying 1.79 amps (for the iPhone power source), or 2.1 amps (for the iPad power source). The charger is in the phone. The MOST the iPhone ever draws from these power sources is 1.1 amps, and most of the time it is 1 amp. AS MEASURED, not as speculated.

The charger in the phone, BTW, IS A TRICKLE CHARGER, as explained in my previous posts. The initial current that it draws is 1 amp, and it starts to decrease at 60% charge, dropping linearly to ZERO at full charge. As measured, not as speculated. As posted.
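The taper described above (a constant ~1 A plateau up to 60% charge, then a linear decrease to zero at 100%) can be sketched as a simple piecewise function. Note the 1 A plateau and 60% knee are the measurements reported in this thread, not an Apple specification, and the function name is mine:

```python
def charge_current(soc, plateau_amps=1.0, knee=0.60):
    """Charge current as a function of state of charge (0.0 to 1.0).

    Piecewise model of the taper reported in this thread: constant
    current up to the knee, then a linear taper to zero at 100%.
    """
    if not 0.0 <= soc <= 1.0:
        raise ValueError("state of charge must be between 0 and 1")
    if soc <= knee:
        return plateau_amps
    return plateau_amps * (1.0 - soc) / (1.0 - knee)

for soc in (0.30, 0.60, 0.80, 1.00):
    print(f"{soc:.0%} charged -> {charge_current(soc):.2f} A")
```

A real charge controller also responds to temperature and cell voltage, so this is only the shape of the curve, not a full model.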

Oh, the meter I am using measures down to 0.01 amp. I don't know where you got 0.1 amp from.

The reason to argue with a mathematical equation, BTW, is because that equation is the WRONG EQUATION for this model.

If you doubt the reports from the 4 people in this thread who have actually gone and measured to reach their conclusions why don't you do the same?

Then explain how, with 5 watts going into the iPad power source when connected to the iPhone, you can get more than 5 watts out of it?

• Level 8
Mac OS X

RF9 wrote:

And trickle charging vs. rapid charging is not what this conversation is about. Yes, slower charging is always more gentle, but no one is disputing that.

Which is why the iPhone charger IN THE PHONE taper charges after 60% of capacity is reached, and shuts off completely (it doesn't even trickle charge) when the battery reaches 100%.

• Level 1

Lawrence Finch wrote:

RF9 wrote:

And trickle charging vs. rapid charging is not what this conversation is about. Yes, slower charging is always more gentle, but no one is disputing that.

Which is why the iPhone charger IN THE PHONE taper charges after 60% of capacity is reached, and shuts off completely (it doesn't even trickle charge) when the battery reaches 100%.

Agreed.

My experience is that rapid charging tapers after 80%, not 60%, but I never ran a test from (close to) 0%. Mine always started at around 30%, and I consistently saw full current until about 80%. I imagine things like increased battery temperature and length of time charging could start the charge rate reduction earlier.

Apple also says 80%:

http://www.apple.com/batteries/

Most lithium-ion polymer batteries use a fast charge to charge your device to 80% battery capacity, then switch to trickle charging. That’s about two hours of charge time to power an iPod to 80% capacity, then another two hours to fully charge it, if you are not using the iPod while charging. You can charge all lithium-ion batteries a large but finite number of times, as defined by charge cycle.
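Apple's "about two hours to 80%, then another two hours to full" description quoted above can be sketched as a rough two-phase timing model. This assumes a uniform rate within each phase, which is a simplification of the real taper curve, and the function name is mine:

```python
def hours_to_charge(target_soc, fast_hours=2.0, trickle_hours=2.0):
    """Approximate charge time under Apple's two-phase description:
    fast charge covers 0-80% in `fast_hours`, trickle charge covers
    80-100% in `trickle_hours`. Rates within each phase are assumed
    uniform, which glosses over the gradual taper.
    """
    if not 0.0 <= target_soc <= 1.0:
        raise ValueError("target state of charge must be between 0 and 1")
    fast_part = min(target_soc, 0.80) / 0.80 * fast_hours
    trickle_part = max(target_soc - 0.80, 0.0) / 0.20 * trickle_hours
    return fast_part + trickle_part

print(f"to 80%:  {hours_to_charge(0.80):.1f} h")   # 2.0 h
print(f"to 100%: {hours_to_charge(1.00):.1f} h")   # 4.0 h
```

The takeaway is the same as Apple's text: the last 20% takes as long as the first 80%.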

I'm not saying it can't happen at 60%.  It's very likely that there are cases where charging slows earlier.

The point of the matter is exactly what you said, that the charging slows and eventually stops.

I think the only question is whether it could be beneficial to the battery to never allow full rapid charging at all, say by limiting the phone to 500 mA so the charge rate is slower across the board. Perhaps, but my educated assumption is that it would not help in a significant way. Apple's engineers designed it this way knowing what they were doing.

Here are my test results graphed.

• Level 1

What this conversation has diverted to is highly educated people using their qualifications to weigh in on a fairly insignificant subject.

I'm sorry; I wasn't commenting on significance, which has already been compared quite sufficiently. I was making a statement that the engineers obviously didn't want to accept: the mathematical equation they tried to shoot down, and the reasons they might do so. Engineering is all about mathematics (the laws of physics), especially since they were commenting on it themselves. The question is, how much shorter a life does the battery have? Is Merlin 1128 satisfied that the difference in battery life is "SIGNIFICANT" to HIM or not? That's for him to answer. But when engineers comment on the specifics while ignoring the mathematical equations their education is based on, I feel it necessary to point out the flaw. Why would they ignore their own education?

By the way, the mathematical equation you don't care about was demonstrated in kelvinnguyen's comment back on April 11th.

Re: Can use ipad charger to charge iphone?

Apr 11, 2013 2:11 AM (in response to emfung)

Very simple.

1Ah = 3600 C (coulomb)

1 mAh = 3.6 C

Your iPhone needs total charge Q = 1440 mAh = 5184 coulombs

Your iPhone has a fixed battery inside. So capacitance is constant. C is constant.

C = charge/voltage = Q / V

V is constant (iPhone/iPad charger supplies 5V)

Q = charge_current x charge_time = I x t

So:

C = Q / V = (I x t) / V

=> charge_time  t = ( C x V ) / I

in this equation we have C constant, V constant (5V).

Therefore, higher charge_current will reduce charge_time.

There is no other side effect from this equation.
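As arithmetic only, the quoted derivation reduces to t = Q / I for a constant charge current. A quick sketch of that arithmetic, using the 1440 mAh capacity quoted above (the function name is mine; whether the current actually stays constant is exactly what the rest of this thread disputes, since the phone's internal charger caps and tapers the current):

```python
def constant_current_hours(capacity_mah, current_ma):
    """t = Q / I for an idealized constant charge current.
    Ignores taper, charge efficiency, and the phone's own current limit."""
    return capacity_mah / current_ma

print(constant_current_hours(1440, 1000))  # 1.44 hours at 1 A
print(constant_current_hours(1440, 500))   # 2.88 hours at 0.5 A
```

Under this idealization a higher current shortens the charge time, which is the quoted comment's point; the counterargument elsewhere in the thread is that the phone never draws more than about 1 A regardless of the adapter's rating.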

You can go back to the original post and try to make the foregoing comments seem insignificant to it, but that doesn't make the statements that came from their authors any more accurate. And I don't think you know how much respect I have for such people, even Lawrence Finch and FMaxwell. I'm sure I could learn a lot from them. I don't know how much I would learn from someone who just wants to trivialize their comments by saying the post was insignificant to scientific accuracy. Oh well, I'm sure you will do well without any further comments from me. I don't have time. BTW, I didn't know we were on The Last Word with Lawrence O'Donnell show.

• Level 8
Mac OS X

ronhenderson2 wrote:

There is no other side effect from this equation.

First, power sources don't GIVE amps. They MAKE AVAILABLE amps. The load determines how many amps are actually used. The ampere rating of any power source is the MAXIMUM that it can supply, not the amount that it forces into the load.

You really don't get it, or you are being deliberately obtuse. IT ISN'T A CHARGER!!!!! IT IS A POWER SOURCE. THE CHARGER IS IN THE PHONE. The MEASURED output of the iPad power source when charging an iPhone is 1 amp, NOT 2.5 amps. When charging an iPad the MEASURED output is 2.1 amps.

From the video I linked to, the MEASURED INPUT to the iPad charger when charging an iPhone is 5 watts. Now let's use an equation that IS relevant, since you are so into equations. P = I * E. Solving for I, we get I = P/E. P = 5 watts, E = 5 volts, so I = 5/5 or 1 amp. So the iPad charger is NOT pumping 2.5 amps into the iPhone; it is supplying 1 amp. This, of course, assumes that conservation of energy is still valid.
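The P = I * E arithmetic above is trivially checked; a one-liner sketch (the function name is mine, the 5 W / 5 V figures are the measured values quoted in the post):

```python
def amps_from_watts(power_watts, volts):
    """I = P / E: the current implied by a measured power at a known voltage."""
    return power_watts / volts

# Measured 5 W going into the iPad power source while it charges an iPhone,
# delivered at the nominal 5 V of USB:
print(amps_from_watts(5.0, 5.0))  # 1.0 A, not the adapter's 2.1 A maximum rating
```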

The equations that you so carefully copied are true, but have nothing to do with what we are discussing.

• Level 1

Oh, we are talking about batteries, which are DC current and not AC. They must be compared to how DC regulators and other components react when subjected to much higher currents than they are rated for. The iPad and iPod chargers convert to DC.

• Level 1

If you read my OTHER POST thoroughly, you would see that I acknowledged that the regulator is INSIDE the device.

• Level 1

Is it the consensus that a 20 amp power source would not fry the regulator in the devices (since this post has turned into a technical discussion)?

• Level 6
iPhone

It's also the consensus that you remain grotesquely clueless about how the iPhone internal regulated charger works. But don't let that stop you from continuing the trolling. You are getting the attention you seek.

• Level 1

Sorry, I didn't mean to offend. I didn't know you were replying specifically to certain posts, because you said you hadn't read the thread. That's OK, I'm tired of going around and around with people on here.

It's a simple, simple, simple, simple concept. The iPhone will use 5 watts from a 5 watt adapter and 5 watts from a 10 watt adapter. The iPhone will not allow itself to charge faster with either, because of its internal charging circuitry.

The key thing here is that these are power adapters, not chargers, which is what seems to confuse people. The charger is in the phone; the USB adapters are just transformers and AC-to-DC converters (or "power supplies").

It doesn't take calculations, sensitive equipment, or even science to figure that out. Just a simple ammeter. That's all I was trying to say, not to educate you or belittle your intelligent posts.

I'm tired of going around and around on here too, so I'm not going to post on this thread any more. I've said far too much.

• Level 8
Mac OS X

ronhenderson2 wrote:

Is it the consensus that a 20 amp power source would not fry the regulator in the devices (since this post has turned into a technical discussion)?

Absolutely. In fact, while I have never used a 20 amp power source, I have used a 5 amp power source. Works just fine, and takes the same time to charge as the 1 amp. And the reason is that the LOAD determines the current drawn from the power source. The power source does not push current; it supplies voltage at a specified maximum available current.

• Level 1

The temptations are so overwhelming. OK, Lawrence, educate me. Does that mean that the amp rating on the regulators used in these devices is irrelevant? I'm really confused. Not to show any disrespect.

Ron

• Level 1

I didn't make the amp rating question clear.  Does the amp rating only apply to output current?

I want to thank you for clearing this up for me if I'm wrong.  Like I said, I have respect for the knowledge of individuals in this field.

Ron
