Is this guy the expert he claims to be?

I skimmed through, and everything the boatmad guy was saying is correct.
Modern cells don't increase much in resistance due to age; the increase is mostly due to high-temperature exposure and charging too fast.
Also, the Leaf battery has really low resistance, so the CV phase will always be quite short. He is correct that slower charge time is really more a function of the charger/BMS commanding a limited current, not the actual cells.
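
For anyone who wants to see why a low-resistance pack barely has a CV phase, here's a rough Python sketch of a CC/CV charge using a crude linear-OCV cell model. All the numbers (capacity, currents, voltages, DCR values) are made up for illustration, not Leaf specs.

```python
# Crude CC/CV charge sketch: terminal volts modeled as OCV + I*R.
# Shows that a low-DCR cell doesn't hit the CV limit until nearly full,
# so its CV phase is short. All values are illustrative assumptions.

def charge_time_split(dcr_ohm, capacity_ah=60.0, i_cc=60.0, v_max=4.2,
                      ocv_full=4.15, ocv_empty=3.5, dt_h=0.001):
    """Return (cc_hours, cv_hours) for a crude linear-OCV cell model."""
    soc, t_cc, t_cv = 0.0, 0.0, 0.0
    while soc < 0.999:
        ocv = ocv_empty + (ocv_full - ocv_empty) * soc
        i = i_cc
        if ocv + i * dcr_ohm > v_max:           # CV phase: hold terminal volts at v_max
            i = max((v_max - ocv) / dcr_ohm, 0.0)
            t_cv += dt_h
            if i < i_cc * 0.05:                 # taper cutoff at 5% of CC current
                break
        else:                                   # CC phase: full current
            t_cc += dt_h
        soc += i * dt_h / capacity_ah
    return t_cc, t_cv

for r in (0.001, 0.005):                        # low vs. higher per-cell DCR, ohms
    cc, cv = charge_time_split(r)
    print(f"DCR = {r * 1000:.0f} mΩ: CC phase {cc:.2f} h, CV phase {cv:.2f} h")
```

With these made-up numbers the 1 mΩ cell spends only about a minute in CV, while the 5 mΩ cell spends most of an hour there.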
 
Maybe, but it's the cell chemistry and internal resistance changes that allow Nissan to charge the 30 kWh pack faster; this came from Nissan themselves. I'm not debating that the charger limits the power, I know this from charging countless LiPo cells with balance chargers. But the actual limiting factors with LiPo were the cell chemistry and internal resistance, and as the internal resistance rose due to age, so too did heat and voltage sag. When you measured the internal resistance it was always much higher than when new, and this is how RC people know their cells are dying.

The faster-charging Turnigy LiPo packs always had much lower resistance than the cheap Zippy packs, for instance, so if you bought a 30C Zippy you would actually experience a lot more voltage sag than with a Turnigy nanotech.
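
To put rough numbers on that sag difference, here's a quick sketch; the pack size, current, and per-cell DCR values are invented for illustration, not measured Turnigy/Zippy figures.

```python
# Voltage-sag sketch under load: sag at the pack terminals is roughly I * DCR,
# so a higher-resistance pack of the same C rating sags noticeably more at the
# same current. All values below are illustrative assumptions.

PACK_CELLS = 6          # 6s LiPo
CURRENT_A = 90.0        # e.g. a 30C draw on a 3 Ah pack

for name, dcr_per_cell in (("low-DCR pack", 0.003), ("high-DCR pack", 0.008)):
    sag_v = CURRENT_A * dcr_per_cell * PACK_CELLS
    print(f"{name}: sag ≈ {sag_v:.2f} V at {CURRENT_A:.0f} A "
          f"({CURRENT_A * dcr_per_cell * 1000:.0f} mV per cell)")
```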
 
Most of the basic claims being made in this thread are true. Neither thermally induced DCR change nor BMS smarts alone explains all the observations made here, and they are valid observations. One does not cancel out the other, or negate the fact that the other effect exists.

Yes, performance suffers with cold temperatures because cell DCR, internal resistance, or whatever you like to call "it", increases (worsens) as temperatures drop below room temperature. "It" is, more or less, the collective effect of electrochemical losses that we tend to model as pure resistances, because that's what they look like to us as electrically oriented thinkers. The symptoms we all see with this are sluggish performance (from the extra voltage drop imposed by the higher DCR) and reduced range (because more power is lost as heat in the cell while DCR is higher). And yes, cells will self-heat if discharge is aggressive enough to overcome the cold, which will bring DCR back down and gradually restore performance to what we're used to when it's warm outside. This is all during DISCHARGE.
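
Here's a tiny sketch that puts illustrative numbers on those two symptoms; the roughly 2.5x DCR rise from 25 °C to -10 °C is an assumption for the example, not measured data.

```python
# Discharge sketch showing the two cold-weather symptoms: extra voltage drop
# (sluggish performance) and extra heat loss (reduced range). Assumed values.

OCV = 3.7               # open-circuit voltage per cell, V
CURRENT_A = 100.0       # discharge current, A

for temp_c, dcr_ohm in ((25, 0.0010), (-10, 0.0025)):  # assumed cold DCR rise
    v_term = OCV - CURRENT_A * dcr_ohm                  # sag from the I*R drop
    heat_w = CURRENT_A ** 2 * dcr_ohm                   # I^2*R lost as heat, not motion
    print(f"{temp_c:>4} °C: terminal {v_term:.2f} V/cell, "
          f"{heat_w:.0f} W/cell wasted as heat")
```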

During CHARGE, there are multiple things to consider. If you have a stone-dumb charger that will charge full blast no matter what, it will charge a cold battery at the rated current in all but perhaps the most extreme cases. All else being equal, charging with this unit will take longer than using the same charger at room temp because 1) electrochemical heat losses across the cell (high DCR) mean less of the applied charge power is actually stored in the cell, and 2) the DCR shift slows the end portion of the charge process for the reasons stated earlier in the thread (by Luke?). However, if you were to use this dumb charger to charge a cold pack in this way, you would pay dearly in calendar life and capacity in a short period of time. This is because of loss of cyclable Li in the cell, as so eloquently explained by Luke in his first post in this thread. THAT is why a smart BMS limits current, whether it be from regen or the grid, into a battery that has cell temperatures below a certain threshold.
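
As a sketch of that last point, here's roughly what a temperature-based charge-current derate might look like in code. The thresholds and percentages are invented for illustration; the real rules are chemistry- and manufacturer-specific.

```python
# Hedged sketch of a BMS charge-current derate vs. cell temperature.
# All thresholds and limits below are illustrative assumptions.

def max_charge_current_a(cell_temp_c, rated_a=120.0):
    """Return an allowed charge current for a given cell temperature."""
    if cell_temp_c <= 0:
        return 0.0                  # no charging near freezing: Li plating risk
    if cell_temp_c < 10:
        return rated_a * 0.25       # heavy derate while cold
    if cell_temp_c < 20:
        return rated_a * 0.5        # partial derate
    return rated_a                  # full rated current when warm

for t in (-5, 5, 15, 25):
    print(f"{t:>3} °C -> {max_charge_current_a(t):.0f} A allowed")
```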

So, to recap, most performance losses in the cold during discharge are due to increased DCR in the cells imposed by the cold temperatures. Those same things WOULD limit charge times, but there is a bigger risk to long term cell health by allowing full rated charge current into a cold cell, so the BMS limits charge current to a level safe for those cells at those temperatures. This is usually going to be more noticeable than the electrochemical effects alone would have been, so you could technically say that in most systems the slower charging of a cold battery really doesn't have anything to do with elevated cell DCR... even though that elevated DCR is there and can slow charging all by itself.

Hope that clarifies.
 
I remember when playing around with LiPo that a cold pack wouldn't charge at the same C rate on the Icharger. I never used temp sensors when charging LiPo.

I'm also nearly sure I've observed the Leaf charge at a higher rate in warmer weather; let me explain.

Yesterday I charged the Leaf from almost 0%, and after a minute or two it charged at about 45 kW, then quickly ramped down around 25-30% to about 30-35 kW and stayed there until 50-55%, when I disconnected. The whole 0-55% took about 22 minutes.

I am almost positive that in warmer weather it doesn't ramp down to 30-35 kW until after 50%. Battery temp yesterday was about 17 °C and ambient about 5 °C.

So what I'm thinking is that the BMS/CHAdeMO protocol limits power based on battery and ambient temps?

Also, how does the Leaf limit power while charging as the battery ages? Does it limit based on how fast the pack warms? The battery usually warms much faster while charging and discharging as it gets older.

Charge times are observed to take a lot longer on the original Leaf as the battery ages, and that battery ages much faster than the current generation's.

Also, drivers of older Leafs have noticed greatly reduced regen with an older battery.
 
o00scorpion00o said:
I remember when playing around with LiPo that a cold pack wouldn't charge at the same C rate on the Icharger. I never used temp sensors when charging LiPo.
This would likely fall into one of those "extreme cases" I mentioned, where the DCR rise actually eats into the ability of the charger to deliver max current. This is easy to do on the bench with a little battery and a beefy charger, but much harder to do with an EV pack and any charger available for that job. Let's stick to talking about cars for now.

o00scorpion00o said:
I'm also nearly sure I've observed the Leaf charge at a higher rate in warmer weather; let me explain.

Yesterday I charged the Leaf from almost 0%, and after a minute or two it charged at about 45 kW, then quickly ramped down around 25-30% to about 30-35 kW and stayed there until 50-55%, when I disconnected. The whole 0-55% took about 22 minutes.

I am almost positive that in warmer weather it doesn't ramp down to 30-35 kW until after 50%. Battery temp yesterday was about 17 °C and ambient about 5 °C.

So what I'm thinking is that the BMS/CHAdeMO protocol limits power based on battery and ambient temps?

It's the BMS that calls the shots; the charging station only does what it's told. I gotta say, those are not terribly low temps. They don't seem low enough to necessitate such a cutback, but I'm not well-versed in the rules for those cells, so I don't know for sure.
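
As a rough sketch of that relationship, the control loop looks something like this; the function names and numbers are illustrative, not taken from the CHAdeMO spec.

```python
# Hedged sketch of CHAdeMO-style current negotiation: the vehicle BMS requests
# a current each control cycle and the station supplies the lesser of that
# request and its own capability. Names and numbers are illustrative.

def station_output_a(bms_request_a, station_max_a=125.0):
    """The station never exceeds what the vehicle asks for."""
    return min(bms_request_a, station_max_a)

# The BMS lowers its request as SOC and temperature dictate:
for soc, bms_req in ((0.10, 110.0), (0.30, 80.0), (0.55, 70.0)):
    print(f"SOC {soc:.0%}: BMS requests {bms_req:.0f} A, "
          f"station delivers {station_output_a(bms_req):.0f} A")
```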

o00scorpion00o said:
Also, how does the Leaf limit power while charging as the battery ages? Does it limit based on how fast the pack warms? The battery usually warms much faster while charging and discharging as it gets older.

Charge times are observed to take a lot longer on the original Leaf as the battery ages, and that battery ages much faster than the current generation's.

Also, drivers of older Leafs have noticed greatly reduced regen with an older battery.

While I think they should, I know of no BMS that takes aging of the cells into account as time goes on. What you are likely seeing is the effect of increased DCR over time. As DCR rises, the voltage excursion during charge and discharge also increases: more sag during discharge, and more rise during charge. This means the voltage extremes that cause the BMS to cut back charge and discharge energy will be reached sooner than when the battery was new, and losses in the cell during charge and discharge also increase. Both factors conspire to reduce battery performance as time goes on. Yes, the cell will heat faster than when new because of elevated DCR, but this does not change the fact that the cell has degraded and cannot perform as well as it did when new, no matter what.
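
Here's a minimal numeric sketch of that voltage-excursion effect; the DCR values, OCV, and current are all assumed for illustration.

```python
# Why an aged, higher-DCR cell hits the BMS voltage limit sooner during charge:
# the same current produces a bigger excursion above the open-circuit voltage.
# All values below are illustrative assumptions.

V_LIMIT = 4.20          # per-cell upper limit where the BMS cuts back, V
OCV = 4.05              # assumed open-circuit voltage at a fairly high SOC, V
CHARGE_A = 100.0        # charge current, A

for label, dcr_ohm in (("new cell", 0.0010), ("aged cell", 0.0020)):
    v_term = OCV + CHARGE_A * dcr_ohm   # terminal voltage rise from I*R
    headroom = V_LIMIT - v_term         # negative => BMS must already taper current
    print(f"{label}: terminal {v_term:.2f} V, headroom {headroom * 1000:+.0f} mV")
```

With these made-up numbers the new cell still has 50 mV of headroom at that SOC, while the aged cell is already 50 mV over the limit, so its current gets cut back earlier in the charge.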
 