3kW charger

It's better to cut off the charge source's input if it doesn't have a remote switch.

Depending on what else is on the line, relatively violent load dumps at high current rates can impose harmful stresses.

A standalone HVC device could do this, leaving the other, built-in one set at a slightly higher voltage as a backup for when the primary fails.

A BMS controlling the contactor could do either, too.
 
Or just buy a proper CC/CV Mean Well from the ELC or HLC line, or better yet the HRP/HRPG line, which are medical grade.
 
Are you saying these are proper chargers rather than plain PSUs?

Terminate charge automatically when the battery is Full?
 
john61ct said:
Are you saying these are proper chargers rather than plain PSUs?
They are cc/cv power supplies that are commonly used as lithium battery chargers. No cut-off at full charge though, so I would classify them as "dumb" chargers.
 
john61ct said:
Are you saying these are proper chargers rather than plain PSUs?

Terminate charge automatically when the battery is Full?

A (lithium) battery is "full" when you reach the set voltage and the battery does not take any more current. It is -that- simple.

That some chargers have a few extra components that switch a red LED to green when the current drops below a certain value does not change that.

Terminating a charge is not a thing; at some point the cell simply will not take any more current and is saturated at the set voltage. What that voltage is depends on what you have set.

People think that you keep pushing power into a cell if you don't terminate the charge. That is not a thing. The battery simply won't take any more charge; it's literally like filling an enclosed tank of water: at some point it's full. It will not overflow and break.

A battery will control the charge itself once you get out of the CC stage and into the CV portion of the charge cycle.
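A rough way to picture that self-limiting behaviour is a simple series-resistance model; the function and numbers below are only an illustration, not anyone's actual charger logic:

```python
# Crude first-order model: during CV the charger clamps its output at v_set,
# so current is limited by the gap between v_set and the cell's open-circuit
# voltage, divided by the pack's internal resistance. As the cell fills and
# its OCV creeps up toward v_set, that gap (and hence the current) shrinks.
def cv_current(v_set, v_ocv, r_internal_ohm):
    return max(0.0, (v_set - v_ocv) / r_internal_ohm)

print(cv_current(4.20, 4.10, 0.05))   # early in CV: ~2.0 A
print(cv_current(4.20, 4.19, 0.05))   # nearly saturated: ~0.2 A
```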

"smart" chargers are not really a thing unless you dive into the canbus enabled chargers and those you need to keep far away from as they serve no purpose outside commercial/industrial setups.

serious_sam said:
They are cc/cv power supplies that are commonly used as lithium battery chargers. No cut-off at full charge though, so I would classify them as "dumb" chargers.

They are exactly the same thing as a "normal" LiPo charger, except it's not written on the box.

OSHA calls it a "foam protection device" on a metal beam and it costs 300 bucks at the health and safety shop; I call it a pool noodle and it costs 3 bucks at Walmart.
 
flippy said:
they are exactly the same thing as a "normal" lipo charger except its not written on the box.
Only up to the point of full charge, yes. But a lot of chargers (for example, hobby balance chargers, et al) stop the charge action at a programmed point. They don't continue to apply voltage to the pack when the charge is completed. The pack is effectively isolated by transistors at that point.
flippy said:
some chargers have a little extra components that switch a red led to green when the current drops below a certain value does not change that.
The charger I use operates like you said: when the current drops below a set value, the red light goes green. It doesn't terminate the charge, so I consider it to be a dumb charger. But I don't have a problem with it. It does the job, and I manually disconnect it, usually within an hour after full charge is reached.

I will be modifying it in the near future to be able to terminate charge early, but that's just because I only want to charge my cells to 4.1V, and the charger isn't internally adjustable down to that voltage.

I haven't seen any data one way or the other to show if continuing to apply voltage to the pack at full charge for a short duration (say, an hour) has any significant effect.
 
I should add that at relatively low charge currents, charging past the CC stage and into the CV stage is almost pointless anyway.

For example, my 20s pack has an IR of approximately 0.08 ohms. My charger charges at 3A. So at the point where the charge cycle reaches CV (and the current starts to drop below 3A), there is only about another 0.24V to push into the pack.

i.e. if I disconnect at that point, my 84.0V pack will be charged to 83.76V. Pointless to leave it connected past that point, IMO.
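For anyone who wants to check the arithmetic, it is just the IR drop at the CC current (figures as above):

```python
# Voltage "headroom" left when the charger first hits CV, estimated from the
# pack's internal resistance and the CC charge current (figures from the post).
charge_current_a = 3.0     # charger's CC current
pack_ir_ohm = 0.08         # approx. internal resistance of the 20s pack
cv_setpoint_v = 84.0       # 20s x 4.2 V

ir_drop_v = charge_current_a * pack_ir_ohm              # 0.24 V
resting_v_if_disconnected = cv_setpoint_v - ir_drop_v   # ~83.76 V

print(f"IR drop at CV onset: {ir_drop_v:.2f} V")
print(f"Approx. pack voltage if disconnected there: {resting_v_if_disconnected:.2f} V")
```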
 
serious_sam said:
flippy said:
they are exactly the same thing as a "normal" lipo charger except its not written on the box.
Only up to the point of full charge, yes. But a lot of chargers (for example, hobby balance chargers, et al) stop the charge action at a programmed point. They don't continue to apply voltage to the pack when the charge is completed. The pack is effectively isolated by transistors at that point.
flippy said:
some chargers have a little extra components that switch a red led to green when the current drops below a certain value does not change that.
The charger I use operates like you said, when the current drops below a set value, the red light goes green. It doesn't terminate the charge, so I consider it to be a dumb charger. But I don't have a problem with it. It does the job, and I manually disconnect it usually within an hour after full charge is reached.

I will be modifying it in the near future to be able to terminate charge early, but that's just because I only want to charge my cells to 4.1V, and the charger isn't internally adjustable down to that voltage.

I haven't seen any data one way or the other to show if continuing to apply voltage to the pack at full charge for a short duration (say, an hour) has any significant effect.

What do you think "charging" means?

That a charger is connected to a battery does not mean the battery is actually being charged. Once the voltage between the battery and charger are the same and the cells are saturated, the current simply stops flowing. This is a "natural" process that happens inside the cell; no smart electronics needed. So "terminating" a charge by disconnecting the output of the charger does exactly nothing if the battery is fully saturated at the set voltage. This is why you don't see it on most chargers, except some overpriced garbage that just needs to add on "features" to justify the price tag.

Keeping a battery floating at the set voltage does not damage it in any way.

In a LOT of cases with a cheap BMS you even NEED the charger to keep outputting voltage in order for the BMS to balance the pack, which usually happens at <100mA, which is under the trigger point for most "smart" chargers. So your cheap BMS will never be able to properly balance the pack.

And yes, advanced RC-based chargers need to cut power and all that good stuff, because they can do active balancing and all other kinds of smart stuff, because all those batteries don't have any BMS systems. But that is not really useful on a bike or scooter with a 2+kWh battery pack and a decent BMS. You ain't charging a 20 pound battery pack with an IMAX B6.

Right tool for the job and all that.
 
serious_sam said:
john61ct said:
Are you saying these are proper chargers rather than plain PSUs?
They are cc/cv power supplies that are commonly used as lithium battery chargers. No cut-off at full charge though, so I would classify them as "dumb" chargers.
CC/CV is for me a given, and of course true current limiting is also required in order to use one for charging Li.

PSUs and DC-DC converters are categories mutually exclusive from true "chargers", the latter requiring a charge termination algorithm, sine qua non to deserve the label.

Smart vs dumb, to me, is the ability to dynamically alter voltage during a cycle's stages (e.g. Bulk or Float as opposed to a single Absorb / CV setpoint),

and also the quality of that stop-charge algorithm.

Some chargers are "too smart".

And the ability for the user to set up custom profiles rather than just selecting from canned choices is a critical feature, but orthogonal to the above classifications.
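To make that multi-stage idea concrete, here is a minimal sketch of a Bulk / Absorb / Float profile; the voltages and current thresholds are placeholders I made up for illustration, not recommendations:

```python
# Minimal sketch of the multi-stage (Bulk -> Absorb -> Float) idea.
# All voltages and thresholds below are illustrative placeholders.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Stage:
    name: str
    volts_per_cell: float            # target voltage for this stage
    exit_current_c: Optional[float]  # move on once current tapers below this C-rate (None = hold)

profile = [
    Stage("bulk/absorb", 4.10, 0.05),  # CC up to 4.10 Vpc, then CV until I < 0.05C
    Stage("float",       3.95, None),  # drop to a lower hold voltage (if floating at all)
]

for stage in profile:
    print(stage)
```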
 
flippy said:
a (lithium) battery is " full" when you reach the set voltage and the battery does not take anymore current. it is -that- simple.
No.

Not a single vendor specs that last bit, and in fact doing so is harmful to cell longevity.

If an Absorb / CV stage is desired, an endAmps spec of say, 0.05C trailing current is useful for testing, maintenance protocols or standardized benchmarking.

But since precision is not at all required in normal cycling, CC-only "charge to x.yy V and stop" is perfectly fine for that context.
 
As far as I'm aware, the only "harm" to a lithium-ion battery being left on the charger is the minute amount of cycling happening due to self-discharge, besides the capacity loss that happens when cells are kept at high SOC.

I can't find it now, but I had read a paper before about measuring the self-discharge of li-ion cells. One method was to apply a constant voltage to the cell until the charge current had completely tapered off, such that the remaining current draw was equal to the self-discharge current within the cell.

In the paper, these cells were being left for many days with CV applied, to make the self-discharge measurement more accurate. They didn't seem to be concerned about this damaging the cells. The cells are going to be self-discharging even when they're not connected to anything.
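If it helps, here is a sketch of how I understood the measurement; the function name and thresholds are my own illustration, not taken from the paper:

```python
# Hold the cell at a fixed voltage for days; once the CV charge current has
# fully stopped tapering, the small residual current is just replacing
# self-discharge, so it can be read off directly.
def residual_cv_current_ma(hourly_current_ma, window=24, flat_tol_ma=0.01):
    """Return the mean of the last `window` hourly samples once the taper has
    flattened out, or None if the current is still visibly dropping."""
    tail = hourly_current_ma[-window:]
    if len(tail) < window or max(tail) - min(tail) > flat_tol_ma:
        return None   # still tapering (or not enough data) - keep waiting
    return sum(tail) / len(tail)
```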
 
serious_sam said:
flippy said:
keeping a battery floating at the set voltage does not damage it in any way.
Says you. Data or it didn't happen.
Actually I know for sure that keeping LI sitting at 100% SoC - or anywhere near there - while the pack is not needed is harmful to longevity.

If loads need to be fed, say in an off-grid House bank scenario, it should be kept in storage mode as long as shore power is available, or held once it reaches the owner's defined "working Full" in a solar context.
 
flippy said:
once the voltage between the battery and charger are the same and the cells are saturated the current simply stops flowing
Getting anywhere near that point is overcharging afaic.

> keeping a battery floating at the set voltage does not damage it in any way

Again, absolutely false, if you count cycles lost off the back end. Unless the more immediate damage inflicted by a high C-rate use case is causing the cell to only last a few hundred cycles anyway, in which case I agree the difference will be insignificant.

> in a LOT of cases with cheap bms you even NEED to keep the charger to keep outputting voltage in order for the bms to balance the pack.

Which by definition IMO is a stupid design, to be avoided.

> you aint charging a 20 pound battery pack with a IMAX B6.

Actually, getting the pack to say 98% SoC, and then using a balance charger afterwards is a better balancing solution than using the above kind of BMS.

There are also "balancing only" BMSs, as well as protective ones without balancing.

 
Addy said:
measuring the self-discharge of li-ion cells.
..
They didn't seem to be concerned about this damaging the cells. The cells are going to be self-discharging even when they're not connected to anything..
Plenty of test protocols are used that should be avoided in normal use; even "testing to destruction" is pretty common.

Some chemistries have in effect zero self-discharge even over a decade, as long as they are properly stored.

 
serious_sam said:
I should add, that at relatively low charge currents, charging past the CC stage and into the CV stage is almost pointless anyway.
Exactly.

In a solar context, where charging current may go below the endAmps spec for precise charge termination, it is easily possible to harmfully overcharge even at a voltage setpoint far below that spec'd by the maker.

The specifications as laid out by the cell manufacturer in the cells' data sheets give absolute maximum / minimum limits of acceptable ranges only, not recommendations intended for normal operating conditions in day-to-day cycling.

These are "stress ratings", to which the cells can only be subjected for short times without causing irreparable damage.

Frequent exposure to such stressful conditions, or for extended periods, can adversely affect cell reliability and will greatly reduce lifespan.


 
flippy said:
a (lithium) battery is " full" when you reach the set voltage and the battery does not take anymore current. it is -that- simple.
Nope. Read the battery data sheet. Many of them have charge termination criteria; <0.1C or <0.01C are common charge termination values.

> that some chargers have a little extra components that switch a red led to green when the current drops below a certain value does not change that.

That most chargers have a few extra components that switch off charge current when the current drops below a certain value means that they terminate charge.

> terminating a charge is not a thing

It really is a thing that good battery chargers do. If you don't believe me, hook up a scope to one.

> people think that you keep on pushing in power into a cell if you dont terminate the charge. that is not a thing. the battery simply wont take anymore charge, its litteral like filling a enclosed tank of water, at some point its full. it will not overflow and break.

Cells self-discharge. So it's like trying to push more water into a leaky tank. Will it keep taking it? Yes.

> "smart" chargers are not really a thing

You may have never seen one, but they are indeed a thing - and most good battery chargers are indeed "smart" (i.e. they terminate charge).
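The logic itself is dead simple; a rough sketch of the idea (the function name and 0.05C default are mine, actual data-sheet cutoffs vary):

```python
# During the CV phase, a "smart" charger watches the tapering current and
# shuts off its output once it falls below the termination threshold.
def should_terminate(cv_current_a, capacity_ah, cutoff_c=0.05):
    """True once the CV current has tapered below cutoff_c * C."""
    return cv_current_a < cutoff_c * capacity_ah

# e.g. a 10 Ah pack with a 0.05C cutoff terminates once current drops below 0.5 A
print(should_terminate(0.4, 10.0))   # True
print(should_terminate(1.2, 10.0))   # False
```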
 
Just a nitpick, but plenty of charge sources are not chargers, e.g. rectifier/PSUs, DC-DC converters, old-school RV "converters", alternators, etc.

If it does not terminate, it is not a charger; even the dumbest one includes that functionality.
john61ct said:
Smart vs dumb, to me is the ability to dynamically alter voltage during a cycle' s stages (e.g. Bulk or Float as opposed to a single Absorb / CV setpoint)

and also the quality of that stop-charge algorithm.

Some chargers are "too smart".

And the ability for the user to set up custom profiles rather than just selecting from canned choices is a critical feature, but orthogonal to the above classifications.



 

Curious. If the cell manufacturer specs their cell at 4.2V as a maximum, how can it be harmfully overcharged if it never reaches 4.2? Or are you saying the solar system might not be creating enough energy to actually allow whatever is controlling cutoff to trigger said cutoff, because it doesn't even realize it's passing current, thus leading to an overcharge? Or something else entirely?
 
The fundamental issue is that this only applies if longevity is a high priority, and the use case allows for getting a high cycle-count lifespan to start with. It also presumes that maximising every scrap of mAh capacity utilisation is not important.

The key is that the **charging setpoint** is an entirely different number from the **cell voltage** after a few hours of resting isolated.

Also, when overcharging (in my longevity sense above) there will be, even after **days** of resting isolated, some degree of "surface charge" voltage over the true baseline, which does not proportionally represent any significant SoC / capacity utilisation.

Once that is dispelled by withdrawing say 0.1% of the pack's mAh capacity, call that the "100% resting Full voltage" for that chemistry / model of cell.

As a side note, IMO even a low current but constant Float charge at a voltage **below** that definition of Full is not optimal for longevity, beyond the damage inflicted by sitting at a high SoC.

OK back on topic.

Say you define your endAmps spec for "Precise 100% Full" for benchmark testing calibration etc, as
4.04V at 0.5C, held for Absorb / CV until trailing current tapers down to 0.05C then stop
Note this is a higher SoC than you would use for your definition of "daily usage cycling Full", which might as well be defined as "charge to a voltage and stop", IOW Bulk / CC stage only, no Absorb / CV stage at all; both gentler (more conducive to longevity) and very simple to implement in a failsafe manner.

Now back on topic again.

What happens if your charging current, rather than the normal rate say 0.2C or higher, is down way low, even below the endAmps spec as discussed above?

Then if you are trying to do a CV stage, you can't even tell when to stop!

That is why, when charging current is very high, it is safe to use a slightly higher V setpoint: the "after charging" sag is greater, so a given setpoint represents a lower SoC than it would at much lower current rates.

At **very** low current rates, like below 0.1C, the termination setpoint should be tweaked downward a bit to be safe, even if using a CC-only profile.

Examples from a real-life bank using LFP chemistry, in this case 160Ah Thundersky cells, where this issue is both more critical and the delta between charge setpoint and resting cell voltage is a bit higher:

Vendor spec from Mr Winston Chung Hing Ka might be
3.65Vpc, endAmps 0.01C
This is way too stressful, except when longevity is not desired but every scrap of mAh capacity is.

My own definition of "100% Full, for benchmark testing calibration etc" is
3.46Vpc at 0.4C, endAmps 0.05C, then stop

My usual "daily usage cycling Full" is
Charge to 3.52Vpc and stop
for normal current rates, but maybe
Charge to 3.35Vpc and stop
in a situation where the C-rate may fall below 0.1

Note "resting Full" cell voltage is 3.34-3.36Vpc, anything over 3.37V is just surface charge.

Usually this whole idea is only relevant in solar-only contexts, and the danger of overcharging IRL is mitigated by concurrent loads and the fact that in most places the sun does set every night. :cool:

I hope this is clear and helps explain that overly terse edge-case generalization.
 
HK12K said:
Curious. If the cell manufacturer specs their cell at 4.2v as a maximum, how can it be harmfully overcharged if it never reaches 4.2?
The other proviso is that the desire for longevity even in a low C-rate context may be carried to extremes compared to the mainstream idea of batteries as a short-lifespan disposable commodity.

Say the Thundersky spec for EoL based on 80% SoH is 3,000 cycles.

Following all my "coddling regime" care recommendations may result in going well past 10,000 cycles.

An LTO chemistry bank vendor-rated at 10,000 cycles, cared for in that way, may get passed on through a few generations of grandchildren.

IOW the bank becomes a one-off capital investment that lasts much longer than any boat or camper it might be installed in, maybe even longer than that off-grid cabin in the woods.

Most people have no such ambitions, and therefore should ignore the extreme aspects of my comments when I insert that "for longevity" qualifier.
 
Yes, much more clear and thank you for helping to further my understanding.
 
You're very welcome, happy to help.

A bit more on vendor specs being too high https://www.cruisersforum.com/forums/f166/review-feedback-lifepo4-upgrade-228829.html#post3054754

and on the BMS balancing issue flippy referenced https://endless-sphere.com/forums/viewtopic.php?p=1503339#p1503339


 
john61ct said:
serious_sam said:
flippy said:
keeping a battery floating at the set voltage does not damage it in any way.
Says you. Data or it didn't happen.
Actually I know for sure
Saying "I know" isn't data. Without some kind of reliable source, it's just your opinion. You often state your opinion as though it is fact.

But like I said, I haven't seen any reliable data one way or the other. I will err on the side of caution anyway, but I'm not going to perpetuate an opinion without some reliable data to back it up.
 
Strange how you cut out the precise statement of what it is I am saying I am sure about.

That is true about many, many things, on all sorts of topics, for which I would not bother looking for some research study as "proof".

The fact is that in this case there is absolutely nothing to be gained by **not** following that advice, and no one credible recommends doing otherwise, so it seems a bit odd to choose that bit to take a scientistic stand on.
 