Quad Cell Distributed Digital BMS design discussion

Clever it is not, just plain simple and effective.

It's dumb but robust, and those are good qualities in something that gets chucked around and exposed to shock, vibration, a high-EM environment (Peter can elucidate further on the EM issues on an electric vehicle; he had a few difficulties overcoming the noise problems on his microcontroller-based BMS) and fairly severe climatic extremes (bike parked in the sun or left outside in sub-zero temperatures).

If you want a microcontroller-based design then, as I've mentioned previously, Peter Perkins' design takes some beating. If you want low parts count and simplicity, then a simple shunt design is hard to beat. I doubt you can have both.

Jeremy
 
The above simple circuit is 11 parts (including 4 active devices) per cell plus overhead of the non per cell parts. And that does not include the LVC circuits which would add 3-6 more parts per cell.

My micro design is currently 6.25 parts per cell plus overhead, and only 2 are active devices. It is hard to get below that for the functionality required. The analog designs certainly cannot.

Parts in the overhead column are a few more (than the simple circuit above) for my micro design, but it is pretty lean. In the end the parts count is not that high for my design. And functionality is high.

Peter's single cell PIC design is about 20 parts per cell plus master overhead. That's why I went away from single cell designs; I started there too. Very high parts count. And the low cell voltage makes it difficult to regulate, filter and protect the micro and make it reliable. No surprise that was difficult. If the cell gets down to 1.5 volts, the single cell system will likely fail to work altogether.

I see there is an aversion to using micros here in BMS circuits. A great deal of effort is expended to solve the problems without using micros. This raises cost and reduces the flexibility and feature set of the result. It has resulted in years of delays fiddling with the designs and reworking PC boards to solve the problems. Micros aren't trivial to deploy, but they are not that difficult. The CellLog is an example. Software upgrades are far more convenient than PCB reworking.
 
Alan B said:
The above simple circuit is 11 parts (including 4 active devices) per cell plus overhead of the non per cell parts. And that does not include the LVC circuits which would add 3-6 more parts per cell.

My micro design is currently 6.25 parts per cell plus overhead, and only 2 are active devices. It is hard to get below that for the functionality required. The analog designs certainly cannot.

Parts in the overhead column are a few more (than the simple circuit above) for my micro design, but it is pretty lean. In the end the parts count is not that high for my design. And functionality is high.

Peter's single cell PIC design is about 20 parts per cell plus master overhead. That's why I went away from single cell designs; I started there too. Very high parts count. And the low cell voltage makes it difficult to regulate, filter and protect the micro and make it reliable. No surprise that was difficult. If the cell gets down to 1.5 volts, the single cell system will likely fail to work altogether.

I see there is an aversion to using micros here in BMS circuits. A great deal of effort is expended to solve the problems without using micros. This raises cost and reduces the flexibility and feature set of the result. It has resulted in years of delays fiddling with the designs and reworking PC boards to solve the problems. Micros aren't trivial to deploy, but they are not that difficult. The CellLog is an example. Software upgrades are far more convenient than PCB reworking.

I look forward to seeing your design in due course.

I've certainly no aversion to using microcontrollers (a quick scan on some of my contributions here will quickly prove this) but I'm not convinced that a very simple, safety critical task, like cell charge management, is necessarily best handled by one. Each to our own, I guess.
 
Gents I better chip in.

I started my project with Picaxe chips a couple of years ago now. As Jeremy mentioned, they are great to get your head round, but I/we now use plain 12F683 PICs for the slaves to reduce cost. The parts count per slave is now quite a bit less as we have refined the design. We now do a stackable multi-cell slave board, 16 or 25 cells per board, with flying lead connections, which work well and also reduce the noise issues as the slaves are remote from the cells. I'm no electronics expert and by gum we have learnt a lot on the way :shock:

The early program code allowed an analog-type mode as well and required no Master. We moved on and now have a Master with a lot of features, not all of them perfect, but it's a work in progress and we are currently discussing a V3 Master which will interface with the open source controller design. It has a more advanced video output chip, the TellyMate, and will use a 40-pin PIC with a lot of spare capacity. It will control a Meanwell charger setup using PWM control of an opto-isolator interfacing with the voltage control circuit.

The current Master (V2) (95 cell maximum) monitors, calculates and displays:

Cell voltages, high cell V, low cell V, temps using I2C sensors, RPM, speed, amps, SoC.
It can detect cell over- or under-voltage and controls balancing loads of up to 350mA.
It can detect failure of the slaves due to a slave stopping functioning (low cell V), though they work down to around 2V in practice. It also identifies the affected slave (cell) in error conditions. It can detect over-temperature conditions and whatever limits you decide to include for voltage/current etc. It can output data to a remote display using a 433MHz data link, in the house for instance. It can output cell data directly into an Excel spreadsheet using the old Picaxe USB programming cable. It has an audible alarm and a watchdog on board to reset the Master if it locks up. It can control an external charger via a 12V relay or an opto-isolated output. Ditto the controller via an opto output. It has an onboard I2C EEPROM for alarm data logging, and probably a few other things I've forgotten :wink:

The slaves operate independently but send info to the Master via a simple serial link (9600 baud) on request. They maintain basic control over the load and can cut it off if the Master tries to turn it on when V is too low, or if the Master dies. The slaves will cut the load if cell V goes too low, or turn it on if cell V goes too high. The slaves' default voltages and control parameters can be updated by the Master sending commands.
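For anyone following along, the slave behaviour described above can be sketched in Python for clarity (the real slaves are 12F683 PIC firmware; the names, thresholds and command scheme here are illustrative assumptions, not Peter's actual code):

```python
# Illustrative sketch of a per-cell slave: it keeps final authority over the
# balancing load, honouring master requests only inside safe voltage limits.
# Threshold values below are assumed defaults, not real firmware settings.

LOAD_ON_V = 3.65   # assumed HVC: turn balancing load on above this
LOAD_OFF_V = 2.50  # assumed LVC: never allow the load on below this

class Slave:
    def __init__(self):
        self.hvc = LOAD_ON_V
        self.lvc = LOAD_OFF_V
        self.load_on = False

    def tick(self, cell_v, master_wants_load=None):
        """One control cycle. Safety rules override any master request."""
        if cell_v <= self.lvc:
            self.load_on = False      # protect the cell, ignore the master
        elif cell_v >= self.hvc:
            self.load_on = True       # shed charge even if the master is dead
        elif master_wants_load is not None:
            self.load_on = master_wants_load
        return self.load_on

    def handle_command(self, cmd, value):
        """Master can update default parameters over the serial link."""
        if cmd == "set_hvc":
            self.hvc = value
        elif cmd == "set_lvc":
            self.lvc = value
```

The key design point is the precedence: the voltage limits are checked before the master's request, so a dead or misbehaving master can never force the load on at low cell voltage.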

Of course the BMS doesn't do everything and it's not perfect. I use it in my two Lithium cars and there are several others or variations of it out in the wild or under construction.

FYI the interference problem never affected the slaves; it was the comms to the Master which caused me problems, mainly due to my poor layout and wires draping everywhere!! We haven't had interference issues with the remote slaves. Once I went to twisted pair and thought about the layout, I got it under control. The 12F683 chips are amazing and seem to be unaffected by very hostile electrical/RFI environments :D

Good luck with your own design.
 
Very nice, Peter, and thanks for your comments. You have made a lot of progress on this in a couple of years.

Feel free to put links to your stuff in this thread, it helps people find things and learn what's out there.

Using a micro opens up so many interesting possibilities. I built some PIC projects years ago but more recently have come to prefer AVR chips and GCC-AVR. Lots of interesting parts out there to choose from.
 
Almost all Battery Management Systems provide charge control and low voltage cutoff. What else should they do? What would take them into the "next generation"?

It would be nice if it would tell us about a problem "before it happens". When we key on, it would be good to get some idea of the condition of the pack and especially the weakest cells. If the pack has been resting, and we know the temperature, capacity and chemistry type then the remaining capacity should be roughly predictable.

If we use RJ45 connectors, and dual optos, there is one unused parallel bus available. What would be a good use of that?

Other ideas???
 
Alan B said:
When we key on, it would be good to get some idea of the condition of the pack and especially the weakest cells. If the pack has been resting, and we know the temperature, capacity and chemistry type then the remaining capacity should be roughly predictable.

Interesting idea, but how can you determine capacity, even roughly, from just that data? Cell voltage is an extremely poor indicator of state of charge for all lithium chemistries. Apart from measuring amp-hours out vs amp-hours in, and correcting for Peukert, temperature etc., I'm not sure that there's any other way of trying to determine capacity, even roughly. My guess is that some means of doing a pulse discharge measurement of cell Ri might possibly work, but even then I think it might be difficult to calibrate this accurately enough to make it useful.

The open circuit terminal voltage seems to vary from cell to cell, and not in direct relationship to cell capacity, as far as I can tell. I've been monitoring a pack for a year or so now, and a couple of cells are consistently a bit lower than the others when the surface charge has had time to settle. Those cells aren't the lowest or highest capacity cells in the pack though, they're both pretty much in the middle.

If you can come up with a way to determine remaining capacity, then that would be useful. Having tried it with my battery meter project I think it may be harder than it looks!

Jeremy
 
I agree with Jeremy here; OC voltage is very variable. I've had several multi-cell packs, from the earliest TS cells to the latest A123 ones, and they do vary in terminal voltage in a way that does not relate to actual capacity. The higher terminal voltage ones do IMHO appear to have a higher IR, but are not always, as has been stated, the first to be exhausted or the lowest capacity. The only way to build up useful data for impending problems is with some sophisticated logging of cell data over a considerable period/number of cycles. It's pretty easy to determine which cell has least capacity, as it is the one that reaches the low V cutoff first consistently over a number of cycles. I don't think SoC has to be accurate to the nth degree either. The SoC is a guide, and with any reasonable Ah counting over the short periods of EV charge and discharge, the error in PIC timing is negligible. If you can detect the current flowing with any accuracy and have a simple one-second timer, then amp-seconds is pretty easy to implement. IMO with LiFePO4 there is no need to do any fancy temp/Peukert compensation.
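The amp-second counting mentioned above really is simple to implement. Here is a minimal Python illustration, assuming a once-per-second current sample and a hypothetical 10Ah pack (the names and capacity are made up for the example):

```python
# Minimal coulomb-counting sketch: sample current once per second and
# accumulate amp-seconds. The capacity figure is an assumed example value.

PACK_CAPACITY_AS = 10.0 * 3600  # assumed 10Ah pack, in amp-seconds

class CoulombCounter:
    def __init__(self, capacity_as=PACK_CAPACITY_AS):
        self.capacity_as = capacity_as
        self.remaining_as = capacity_as  # assume we start from a full charge

    def sample(self, amps, dt_s=1.0):
        """Call once per timer tick; positive amps = discharge, negative = charge."""
        self.remaining_as -= amps * dt_s
        # clamp so sensor noise can't push the estimate outside 0..100%
        self.remaining_as = max(0.0, min(self.capacity_as, self.remaining_as))

    def soc_percent(self):
        return 100.0 * self.remaining_as / self.capacity_as

cc = CoulombCounter()
for _ in range(1800):      # half an hour at a steady 4A draw
    cc.sample(4.0)
print(round(cc.soc_percent(), 1))   # 2Ah used from a 10Ah pack -> 80.0
```

On a PIC the same arithmetic would be done in integer amp-second units, but the structure is identical: one accumulator, one timer tick.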
 
Alan B said:
Almost all Battery Management Systems provide charge control and low voltage cutoff. What else should they do? What would take them into the "next generation"?

It would be nice if it would tell us about a problem "before it happens". When we key on, it would be good to get some idea of the condition of the pack and especially the weakest cells. If the pack has been resting, and we know the temperature, capacity and chemistry type then the remaining capacity should be roughly predictable.
As said, that's not really easily possible (if at all) from that information. You'd need to keep all the previous usage data stored within the BMS, for every cell, so that the BMS could do trend analysis periodically. It could do this at startup or at shutdown, perhaps, or during idle time when the vehicle is normally shut off. For the latter, it would do the analysis, store the records, set a flag to tell you about it when you next start the vehicle, then shut itself down (preferably with a relay that cuts it off from the pack preventing drainage during disuse).

That way it could tell you about cells that have higher Ri, based for instance on the voltages at end of charge and end of discharge, and during high current draws. It could tell you about lower capacity cells that, even after balancing, still hit LVC sooner than the others at end-of-discharge. It could tell you about cells that tend towards higher or lower voltages and/or capacities under various temperatures and/or current loads than other cells do. Stuff like that. Then you can do what you like with the information.

If you program it to interpret that information for the average person, the display of pack cells could simply light up in amber the cells that are not quite as good as the others (which would all be green). Cells determined to be beyond some tolerance percentage you've programmed could light up red, as needing replacement.
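The green/amber/red idea could be sketched roughly like this (Python for clarity; the tolerance thresholds and the choice of the pack median as the reference are assumptions for illustration):

```python
# Illustrative colour-coding of cells by trended effective capacity:
# compare each cell against the pack median and flag by programmed tolerance.
from statistics import median

def cell_colours(capacities_ah, amber_pct=5.0, red_pct=15.0):
    """Return a colour per cell based on shortfall from the median capacity."""
    ref = median(capacities_ah)
    colours = []
    for cap in capacities_ah:
        shortfall = 100.0 * (ref - cap) / ref
        if shortfall >= red_pct:
            colours.append("red")      # beyond tolerance: flag for replacement
        elif shortfall >= amber_pct:
            colours.append("amber")    # not quite as good as the others
        else:
            colours.append("green")
    return colours

print(cell_colours([10.1, 10.0, 9.0, 8.0]))
```

The real work, of course, is in producing trustworthy per-cell capacity trends to feed in; the display logic itself is trivial.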


Another thing that I'd like to see in a BMS is one that doesn't do a shunting balance charge, but rather one that charges the pack until the first cells reach HVC, then switches off the main pack charger and initiates a balance charge with individual chargers for each cell. Those don't have to be as fast a charge as the main pack charge, so even in total they can be smaller than it. Or a single one (or a handful of them) can be switched around from cell to cell, automatically by the BMS, until the pack is really full.

It's "safer" on the cells than a shunting balancer, which will still potentially trickle current through the already-full cells, which could overfill them if conditions are right (wrong) and it goes on long enough. It's also less wasteful of power, and should generate a lot less waste heat.

Even for a really huge pack, like for long-range car-sized EVs, it should still be practical to do that way.
 
amberwolf said:
Another thing that I'd like to see in a BMS is one that doesn't do a shunting balance charge, but rather one that charges the pack until the first cells reach HVC, then switches off the main pack charger and initiates a balance charge with individual chargers for each cell. Those don't have to be as fast a charge as the main pack charge, so even in total they can be smaller than it. Or a single one (or a handful of them) can be switched around from cell to cell, automatically by the BMS, until the pack is really full.

It's "safer" on the cells than a shunting balancer, which will still potentially trickle current through the already-full cells, which could overfill them if conditions are right (wrong) and it goes on long enough. It's also less wasteful of power, and should generate a lot less waste heat.

Even for a really huge pack, like for long-range car-sized EVs, it should still be practical to do that way.

There's no reason why a shunt-based charge management strategy can't do this. In practice, even for a fairly big pack (my large pack is 80Ah), a couple of hundred mA of balancing charge brings the lowest sub-pack up to the same voltage as the others after maybe an hour at most (usually within half an hour). Dissipation with a couple of hundred mA is only 0.73 watts (for LiFePO4) per cell shunt, so no big deal, and almost certainly less than a separate cell charger would use/dissipate.

If all the cells (or sub-packs) are at the right terminal voltage at end-of-charge, then they are all full to their individual cell capacities. Very, very little current flows through the fully charged cells during balancing; virtually all the balancing current flows through the shunts in practice. The couple I measured were drawing less than 1mA when sitting at 3.65V with an active shunt across them pulling close to 200mA. The risk of damage to cells is near-zero as long as their terminal voltage is kept strictly within allowable limits. You can't 'overfill' a lithium cell - once it reaches the clamped terminal voltage it simply stops accepting charge (which is why the terminal voltage, if left unclamped, rises so quickly at end-of-charge).

Jeremy
 
Alan B said:
I would like enough accuracy on charge state for a gas gauge, not necessarily a lot more. If we measure current out of the whole pack, and have access to all cell voltages, temperatures, and charge/discharge history I would think we could do something useful.

That was precisely the goal of my project, reported on another thread here. If you measure current per unit time out, then that's plenty good enough; there's no merit in measuring temperature for this function, primarily as there's no reliable algorithm that will use it to usefully refine accuracy, AFAIK. There's no real advantage in measuring current per unit time in, either, unless you have regen and want to allow for it with some rough recharge efficiency factor.

The problem with measuring current in and out as the means of tracking remaining capacity arises when you try to make cumulative measurements over, say, several charge/discharge cycles. The errors accumulate rapidly, making the meter inaccurate pretty quickly.

One possible way to get around this limitation is to reset the capacity gauge to 'full' every time the pack is charged (assuming that it will always be fully charged, not just topped up), using the 'all shunts active' signal from the BMS. You can then use the first cell LVC event to trigger the minimum capacity measurement and use that to calibrate the gauge. The gauge stores the effective capacity (the Ah used between fully charged and the first LVC event) and uses that to set the dynamic capacity range displayed. Every time the battery is run down to an LVC event the gauge recalibrates capacity using the stored measurement of current per unit time taken out between charge and this point.

This is pretty much the approach used on the 'energy meters' used on some EVs I believe - I'm pretty sure that the Citroen/Peugeot EVs use a variation on this theme to monitor their NiMH packs, by resetting their meters whenever they do a full balancing charge.

The downside with this approach is that running the pack down to a cell LVC event intrinsically risks reducing pack life, plus many users may never use this much capacity from a pack. There is a lot of evidence that using lithium chemistry packs in the mid-part of their capacity range gives longest life (say from 10% to 95% or so). The smaller the dynamic usable range of capacity that's used, the greater the cycle life.

This then leads me to conclude that, for a usable 'fuel gauge' type solution, it's probably OK to take the simplistic approach of pre-programming a slightly low effective battery capacity (of maybe 85% or 90% of the pack nominal capacity) and then simply using a strategy that always resets the gauge to 'full' on charge and subtracts the current per unit time used from this stored figure. This is the strategy I've finally adopted and it looks like it will work adequately as a 'fuel gauge'. It would even have the built in 'reserve' that a lot of car fuel gauges have (i.e. it wouldn't really be 'empty' when it says it is!).

Jeremy
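Jeremy's gauge strategy above (reset to 'full' on the all-shunts-active signal, count amp-hours out, recalibrate effective capacity on the first cell-LVC event, and start from a deliberately low effective capacity for a built-in reserve) can be sketched as follows. All names and the 85% starting figure are illustrative, not taken from his actual meter:

```python
# Sketch of a fuel gauge that resets on full charge and self-calibrates on
# LVC events, per the strategy described in the post above. Illustrative only.

class FuelGauge:
    def __init__(self, nominal_ah):
        # start with a deliberately low effective capacity (built-in reserve)
        self.effective_ah = 0.85 * nominal_ah
        self.used_ah = 0.0

    def on_all_shunts_active(self):
        """Full balancing charge complete: reset the gauge to 'full'."""
        self.used_ah = 0.0

    def on_discharge(self, amps, hours):
        self.used_ah += amps * hours

    def on_first_lvc(self):
        """First cell hit LVC: the Ah used since full *is* the usable capacity."""
        self.effective_ah = self.used_ah

    def percent_remaining(self):
        return max(0.0, 100.0 * (1.0 - self.used_ah / self.effective_ah))

g = FuelGauge(nominal_ah=20.0)        # effective capacity starts at 17.0Ah
g.on_all_shunts_active()
g.on_discharge(amps=17.0, hours=0.5)  # 8.5Ah used
print(round(g.percent_remaining()))   # 50
```

Resetting on every full charge is what stops the integration errors accumulating across cycles, which is the weakness of pure coulomb counting noted earlier in the thread.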
 
Sounds pretty reasonable. Might link to that thread here. Don't recall running across that one.

One might have an algorithm setting the capacity to something like 90% when the "balancing" phase begins during charging, 95% when all cells reach max V, and then 100% when they have "soaked" for a period of time at max V. Might want a charging current measurement to integrate charge before hitting balancing.

Setting the initial capacity to 80-90% of the ratings is reasonable, giving a "reserve" and staying out of the bottom capacity. Tripping LVC would add capacity calibration data if (and only if) the pack was fully charged this cycle.

I see manufacturers' charts that show temperature vs capacity changes; that factor would be easy to apply to the result.

The other uses of temperature would be more dynamic. A temperature gauge for controller, motor and batteries would be quite useful. I would like to know when my motor is getting hot, or the controller or batteries, especially when climbing hills. One can adjust speed, etc. to protect the equipment.
 
Guys, there are a lot of app notes by Texas Instruments describing their Impedance Track algorithms. Those do indeed figure out both the state of health and state of charge of a pack, based on the voltage changes between various states (relaxed, charging, discharging etc.). A somewhat simpler approach is to count charge and apply various compensations for temp/rate/age etc., but that gets messy quickly. So it is indeed possible, but it does require a few PhDs here and there, or using their gas gauges to do the heavy lifting :)
 
Thanks for the info. Have to look that up and put some links here.

Had a long flight to think and sketched out some thoughts on this. Rather than try to make a perfect gauge, perhaps a better way to start is to display selected data and let the fuzzy operator do the correlation. Like:

Lowest cell V
Lowest recent cell V - most valuable data?
Pack V
Pack recent low V
Recent max current

Ideas?
 
A gas gauge that depends on the user to interpret the data can make things significantly easier, but I wouldn't even consider your next step until you've thoroughly reviewed the TI Impedance Track app notes for several of their devices, all the way up to the bq78PL114. TI has spent years perfecting these devices, and from experience I can say that they work very well.

I think a review of the algorithms involved will help you a lot in deciding what features to include. Letting an established gas gauge do all the ugly math while you concentrate on the overall specs (how you'll respond to the gas gauge data), safety features and human interface stuff can make creating a great BMS a lot easier. It's really no different than using packaged ADCs rather than all sorts of discrete components, or using a packaged OVP chip instead of an op-amp, resistors, FET drivers, etc. :)
 
I would recommend searching for them on the TI web site for the best hits. Here are a few to start:
http://focus.ti.com/lit/wp/slpy002/slpy002.pdf
http://focus.ti.com/lit/an/slua450/slua450.pdf
http://focus.ti.com/lit/an/slua529/slua529.pdf
http://focus.ti.com/lit/an/slua534/slua534.pdf
http://focus.ti.com/lit/er/sluu386/sluu386.pdf
http://focus.ti.com/lit/an/slua421a/slua421a.pdf
http://focus.ti.com/lit/an/slua466/slua466.pdf
https://focus.ti.com/seclit/an/slua404/slua404.pdf
http://focus.ti.com/lit/ug/sluu330b/sluu330b.pdf
http://focus.ti.com/lit/an/slua556/slua556.pdf
http://focus.ti.com/lit/an/slua537/slua537.pdf
http://focus.ti.com/lit/an/slua524a/slua524a.pdf

I also recommend checking the product folder page for each of the gas gauges to see what document links they have. The Technical Reference manuals have lots of info too.

With the large number of settings available for the gas gauges, it can seem overwhelming at first. But, it's these settings that make them so powerful. Instead of modifying your hardware or firmware for each new test/setting/cell you want to try, just change a setting via the chip's PC software! This lets you get up and going very, very quickly with a BMS design (especially if using or modifying a TI reference design, i.e., one of their EVM designs) and then very easily change the settings to best match different needs without having to modify firmware, etc.
 
If you use LiCoO2 or LiMnO2 based batteries, you can know the exact state of charge simply by taking a cell voltage reading when at zero current draw and a known temperature. ;)
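As a rough illustration of that resting-voltage approach, here is a table-interpolation sketch in Python. The OCV figures below are placeholder values, not calibrated data; a real table would come from the cell datasheet at a known temperature:

```python
# Illustrative resting-voltage SoC estimate for an LiCoO2-type cell using
# linear interpolation over an open-circuit-voltage table. The table values
# are placeholders for the example, not datasheet figures.

OCV_TABLE = [  # (open-circuit volts at rest, SoC %)
    (3.00, 0), (3.60, 10), (3.70, 30), (3.80, 50),
    (3.90, 70), (4.05, 90), (4.20, 100),
]

def soc_from_ocv(volts):
    """Linear interpolation; clamps below/above the table ends."""
    if volts <= OCV_TABLE[0][0]:
        return 0.0
    for (v0, s0), (v1, s1) in zip(OCV_TABLE, OCV_TABLE[1:]):
        if volts <= v1:
            return s0 + (s1 - s0) * (volts - v0) / (v1 - v0)
    return 100.0

print(round(soc_from_ocv(3.75)))  # midway between 30% and 50% -> 40
```

Note this only works at genuinely zero current and after the cell has rested; for flat-curve chemistries like LiFePO4 the middle of the table would be far too flat to be useful, as discussed earlier in the thread.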
 
Alan B said:
Almost all Battery Management Systems provide charge control and low voltage cutoff. What else should they do? What would take them into the "next generation"?...

If you really want to take a BMS to the next level, then have that sucker balance during use near the bottom, i.e. squeeze every drop out of the pack by using the extra capacity of the best cells to support those with the lowest capacity, so that if you hit LVC (which hopefully is rare) it's actually the pack's LVC, not just the LVC for one cell. Until then a BMS is just a Battery Monitoring System instead of a truly useful Battery Management System, and the pack is only as good as its worst parallel string. 8)

John
 
John in CR said:
Alan B said:
Almost all Battery Management Systems provide charge control and low voltage cutoff. What else should they do? What would take them into the "next generation"?...

If you really want to take a BMS to the next level, then have that sucker balance during use near the bottom, i.e. squeeze every drop out of the pack by using the extra capacity of the best cells to support those with the lowest capacity, so that if you hit LVC (which hopefully is rare) it's actually the pack's LVC, not just the LVC for one cell. Until then a BMS is just a Battery Monitoring System instead of a truly useful Battery Management System, and the pack is only as good as its worst parallel string. 8)

John

This would take very high current handling components in the BMS, probably too large and complex to be practical. Piddling in a few milliamps over hours is quite a different story from selectively sucking 20 to 40 amps out of individual cells in milliseconds. A great idea; I just can't imagine the components that would make it work.
Gordo
 
What I ponder on occasion is a complete power and pack management system. Each cell would be by itself, not series or parallel connected, but instead just feed into the system. That system would connect cells as needed to meet the demands of the moment and disconnect them when they became discharged or started showing problems that would affect performance and/or result in further damage to the cell.

It would probably be not only expensive, but large and likely inefficient, but I wonder what it would enable, amongst the many things that limit us right now. I'm sure it's cheaper to just buy a pack that's 20-50% larger than you know you will need, to allow for runt cells and the like, and buy a few spare cells in case some die.

I'd say one of the biggest limitations is that if you have a runt cell, meaning one that is lower capacity but otherwise not underperforming, then your whole pack is done for the ride when that cell hits LVC. If it is 10-20% lower in capacity than the others, you're seriously shorted on range. Right now, that would mean manually taking apart your pack, disconnecting that cell, and wiring across its terminals so the rest of the pack remains in series and can be used the rest of the way.

With many packs, that's impossible to do on the road (or severely impractical). It could mean untaping a brick of cells and finding the runt cell, then disconnecting it from the pack, and shuffling all the BMS wires too, so that the BMS doesn't continue to cut off simply because there is 0V on that cell's terminal. Or wiring around the BMS entirely, if it won't work without all the cells in place.

With "Amberwolf's PMS" ;) the runt cell could be kept in a pack without significantly lowering the range, because it would be automatically bypassed once it reaches its cutoff point. There'd be no need to keep shuffling charge around in the pack, because discharged cells would be removed as that happens.

The voltage of the system would go down, but even that could be compensated for by always having more cells than needed to start with, expecting that a few will become runts. Then the system could be limited on output by the motor controller (via a throttle limitation) to the max voltage needed to get to the max speed you'd originally set it up for. Or alternately, simply switch out the runt cells for the good spare cells, instead of leaving the good spares inline at all times.

There would also be no need to shunt-balance, as all the cells could be connected in parallel for charging, with the caveat that it would probably have to individually bring up the lowest cells first so that the parallel charge connection didn't result in slagging the cell connections, cells, or terminals. :)

I had some other things in mind when I started typing, but I have already forgotten them. :(
 
Gordo and amberwolf,
Some of these concerns can be addressed with a BMS that balances cells by capacity, not voltage (e.g., as can be done with TI's bq78PL114). It's actually kind of creepy watching it work. :)

Depending on the initial imbalance, it might take a couple of cycles to balance the cells out, but after that the BMS should have no problem keeping up with any capacity imbalance that forms due to differences between the cells.
 
I've done a bit of reading on those TI chips, and they do some amazing stuff--I think they are a good start towards what I am after. :)

I still can't remember exactly what it was I was thinking of and heading towards when I started typing that post, and it is really irritating me, because I can *almost* see it! But what I wanted couldn't be done unless the cells can be reconfigured on the fly, and I just can't remember exactly what that was. :(
 
CamLight said:
I would recommend searching for them on the TI web site for the best hits. Here are a few to start:
http://focus.ti.com/lit/wp/slpy002/slpy002.pdf
http://focus.ti.com/lit/an/slua450/slua450.pdf
http://focus.ti.com/lit/an/slua529/slua529.pdf
http://focus.ti.com/lit/an/slua534/slua534.pdf
http://focus.ti.com/lit/er/sluu386/sluu386.pdf
http://focus.ti.com/lit/an/slua421a/slua421a.pdf
http://focus.ti.com/lit/an/slua466/slua466.pdf
https://focus.ti.com/seclit/an/slua404/slua404.pdf
http://focus.ti.com/lit/ug/sluu330b/sluu330b.pdf
http://focus.ti.com/lit/an/slua556/slua556.pdf
http://focus.ti.com/lit/an/slua537/slua537.pdf
http://focus.ti.com/lit/an/slua524a/slua524a.pdf

I also recommend checking the product folder page for each of the gas gauges to see what document links they have. The Technical Reference manuals have lots of info too.

With the large number of settings available for the gas gauges, it can seem overwhelming at first. But, it's these settings that make them so powerful. Instead of modifying your hardware or firmware for each new test/setting/cell you want to try, just change a setting via the chip's PC software! This lets you get up and going very, very quickly with a BMS design (especially if using or modifying a TI reference design, i.e., one of their EVM designs) and then very easily change the settings to best match different needs without having to modify firmware, etc.

Thanks for the links!
 