There's no real Ah limit on the packs this design can be used with, but there are some practical limits on how much current the charger puts out. There are two functions a BMS needs to perform, at a minimum. One is low voltage protection, for each cell, and the second is to manage the charging process so that each cell can be charged to its own full level, at its own pace, just as if individual cell chargers were being used. That's all that really needs to happen. Many of the imported BMS designs also monitor and control discharge current, and will cut the negative power lead if some limit is reached. This current-limit function is included out of necessity, because most of these BMS boards were designed to be used with 1-2C-rated, so-called "duct tape" packs. They needed to keep users from killing the cells by pulling too much current out of them.
Richard and I did not include this function because, since all the current has to go through all the cells, all the time, current limiting can be done at the pack level, in the controller. Low voltage protection is the only thing that needs to be done at the cell level. Almost all controllers I'm aware of have some sort of ebrake input that can be connected to the BMS's LVC circuits, so that any one of them tripping will cause the controller to cut power and remove the load. So from a discharge point of view, the BMS doesn't care how big or small the pack is. It will simply tell the controller to cut power if it detects that any one cell, or block of paralleled cells, gets too low.
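The discharge-side logic above boils down to a simple "any cell too low → cut power" rule. Here's a minimal sketch of that idea in Python; the function name and the 2.1V threshold are illustrative assumptions on my part, not anything from the actual BMS circuitry:

```python
# Hypothetical sketch of the cell-level LVC -> ebrake logic described above.
# The threshold and names are made up for illustration only.
LVC_THRESHOLD = 2.1  # volts; an illustrative LiFePO4 low-voltage cutoff

def ebrake_signal(cell_voltages):
    """Return True (tell the controller to cut power) if any cell or
    block of paralleled cells has sagged below the LVC threshold."""
    return any(v < LVC_THRESHOLD for v in cell_voltages)

print(ebrake_signal([3.3, 3.2, 3.3, 3.1]))  # False: all cells healthy
print(ebrake_signal([3.3, 3.2, 1.9, 3.1]))  # True: one cell too low
```

Note how the pack size never enters into it: the rule only looks at individual cell voltages, which is exactly why the BMS doesn't care how big the pack is.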
Charging is a bit different. Basically, the BMS does nothing during the bulk constant current (CC) mode, where the charger is pumping out the max current it can deliver. Once any of the cell voltages reaches 3.68V, the shunt circuit for that cell starts operating, which basically holds the cell voltage at that limit. This is just like having individual constant voltage (CV) chargers on each cell. What is actually happening is that once the cell voltage is held at that point, the cell will start reducing the amount of current it can accept. Since all the current has to go through all the cells with a bulk charging setup, if one cell gets to this point before the others and starts limiting the current, it will also be limiting the current for the "slower" cells, which still need more current. What the shunts do is make sure there is at least 500mA of current bypassed to the next cell in series. How far the low cells still need to go, and how big the cells are capacity-wise, determine how long the "slowpokes" need to finish getting a full charge. Anyway, part of the logic that controls the shunts involves interrupting the charger current going to the pack. There is a single FET that controls this, and it is rated pretty high, current-wise, but the limiting factor will probably be the amount of current the traces on the PCB can handle. I think Richard decided the board was probably good for up to about 30A, but to go up to, say, 50A, you'd probably want to beef up the traces with solder, like they do with the higher-power controllers.
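Since the shunts guarantee roughly 500mA of bypassed current, you can do a rough back-of-envelope estimate of how long a "slowpoke" cell needs to finish. This little sketch is my own illustration, assuming the low cells end up charging at about that 500mA floor; the cell numbers are made-up examples, not measurements:

```python
# Rough finishing-time estimate for a low cell during the shunt/CV phase,
# assuming it charges at about the 500mA minimum bypass current described
# above. This ignores taper, so treat it as an upper-bound ballpark.
SHUNT_CURRENT_A = 0.5  # minimum bypassed current once a cell hits 3.68V

def finish_time_hours(remaining_ah):
    """Hours for a low cell to absorb its remaining capacity at ~500mA."""
    return remaining_ah / SHUNT_CURRENT_A

print(finish_time_hours(0.25))  # a cell 0.25Ah short -> about half an hour
print(finish_time_hours(1.0))   # a cell a full 1Ah behind -> about 2 hours
```

This is why both how far behind the low cells are and how big the cells are capacity-wise matter: a 1Ah shortfall takes the same time to make up whether the pack is 10Ah or 40Ah, but bigger cells can drift further apart in absolute Ah terms.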
What should also be considered is the pack voltage you are using these boards with. It is quite easy to daisy-chain more than one board's worth of channels together for higher voltage packs, but there is a limit on the difference between the charger's voltage (i.e. -- approximately 3.7V x # of cells in series...) and whatever the pack ends up at when fully drained (say 3.0V x # of cells...). I think this has something to do with the 12V regulator circuit for the charger/FET control logic. I don't know what this limit is, but I do know that increasing this number was pretty much the only difference between v2.2 and v2.3, the current version.
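To put some numbers on that voltage difference: using the 3.7V and 3.0V per-cell figures from above, the gap works out to about 0.7V per series cell, so it grows with the cell count. A quick sketch (the cell counts are just example values I picked):

```python
# Illustrative arithmetic for the charger-vs-drained-pack voltage swing
# that the board's regulator circuit has to tolerate. The 3.7V and 3.0V
# per-cell figures come from the text; the cell counts are examples.
def charger_minus_drained(cells_in_series):
    charger_v = 3.7 * cells_in_series   # approximate charger output
    drained_v = 3.0 * cells_in_series   # fully drained pack voltage
    return charger_v - drained_v        # swing the control logic sees

for n in (12, 16, 24):
    print(f"{n}s pack: {charger_minus_drained(n):.1f}V difference")
# 12s -> 8.4V, 16s -> 11.2V, 24s -> 16.8V
```

So whatever the actual limit is, higher series counts eat into it faster, which fits with that limit being the one thing raised between v2.2 and v2.3.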
Regarding how much cells get out of balance, it has been my experience that for healthy cells it really depends on how far down you discharge the pack. This seems to hold true for all of my a123-based "healthy" packs, and for the two PSI/BMI-based 12s/10Ah packs I use on my wife's bike. My wife rarely runs the pack down past the 25-30% SOC level, and when I have checked, the cells were all very close in voltage (within about .005-.007V...). I have also used these packs and drained them down to LVC cutoff, and have noticed the cells were a lot farther apart (.1-.2V mostly, with some as much as .5V lower...). I've seen the same sort of thing recently with some of my newer-built a123 packs. Most of my a123 packs were made up of a mix of mostly healthy cells, usually combined with one or two "stressed" cells in parallel. The healthy ones "help" the stressed ones, and the net result is that cell voltages, even right off the charger, are always all over the place. As long as you do cell-level low voltage protection, and you charge in a way that lets each block of cells get to its own "full" level, at its own pace, I couldn't care less about how well "balanced" they stay. Recently, however, I have begun reconfiguring all my a123 packs, and in the process I'm trying to weed out the weak/stressed cells. With the couple of reconfigured packs so far, I have seen pretty much the same thing as with the PSI/BMI packs. If I don't discharge down to cutoff, the cell blocks stay pretty close. If I go down to cutoff, I might see a 1-2% difference.
I have no idea if I answered your question, because I don't remember what it was, but hopefully this helps.
-- Gary