Hummina Shadeeba said:
how much degradation would happen in ion or iron cells that maybe touch this temp for a couple minutes?
Do you know the "BTU", friend?
A British thermal unit (BTU) is a measure of heat: the amount of heat needed to raise one pound of water (at its maximum density) through one degree Fahrenheit, equivalent to about 1.055 × 10³ joules.
A watt is also equivalent to 3.4121 British thermal units per hour. Cells create heat through their internal resistance. That heat can be calculated on paper and measured empirically with a calorimeter. Resistance can be measured. Manipulated. Designed around.
If you do not create enough heat (in watts, or BTU per hour), you will never reach that temperature, given the heat capacity of the environment and its conduction, convection, and radiation. A BTU is a tiny bit of heat, and not many BTU are created through the resistance of the typical 18650 circuit: from Ohm's law (V = IR), the heating power is P = I²R.
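As a back-of-envelope sketch of that P = I²R point, here is the heat from a single 18650 under load, converted to BTU/h with the 3.4121 factor above. The internal resistance and current are illustrative assumptions, not values from this thread:

```python
# Resistive self-heating of one 18650 cell, P = I^2 * R.
# Assumed figures (not from the post): 30 milliohm internal resistance, 10 A draw.
R_INTERNAL = 0.030   # ohms (assumption)
CURRENT = 10.0       # amps (assumption)

watts = CURRENT ** 2 * R_INTERNAL   # heat generated, in watts
btu_per_hour = watts * 3.4121       # 1 W = 3.4121 BTU/h

print(f"{watts:.1f} W of heat = {btu_per_hour:.1f} BTU/h")
```

Only a few watts, a handful of BTU per hour, which a lone cell in room air sheds easily.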
Unless the mass is large enough, per amp-hour of volume, at the specified Wh/kg density. A cell is designed to dissipate a given heat load (in BTU) within its design limits, so it can shed that heat and keep providing current, at a typical room temperature (59 °F).
How do you make a cell 140 °F?
Place it in a 140 °F environment, or place it somewhere it cannot shed its heat, such as in a mass of cells rather than by itself. The mass's ability to convect, conduct, and radiate is compromised, and heat builds because it cannot be dissipated to the cooler surroundings.
This mass is measured in Ah (with Wh/kg as the density) and the heat load in W (watts, i.e. BTU per hour of heat): W per Ah.
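The single-cell vs. buried-in-a-pack point can be sketched with a lumped thermal model: at steady state, heat in equals heat out, so the temperature rise above ambient is P divided by the cell's combined heat-shedding coefficient. Both coefficients below are illustrative assumptions, not measured pack data:

```python
# Lumped steady-state model: same heat load, different ability to shed it.
# All coefficients are illustrative assumptions, not measured values.
P_WATTS = 3.0        # steady I^2*R heat per cell (assumption)
T_AMBIENT_F = 59.0   # the room temperature used above

def steady_temp_f(h_a_w_per_c):
    """Heat in = heat out at equilibrium, so rise (degC) = P / (h*A)."""
    rise_c = P_WATTS / h_a_w_per_c
    return T_AMBIENT_F + rise_c * 9.0 / 5.0   # convert the rise to degF

lone_cell = steady_temp_f(0.20)    # free air, good convection (assumption)
packed_cell = steady_temp_f(0.04)  # mid-pack, shedding compromised (assumption)
print(f"lone cell ~{lone_cell:.0f} F, packed cell ~{packed_cell:.0f} F")
```

Same watts of heat, but the mid-pack cell, unable to convect or radiate freely, settles far hotter, which is exactly how a cell reaches 140 °F without a 140 °F environment.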