Heat from amps, but what of watts?

bobhurd3d said:
I²R heating when pulling high currents will damage power wiring...

Looks more like a dodgy termination, as the opposite wire carried the same current but passed the melt test.

A connection concentrates its extra resistance right at the joint, so a bad connection heats the connector only, exactly like the picture shows.

Overcurrent, by contrast, has the sheathing soften and fall away along the length of the cable as the core warms up.
 
Photo0061.jpg

A while back I was called out to something similar. The heat had built up from a bad connection; in that case the lug had not been torqued down. One loose bolt cost thousands in damage. It kept going until a fire alarm went off and someone manually shut it down, but by then the casings were too far gone to be reused and signed off.

Point being: connections are the enemy in all systems.

What had happened here was that the lug's surface area was in full demand. The lug had been specified to its limit, so any mistake in fitting, or any degradation of the mating faces over time, means heat will creep in at that area.

In that picture the panel was 2 days old, so that's how I can diagnose it so well. But the heat could have loosened the bolt, so it being finger tight wouldn't always give me an answer Quincy would be happy with; I have to use my spider sense.
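The distinction above can be sketched numerically: at the same current, a few extra milliohms of contact resistance dump all their heat right at the lug, while the cable's resistance spreads its heat over the whole run. The resistance values here are illustrative assumptions, not measurements from the failure described.

```python
# Illustrative comparison: heat at a lug joint vs along the cable
# at the same current. All resistance values are assumed.

current = 100.0        # A flowing through the circuit

good_contact = 0.0002  # ohm: clean, correctly torqued lug (assumed)
bad_contact = 0.005    # ohm: loose / degraded mating faces (assumed)
cable = 0.004          # ohm: total resistance of the cable run (assumed)

def i2r(i, r):
    """Resistive power dissipation P = I^2 * R, in watts."""
    return i * i * r

print(f"good joint:  {i2r(current, good_contact):5.1f} W at one point")
print(f"bad joint:   {i2r(current, bad_contact):5.1f} W at one point")
print(f"whole cable: {i2r(current, cable):5.1f} W spread along its length")
```

The bad joint dissipates tens of watts in one spot a few millimetres across, which is why the damage stays local to the connector.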
 
Hummina Shadeeba said:
If I pass 100 amps at 1 volt through a resistor it’s the same wattage as if 1amp through a much bigger resistor at 100 volts, and heat is measured in watts so should be the same temp produced no?

No, resistive heating is caused by and proportional to current flow.
(100 A)² × 1 Ω = 10,000 W
(1 A)² × 1 Ω = 1 W

Hummina Shadeeba said:
But only amps create heat.
Only amps create resistive heating.
 
Hummina Shadeeba said:
It’s all resistive heating right? Regardless of the voltage the amount of amps that flow are the cause of the heat
No. Well, maybe if you're only looking at a resistor.
Controllers also experience switching and static losses, and motors also experience hysteresis and eddy current losses.
 
Hummina Shadeeba said:
It’s all resistive heating right? Regardless of the voltage the amount of amps that flow are the cause of the heat

No current flows if there isn’t a voltage.
 
Please bear with me.

If I have a resistor burn 1amp at 100v it’s 100watts
If I have a resistor burn 100amps at 1v it’s also 100watts.
Both 100 watts but one is 100x more heat produced right? Doesn’t make sense to me.
 
100W = 100W, the heat is exactly the same.

Power dissipation is the product of current and voltage. P = I * V

Ohms law tells us that V = I * R. That's how you can rearrange power dissipation to get the equation P = I * I * R.

Just because you don't see the "V" doesn't mean it's not there, it's just been substituted out.
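Addy's substitution can be checked numerically for both of the resistors in the question (each resistor's value follows from Ohm's law as V/I):

```python
# Check that P = I * V and P = I^2 * R give the same answer
# for both scenarios in the question.

cases = [
    (1.0, 100.0),   # 1 A at 100 V
    (100.0, 1.0),   # 100 A at 1 V
]

for i, v in cases:
    r = v / i        # Ohm's law: the resistor each case implies
    p_iv = i * v     # P = I * V
    p_i2r = i**2 * r # P = I^2 * R, with V substituted out
    print(f"I={i:5.0f} A, V={v:5.0f} V -> R={r:6.2f} ohm, "
          f"P(IV)={p_iv:.0f} W, P(I2R)={p_i2r:.0f} W")
```

Both forms give 100 W in both cases, as they must: they are the same equation.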
 
Hummina Shadeeba said:
Please bear with me.

If I have a resistor burn 1amp at 100v it’s 100watts
If I have a resistor burn 100amps at 1v it’s also 100watts.
Both 100 watts but one is 100x more heat produced right? Doesn’t make sense to me.

These resistors aren't hypothetical. You can calculate the value of each resistor using ohm's law.

V = I * R

You have V and I, so just divide V/I to get R.
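Working those numbers through (a quick sketch, using only the values given in the question):

```python
# Ohm's law gives the resistor each scenario implies: R = V / I.

v1, i1 = 100.0, 1.0   # 1 A at 100 V
v2, i2 = 1.0, 100.0   # 100 A at 1 V

r1 = v1 / i1          # 100 ohm resistor
r2 = v2 / i2          # 0.01 ohm resistor

print(f"case 1: R = {r1} ohm")
print(f"case 2: R = {r2} ohm")
```

The two resistors differ by a factor of 10,000, which is why a single 1 Ω value cannot describe either case.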

Addy said:
100W = 100W, the heat is exactly the same.

Power dissipation is the product of current and voltage. P = I * V

Ohms law tells us that V = I * R. That's how you can rearrange power dissipation to get the equation P = I * I * R.

Just because you don't see the "V" doesn't mean it's not there, it's just been substituted out.

Addy is right.

fatty said:
Hummina Shadeeba said:
If I pass 100 amps at 1 volt through a resistor it’s the same wattage as if 1amp through a much bigger resistor at 100 volts, and heat is measured in watts so should be the same temp produced no?

No, resistive heating is caused by and proportional to current flow.
(100 A)² × 1 Ω = 10,000 W
(1 A)² × 1 Ω = 1 W


....

Fatty is misunderstanding the question. A 1Ohm resistor will not drop 1V at 100A, nor will it drop 100V at 1A, so 1Ohm cannot be used to calculate the power loss in the resistor for either case.
 
thepronghorn said:
Fatty is misunderstanding the question. A 1Ohm resistor will not drop 1V at 100A, nor will it drop 100V at 1A, so 1Ohm cannot be used to calculate the power loss in the resistor for either case.
To the contrary -- I don't think the OP is referring to 1V or 100V voltage drop, but rather a nominal supply voltage.
Per my second reply, I'm assuming the question is regarding resistive heating in a motor or wire. EBike Technical, and all...
 
Yes, a supply voltage of 1 vs 100. And assuming resistive heating. Heat is measured in watts, so I assume a 100 V supply with 1 amp vs a 1 V supply with 100 amps would produce the same heat. But amps make heat, not voltage.
 
So bear with me:
If my ESC is programmed to put out a peak of 50 amps to my motor by PWM-ing the pack voltage down to an effective voltage that produces that 50 amps through the winding resistance, how is a higher-voltage battery going to add more power? I thought the voltage going to the motor was solely determined by the desired current, as produced by the PWM.


Power = current x voltage.
I thought this was analogous to
speed x torque = power
and the voltage would be producing that speed with higher rpm. But if the motor isn't spinning faster due to the higher battery voltage, I don't see where the voltage aspect is showing itself in the power.

With a heater the current is passed AT the higher voltage, and it makes sense that that gives a higher wattage, but in the motor the voltage seems limited by the demand for current, no?

Or maybe in the motor it's just that a higher current ends up passing with a higher battery voltage, since by Ohm's law it will then flow more easily, and that's where the increased torque comes from?
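The controller behaviour being asked about can be sketched with a simplified DC motor model: the winding current is set by the applied voltage minus the back-EMF, divided by the winding resistance, and the controller picks a PWM duty cycle to hit its current limit. All values below are assumed for illustration; real controllers and motors are more complicated (inductance, switching losses, three phases).

```python
# Simplified DC motor sketch (assumed values):
#   I = (V_applied - back_emf) / R_winding
#   V_applied = duty * V_battery, set by the controller's PWM.

r_winding = 0.05   # ohm (assumed)
back_emf = 20.0    # V at the current rpm (assumed)
i_target = 50.0    # A current limit programmed into the ESC

def duty_for_current(v_batt, i, bemf, r):
    """Duty cycle needed to push current i through the motor (capped at 1)."""
    v_needed = bemf + i * r
    return min(v_needed / v_batt, 1.0)

for v_batt in (36.0, 72.0):
    d = duty_for_current(v_batt, i_target, back_emf, r_winding)
    print(f"{v_batt:.0f} V pack: duty {d:.3f}, applied {d * v_batt:.1f} V")
```

At this rpm both packs can supply the 50 A, so torque is the same. The difference appears as rpm rises: back-EMF grows with speed, and once back-EMF plus the I·R drop reaches the pack voltage the duty cycle saturates at 1.0 and current (hence torque) falls off. The higher-voltage pack keeps headroom to higher rpm, and rpm × torque is where the extra power shows up.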
 