Volts, amps, watts and breakers
Discussion
What causes a breaker to trip? Is it purely the amps or is it the total power that it responds to?
We had an issue with the power supply to the factory where I work. The voltage supplied to site was low, around 210V during peak hours. This would cause issues where motors would trip and our EV charger would cut out (meaning the directors' cars wouldn't charge...).
We have a 2000Amp supply, and previously when at full production the max draw on the incoming meter showed 1950Amps, which has always been on my mind as it's a little close for comfort.
The voltage issues were cured by changing the tappings on our transformer, and it now sits at a far healthier 235V; the issues have gone away and we are all good now.
While we haven't gone to full production since the change (we produce a fast-moving seasonal consumable), the current values are lower than I am used to seeing on the meter - which is understandable from the power equation rather than Ohm's law: loads like motors and chargers draw roughly constant power, so volts up means amps down.
So, just out of interest, I used an online calculator (as my maths isn't the most reliable) to work out the power consumed before the transformer adjustment:

This gives a power figure of about 410kW (210V × 1950A = 409.5kW).
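To double-check that without the calculator, here's a minimal sketch. It assumes straight single-phase V × I arithmetic at unity power factor (so strictly it's apparent power in kVA rather than true kW):

# Power drawn before the tap change.
# Assumes single-phase V x I at unity power factor (so really kVA, not kW).
volts_before = 210.0  # peak-hours supply voltage, V
amps_peak = 1950.0    # max draw seen on the incoming meter, A

power_before_kw = volts_before * amps_peak / 1000.0
print(f"{power_before_kw:.1f} kW")  # -> 409.5, i.e. about 410kW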
If I then use that power with the new increased voltage:

It suggests that the new current at maximum consumption will be about 1740Amps, which is roughly 210Amps less - which is good, as it means there's an increased buffer between the max consumption and the incoming breaker size.
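The same check run forwards, again assuming the load keeps drawing constant power:

# Same power at the new, higher voltage -> lower current.
power_w = 409.5e3    # power before the tap change, W
volts_after = 235.0  # supply voltage after the tap change, V

amps_after = power_w / volts_after
print(f"{amps_after:.0f} A")  # -> ~1743, roughly 210A below the old 1950A peak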
But I started considering what actually causes the breaker to trip. I know the overload element is thermal: the current heats it until it "pops" - open circuit.
I have always understood that heating to be a result of current flowing through resistance, i.e. I²R (hence long-distance transmission is done at very high voltage: for the same power the current is far lower, so the losses are far lower too).
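That transmission point is easy to put numbers on (a sketch with a made-up line resistance, purely for illustration):

# Delivering the same power at two different voltages through the same
# (hypothetical) line resistance: losses scale with current squared.
r_line = 0.01      # ohms - made-up illustrative value
power_w = 409.5e3  # power to deliver, W

for volts in (210.0, 25_000.0):
    amps = power_w / volts
    loss_w = amps**2 * r_line
    print(f"{volts:>8.0f} V -> {amps:7.1f} A, {loss_w:10.1f} W lost in the line")
# ~119x less current at 25kV, so ~14,000x less heat in the same resistance.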
Then I thought: now the voltage is higher, what would the power consumption be if the current draw were still the same 1950Amps:

So if the voltage is now 235V and the current draw were still 1950Amps, the total power would be about 460kW (235V × 1950A ≈ 458kW).
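Checking that the same way (same single-phase, unity-power-factor assumption as before):

# Same 1950A peak, but at the new 235V.
volts_after = 235.0
amps_peak = 1950.0

power_at_peak_kw = volts_after * amps_peak / 1000.0
print(f"{power_at_peak_kw:.1f} kW")  # -> 458.3, call it 460kW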
This implies to me that we could use nearly an additional 50kW of power (458kW vs 410kW) and still be within the limits of our main incoming breaker and distribution equipment.
Am I missing anything, or have I got this drastically wrong?
Basically the question is whether the breaker would actually care about that additional ~50kW?
Surely if the current is within its current rating then that's all that matters?
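As a last sanity check on that logic, here's a sketch of why the breaker shouldn't notice the voltage change at all. The element resistance below is a made-up placeholder (real breakers publish trip curves, not resistances):

# Heating in the breaker's thermal element depends only on the current
# through it, not on the supply voltage behind it.
r_element = 0.0001  # ohms - hypothetical internal resistance of the element

for volts in (210.0, 235.0):
    amps = 1950.0
    heat_w = amps**2 * r_element  # I^2 * R dissipated in the element
    print(f"{volts} V supply at {amps:.0f} A -> {heat_w:.2f} W of element heating")
# Identical heating in both cases: same current, same trip behaviour,
# regardless of the extra ~50kW the higher voltage delivers to the load.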