Electricity - sense check

anonymous-user

Original Poster:

54 months

Thursday 22nd February 2018
If you know about electrickery, please comment

I have been approached by a seller of a voltage optimiser that will save my electricity costs...

Here is an extract from the blurb

"In the UK, the average voltage supply to a property (domestic, commercial and industrial) is between 240v and 252v. Voltage Optimisation Units reduce the volts supplied to the property (usually 220v), saving you money on wasted electricity. Units can be made to suit the needs of all domestic, commercial and industrial customers and provide savings of up to 20%.

Across Europe, the typical electrical supply voltage is between 220 and 230v. However, in the UK, this is significantly higher, with the typical voltage supply between 240 and 252v. All electrical equipment designed for use in the UK and Europe must be capable of operating through the voltage range to comply with EU regulations – meaning that a large proportion of electrical appliances and equipment consume more energy when supplied with a higher voltage (this is much higher in the UK than across Europe). As a result, many appliances over use energy – resulting in energy wastage, higher energy bills and a reduction in the life expectancy of electrical appliances and equipment."

Am I right in thinking that this is a load of old gammon?

TooMany2cvs

29,008 posts

126 months

Thursday 22nd February 2018
JPJPJP said:
If you know about electrickery, please comment

I have been approached by a seller of a voltage optimiser that will save my electricity costs...

Here is an extract from the blurb

"In the UK, the average voltage supply to a property (domestic, commercial and industrial) is between 240v and 252v. Voltage Optimisation Units reduce the volts supplied to the property (usually 220v), saving you money on wasted electricity. Units can be made to suit the needs of all domestic, commercial and industrial customers and provide savings of up to 20%.

Across Europe, the typical electrical supply voltage is between 220 and 230v. However, in the UK, this is significantly higher, with the typical voltage supply between 240 and 252v. All electrical equipment designed for use in the UK and Europe must be capable of operating through the voltage range to comply with EU regulations – meaning that a large proportion of electrical appliances and equipment consume more energy when supplied with a higher voltage (this is much higher in the UK than across Europe). As a result, many appliances over use energy – resulting in energy wastage, higher energy bills and a reduction in the life expectancy of electrical appliances and equipment."

Am I right in thinking that this is a load of old gammon?
Yes.

Power = Voltage x Current
Watts = Volts x Amps

The watts are constant for any given device - you turn a 2kW heater on, you use 2kW, and in one hour you use 2kWh.
If the volts are reduced, the amps are simply increased. That 2kW heater is taking 7.9A at 252v, but 9.1A at 220v.

Electrickery is charged by the kWh... If it was charged by the Ah, then reducing the voltage would make your bill higher, because you're pulling more amps.
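
A quick sketch of that arithmetic (Python, purely illustrative). Note it assumes the device holds its power constant, which is true for electronically regulated loads but, as later replies point out, not for a bare heating element:

# Current drawn by a notionally constant-power 2 kW load at two voltages.
# Constant power is an assumption -- it holds for switch-mode supplies and
# speed-controlled motors, NOT for a plain resistive heater.
POWER_W = 2000.0

for volts in (252.0, 220.0):
    amps = POWER_W / volts                  # I = P / V
    print(f"{volts:.0f} V -> {amps:.1f} A")

# Prints: 252 V -> 7.9 A, then 220 V -> 9.1 A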

UK electrickery is nominally 230v +10%/-6% (253v to 216v), to be on a par with the standardised European 230v, but is normally around 240v. There might be a tiny efficiency difference but, in practice, it makes absolutely zero difference to anybody except the company fleecing the gullible with this... load of old gammon.

Just ask this... In a world focussing on emissions and energy-saving, if turning the voltage down to 220 saved electrickery, why wouldn't National Grid just do it...? The obvious answer is... "It's all a conspiracy to fleece us poor consumers" - and there you have your target market for these devices nailed.

Edited by TooMany2cvs on Thursday 22 February 14:55

anonymous-user

Original Poster:

54 months

Thursday 22nd February 2018
Thanks, I knew my BS detector could be relied upon

C0ffin D0dger

3,440 posts

145 months

Thursday 22nd February 2018
That's a new one on me. Whoever is trying to sell such rubbish should be done under the Trade Descriptions Act, or whatever applies. I have a B.Eng in Electrical and Electronic Engineering, so probably know what I'm on about.

Flibble

6,475 posts

181 months

Thursday 22nd February 2018
TooMany2cvs said:
Yes.

Power = Voltage x Current
Watts = Volts x Amps

The watts are constant for any given device - you turn a 2kW heater on, you use 2kW, and in one hour you use 2kWh.
If the volts are reduced, the amps are simply increased. That 2kW heater is taking 7.9A at 252v, but 9.1A at 220v.

Electrickery is charged by the kWh... If it was charged by the Ah, then reducing the voltage would make your bill higher, because you're pulling more amps.
Actually no (but also yes).

The power is not constant for any given device; the rated figure is nominal for the intended supply voltage. How the draw changes with voltage depends on what you're powering.

For the heater given, it's not Power = Voltage x Current, it's Current = Voltage / Resistance. The resistance is fixed as it's an intrinsic property of the wire used to make the heating element. Because of this, lower voltage = lower current, and thus lower power.

So a 2kW heater will have a nominal current draw at 240V of 2000 / 240 = 8.33A. This gives it a resistance of 240 / 8.33 = 28.8 ohms.
Drop the voltage to 220V and you have a current draw 220 / 28.8 = 7.64A, and a power usage of 220 x 7.64 = 1680W.

So you have reduced power usage by 16%, hurrah! However you're also reducing heat output by 16% so you need to run it longer. And the voltage reducer itself consumes power, so your actual saving may be little or nothing.
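
A minimal sketch of that fixed-resistance arithmetic (Python, illustrative only; the figures are the ones worked through above):

# A heater element's resistance is set by its rated power at the rated
# voltage; actual power then follows P = V^2 / R at whatever voltage it gets.
RATED_POWER_W = 2000.0
RATED_VOLTAGE = 240.0

resistance = RATED_VOLTAGE ** 2 / RATED_POWER_W     # R = V^2 / P = 28.8 ohms

for volts in (240.0, 220.0):
    amps = volts / resistance                       # Ohm's law: I = V / R
    watts = volts * amps                            # P = V * I
    print(f"{volts:.0f} V -> {amps:.2f} A, {watts:.0f} W")

# Prints: 240 V -> 8.33 A, 2000 W
#         220 V -> 7.64 A, 1681 W  (about 16% less power -- and 16% less heat)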

When it comes to low-voltage stuff it's a different matter. Because these run off a step-down converter (usually a switch-mode supply these days), there isn't really a saving to be had. Your 5V phone charger is still a 5V phone charger no matter what voltage you feed it, and the efficiency changes very little with reduction in supply voltage. So you might save a couple of percent, but you'd probably need to be running a 120V supply to get even that.

So yes, it's a load of gammon, but superficially plausible enough to take in the gullible.

PH5121

1,963 posts

213 months

Thursday 22nd February 2018
I was talking to a chap earlier this week who was heavily involved with the selling of solar panels. He told me that the domestic solar market is on its arse with regard to selling new systems, and that voltage optimisation was the latest thing to sell to 'Joe Public' to save them money.


mickmcpaddy

1,445 posts

105 months

Thursday 22nd February 2018
Flibble said:
TooMany2cvs said:
Yes.

Power = Voltage x Current
Watts = Volts x Amps

The watts are constant for any given device - you turn a 2kW heater on, you use 2kW, and in one hour you use 2kWh.
If the volts are reduced, the amps are simply increased. That 2kW heater is taking 7.9A at 252v, but 9.1A at 220v.

Electrickery is charged by the kWh... If it was charged by the Ah, then reducing the voltage would make your bill higher, because you're pulling more amps.
Actually no (but also yes).

The power is not constant for any given device; the rated figure is nominal for the intended supply voltage. How the draw changes with voltage depends on what you're powering.

For the heater given, it's not Power = Voltage x Current, it's Current = Voltage / Resistance. The resistance is fixed as it's an intrinsic property of the wire used to make the heating element. Because of this, lower voltage = lower current, and thus lower power.

So a 2kW heater will have a nominal current draw at 240V of 2000 / 240 = 8.33A. This gives it a resistance of 240 / 8.33 = 28.8 ohms.
Drop the voltage to 220V and you have a current draw 220 / 28.8 = 7.64A, and a power usage of 220 x 7.64 = 1680W.

So you have reduced power usage by 16%, hurrah! However you're also reducing heat output by 16% so you need to run it longer. And the voltage reducer itself consumes power, so your actual saving may be little or nothing.

When it comes to low-voltage stuff it's a different matter. Because these run off a step-down converter (usually a switch-mode supply these days), there isn't really a saving to be had. Your 5V phone charger is still a 5V phone charger no matter what voltage you feed it, and the efficiency changes very little with reduction in supply voltage. So you might save a couple of percent, but you'd probably need to be running a 120V supply to get even that.

So yes, it's a load of gammon, but superficially plausible enough to take in the gullible.
It's even worse than that. A lot of appliances have microcontrollers in them nowadays. Take a washing machine: the motor is controlled by a computer, and if the voltage is reduced, the computer will ramp up the current to keep the motor spinning at the correct speed. It might draw 9A at 220V but only 8A at 240V (hypothetical figures). However, the kWh meter sits before the voltage optimiser, so it still registers the 240V supply and the increased current flow, and the cost would be more than if it was supplied at the full voltage.

So Big Clive says anyway.

https://www.youtube.com/watch?v=zKasA4HxaGY


S11Steve

6,374 posts

184 months

Thursday 22nd February 2018
I asked the same question here about 18 months ago when we were looking at solar panels.

The short answer is that Voltage Optimisers only really work when used on an industrial scale. In a domestic setting it will make your kettle boil slower and your toaster take longer.


https://www.pistonheads.com/gassing/topic.asp?h=0&...

As a follow up, I "invested" in a garage and driveway instead of solar... The uplift in house value and better LTV mortgage rate made it far more financially efficient for us than solar. I'd still go down the solar route at some point in the future though. Just without the voltage optimiser.

Edited by S11Steve on Thursday 22 February 17:29

Flibble

6,475 posts

181 months

Thursday 22nd February 2018
mickmcpaddy said:
It's even worse than that. A lot of appliances have microcontrollers in them nowadays. Take a washing machine: the motor is controlled by a computer, and if the voltage is reduced, the computer will ramp up the current to keep the motor spinning at the correct speed. It might draw 9A at 220V but only 8A at 240V (hypothetical figures). However, the kWh meter sits before the voltage optimiser, so it still registers the 240V supply and the increased current flow, and the cost would be more than if it was supplied at the full voltage.

So Big Clive says anyway.

https://www.youtube.com/watch?v=zKasA4HxaGY
If it's 9A at 220V, it'll be 8A at 240V before the optimiser - or at least it would be if the optimiser were perfectly efficient. Current and voltage are linked; if you change one, the other will also change.

However, because the optimiser isn't perfectly efficient, it will cost more for anything with a speed-controlled motor (like a washing machine).
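
A toy model of that point (Python; the 200W load and 97% efficiency are made-up illustrative figures, not measurements of any real unit):

# A constant-power appliance behind an imperfect "optimiser": the meter sits
# upstream, so it sees the appliance's power divided by the optimiser's
# efficiency. The bill can only go up, never down.
APPLIANCE_POWER_W = 200.0      # hypothetical regulated load
OPTIMISER_EFFICIENCY = 0.97    # hypothetical transformer losses

metered_w = APPLIANCE_POWER_W / OPTIMISER_EFFICIENCY
print(f"Metered: {metered_w:.1f} W vs {APPLIANCE_POWER_W:.0f} W without the box")

# Prints: Metered: 206.2 W vs 200 W without the box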

Simpo Two

85,422 posts

265 months

Thursday 22nd February 2018
I thought the voltage from the Grid was reduced from the 'official' 240V to about 230V a long time ago.

I do quite like the idea of a product which is a black box about 6" cubed, with a plug on one side and a socket on the other, an LED and 'Eco-something' logo on the top, and inside, half a brick for perceived value hehe

Flibble

6,475 posts

181 months

Thursday 22nd February 2018
Simpo Two said:
I thought the voltage from the Grid was reduced from the 'official' 240V to about 230V a long time ago.

I do quite like the idea of a product which is a black box about 6" cubed, with a plug on one side and a socket on the other, an LED and 'Eco-something' logo on the top, and inside, half a brick for perceived value hehe
The official specs changed, but the actual voltage supplied didn't. Too much hassle I'd imagine.

Condi

17,193 posts

171 months

Thursday 22nd February 2018
I'm pretty sure they do install something similar for big, energy-intensive businesses, and it does save money if you're using enough power to make it worthwhile. We're talking factories, etc.

For a domestic customer there is no point.

Dr Mike Oxgreen

4,119 posts

165 months

Friday 23rd February 2018
Simpo Two said:
I thought the voltage from the Grid was reduced from the 'official' 240V to about 230V a long time ago.
Not really. They changed the specification to 230V, with a tolerance of +10%/-6%.

This meant that the existing UK 240V supply was within the new spec and didn't need to be changed. It also meant that the existing European 220V supply was within tolerance and didn't need to change. A triumph of fiddling the figures!

Whilst technically the UK voltage can legally go up to 110% of 230V (i.e. 253V), that is highly unlikely. I doubt it ever happens in practice. When I measure my mains voltage, I've never seen it more than a volt away from 240.

As has been pointed out, “optimising” your voltage will have no effect on appliances that internally regulate their voltage and current. And on things like a kettle it just means that it takes longer to boil and uses exactly the same amount of energy*.

* Actually, a slower boiling kettle will use marginally MORE energy, because the longer time taken to boil allows more time for heat to escape through the walls and spout of the kettle during boiling. This will be a marginal effect, but it shows what a load of bks this whole energy-saving idea is: if they try to lower the maximum wattage of kettles it’ll actually have the effect of consuming slightly more energy.
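
A back-of-envelope check of the kettle point (Python, illustrative; one litre heated from 20C to 100C is an assumed fill, and the standing losses mentioned above are exactly what this simple model leaves out):

# The energy that must go into the water is fixed by its mass and temperature
# rise, so a lower-powered kettle draws the same kWh -- it just runs longer,
# giving more time for heat to leak away.
WATER_KG = 1.0            # assumed fill
SPECIFIC_HEAT = 4186.0    # J/(kg*K) for water
DELTA_T_K = 80.0          # 20 C -> 100 C

energy_j = WATER_KG * SPECIFIC_HEAT * DELTA_T_K     # ~335 kJ either way

for watts in (2000.0, 1680.0):                      # full vs "optimised" power
    print(f"{watts:.0f} W kettle: {energy_j / watts:.0f} s to boil")

# Prints: 2000 W kettle: 167 s
#         1680 W kettle: 199 s  (same energy into the water, ~30 s more losses)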

Edited to add: If they genuinely want to lower the amount of energy used by kettles, they should look into design changes to encourage people not to fill them more than necessary. Or perhaps limit the overall capacity so that people can’t boil excessive amounts for just one or two mugs of tea. Reducing the maximum size of kettles to somewhere between 1 and 1.5 litres would have a genuine environmental impact, because those morons who habitually fill to the brim would be boiling less water. You’ll never change that habit, IMHO.

Edited by Dr Mike Oxgreen on Friday 23 February 07:31

mickmcpaddy

1,445 posts

105 months

Friday 23rd February 2018
Condi said:
I'm pretty sure they do install something similar for big, energy-intensive businesses, and it does save money if you're using enough power to make it worthwhile. We're talking factories, etc.

For a domestic customer there is no point.
That's power factor correction capacitors. It's billed differently to domestic electricity.

Random google link.

https://www.eetimes.com/document.asp?doc_id=127820...



ruggedscotty

5,626 posts

209 months

Sunday 25th February 2018
Amazing what they will do to try and sell things....

How much were they charging for the device, and what was the payback period? I'll bet it was pretty lengthy.

Solar was making money through the feed-in tariff: you were generating energy and putting it back into the grid. This, if it works at all, gives only very small reductions in consumed power, and it does so by reducing the voltage, as was explained very well earlier in the thread.

98elise

26,596 posts

161 months

Sunday 25th February 2018
TooMany2cvs said:
JPJPJP said:
If you know about electrickery, please comment

I have been approached by a seller of a voltage optimiser that will save my electricity costs...

Here is an extract from the blurb

"In the UK, the average voltage supply to a property (domestic, commercial and industrial) is between 240v and 252v. Voltage Optimisation Units reduce the volts supplied to the property (usually 220v), saving you money on wasted electricity. Units can be made to suit the needs of all domestic, commercial and industrial customers and provide savings of up to 20%.

Across Europe, the typical electrical supply voltage is between 220 and 230v. However, in the UK, this is significantly higher, with the typical voltage supply between 240 and 252v. All electrical equipment designed for use in the UK and Europe must be capable of operating through the voltage range to comply with EU regulations – meaning that a large proportion of electrical appliances and equipment consume more energy when supplied with a higher voltage (this is much higher in the UK than across Europe). As a result, many appliances over use energy – resulting in energy wastage, higher energy bills and a reduction in the life expectancy of electrical appliances and equipment."

Am I right in thinking that this is a load of old gammon?
Yes.

Power = Voltage x Current
Watts = Volts x Amps

The watts are constant for any given device - you turn a 2kW heater on, you use 2kW, and in one hour you use 2kWh.
If the volts are reduced, the amps are simply increased. That 2kW heater is taking 7.9A at 252v, but 9.1A at 220v.

Electrickery is charged by the kWh... If it was charged by the Ah, then reducing the voltage would make your bill higher, because you're pulling more amps.

UK electrickery is nominally 230v +10%/-6% (253v to 216v), to be on a par with the standardised European 230v, but is normally around 240v. There might be a tiny efficiency difference but, in practice, it makes absolutely zero difference to anybody except the company fleecing the gullible with this... load of old gammon.

Just ask this... In a world focussing on emissions and energy-saving, if turning the voltage down to 220 saved electrickery, why wouldn't National Grid just do it...? The obvious answer is... "It's all a conspiracy to fleece us poor consumers" - and there you have your target market for these devices nailed.

Edited by TooMany2cvs on Thursday 22 February 14:55
How does a 2kW heater adjust the current? It's a resistor, and it will draw more current if supplied with a greater voltage.

Something like a laptop or TV can control what it draws, but any simple heater normally cannot.

That said, a kettle drawing more current will boil quicker, so about the same amount of energy is consumed. A heater would come up to temperature quicker, so would shut off sooner.

It's entirely dependent on the device.

mickmcpaddy

1,445 posts

105 months

Sunday 25th February 2018
The advantage of using a higher voltage is that the cable can be thinner for the same power. If the voltage was reduced to 110V like America's, the cables would need roughly twice the copper cross-section to power a kettle rated at the same 2kW. Indeed, America uses 240V for stuff like washing machines and electric cookers.
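
A rough check of the cable-sizing point (Python, illustrative only; real conductor sizing comes from the wiring regulations, not this one-liner):

# Same power at a lower voltage means proportionally more current, and
# conductor cross-section scales roughly with the current it must carry.
POWER_W = 2000.0    # the 2kW kettle from the post

for volts in (240.0, 110.0):
    print(f"{volts:.0f} V: {POWER_W / volts:.1f} A")

# Prints: 240 V: 8.3 A
#         110 V: 18.2 A  -- over twice the current, hence much thicker cable.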


Sheepshanks

32,763 posts

119 months

Sunday 25th February 2018
Dr Mike Oxgreen said:
Whilst technically the UK voltage can legally go up to 110% of 230V (i.e. 253V), that is highly unlikely. I doubt it ever happens in practice. When I measure my mains voltage, I've never seen it more than a volt away from 240.
Ours routinely sits at 248V - we're in a biggish village, but some way away from the sub-station. Kettle boils quick!

There was a fairly local company doing voltage optimisation - VPhase - but they went bust.

Getragdogleg

8,767 posts

183 months

Sunday 25th February 2018
It's gammon, but it does raise the question of what exactly is in the box they would install in your house to "optimise".

I find myself genuinely curious what the hardware is.

TooMany2cvs

29,008 posts

126 months

Sunday 25th February 2018
Getragdogleg said:
It's gammon, but it does raise the question of what exactly is in the box they would install in your house to "optimise".

I find myself genuinely curious what the hardware is.
An LED marked "Saving you money"