Another exhibit for the Museum of Energy Management: thanks to David Bridger for unearthing UK monthly degree-day data for the period 1966 to 1975 (view the complete archive file here).
These data have mainly curiosity value and should not be relied upon for any kind of trend analysis. Observing stations have sometimes been moved or affected by nearby building development and urban expansion. Region 18 (North West Scotland) was not included at all until I launched the Degree Days Direct subscription service in 1992, and there had been two other main data providers before I got the contract to supply the official figures in 2003, so it would be risky to assume continuity over the whole fifty years.
Below: degree-day figures reported in the government’s “Energy Management” newspaper in 1986
Proponents of voltage reduction (“optimisation” as they like to call it) have started suggesting that equipment is more energy-efficient at lower voltage. In fact this is quite often not the case. For an electric motor, this diagram shows how various aspects of energy performance vary as you deviate from its nominal voltage. The red line shows that peak efficiency occurs, if anything, at slightly above rated voltage.
Reduced voltage is associated with reduced efficiency. The reason is that to deliver the same output at lower voltage, the motor will need to draw a higher current, and that increases its resistive losses.
When you install an energy-saving measure, it is important to evaluate its effect objectively. In the majority of cases this will be achieved by a “before-and-after” analysis making due allowance for the effect of the weather or other factors that cause known variations.
There are, however, some types of energy-saving device which can be temporarily bypassed or disabled at will, and for these it may be possible to do interleaved on-off tests. The idea is that by averaging out the ‘on’ and ‘off’ consumptions you can get a fair estimate of the effect of having the device enabled. The distorting effect of any hard-to-quantify external influences—such as solar gain or levels of business activity—should tend to average out.
A concrete example may help. Here is a set of weekly kWh consumptions for a building where a certain device had been fitted to the mains supply, with the promise of 10% reductions. The device could easily be disconnected and was removed on alternate weeks:
Week   kWh     Device?
----   -----   -------
  1    241.8   without
  2    223.0   with
  3    221.4   without
  4    196.4   with
  5    200.1   without
  6    189.6   with
  7    201.9   without
  8    181.3   with
  9    185.0   without
 10    208.5   with
 11    181.7   without
 12    188.3   with
 13    172.3   without
 14    180.4   with
The mean of the even-numbered weeks, when the device was active, is 195.4 kWh, compared with 200.6 kWh in the weeks when it was disconnected: an apparent average saving of 5.2 kWh per week. This is much less than the promised ten percent, but there is a bigger problem.

If you look at the figures you will see that the “with” and “without” weeks both have a spread of values, and their ranges overlap. The degree of spread can be quantified through a statistical measure called the standard deviation, which in this case works out at 19.7 kWh per week. I will not go into detail beyond pointing out that about two-thirds of measurements in this case can be expected to fall within ±19.7 kWh of the mean purely by chance. Measured against that yardstick, the 5.2 kWh apparent saving is clearly not statistically significant, and the test therefore failed to prove that the device had any effect. (As a footnote, when the analysis was repeated taking into account sensitivity to the weather, the conclusion was that the device apparently increased consumption.)
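For anyone who wants to reproduce the arithmetic, here is a minimal sketch using only Python’s standard library. The figures are copied from the table above; allowing for rounding, it reproduces the two means, the 5.2 kWh difference and the roughly 19.7 kWh standard deviation:

```python
from statistics import mean, stdev

# Weekly kWh figures from the table above; the device was active in even weeks
kwh = [241.8, 223.0, 221.4, 196.4, 200.1, 189.6, 201.9,
       181.3, 185.0, 208.5, 181.7, 188.3, 172.3, 180.4]

with_device = kwh[1::2]      # weeks 2, 4, ... 14
without_device = kwh[0::2]   # weeks 1, 3, ... 13

saving = mean(without_device) - mean(with_device)
spread = stdev(kwh)          # sample standard deviation of all 14 weeks

print(f"mean with device:    {mean(with_device):.1f} kWh")
print(f"mean without device: {mean(without_device):.1f} kWh")
print(f"apparent saving:     {saving:.1f} kWh per week")
print(f"standard deviation:  {spread:.2f} kWh")
```

The apparent saving is a quarter of the standard deviation, which is why it cannot be distinguished from chance variation.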
When contemplating tests of this sort it is important to choose the length of the on-off interval carefully. In the case cited, a weekly interval was used because the building had weekday/weekend differences. A daily cycle would also be inappropriate for monitoring heating efficiency in some buildings because of the effect of heat storage in the building fabric: a shortfall in heat input one day might be made up the next. Particular care is always needed where a device which reduces energy input might result in a shortfall in output which then has to be made up in the following interval, when it is disconnected. This will notably tend to happen with voltage reduction in electric heating applications. During a low-voltage interval the heaters will run at lower output, and this may result in a heat deficit being ‘exported’ to the succeeding high-voltage period, when additional energy will need to be consumed to make up the shortfall, making the high-voltage interval look worse than the low-voltage one. To minimise this distortion, set the interval length several times longer than the typical equipment cycle time.
There are perhaps two other stipulations to add. Firstly, the number of ‘on’ and ‘off’ cycles should be equal. Secondly, although there is no objection to omitting an interval for reasons beyond the control of either party (such as a metering failure), it would be prudent to insist that intervals are omitted only in pairs, and that tests always recommence consistently in either the ‘off’ or ‘on’ state. This avoids the risk of skewing the results by selectively removing individual samples.
In statistical analysis the coefficient of determination (more commonly known as R2) is a measure of how well variation in one variable explains the variation in something else, for instance how well the variation in hours of darkness explains variation in electricity consumption of yard lighting.
R2 varies between zero, meaning there is no effect, and 1.0 which would signify total correlation between the two with no error. It is commonly held that higher R2 is better, and you will often see a value of (say) 0.9 stated as the threshold below which you cannot trust the relationship. But that is nonsense and one reason can be seen from the diagrams below which show how, for two different objects, energy consumption on the vertical or y axis might relate to a particular driving factor or independent variable on the horizontal or x axis.
In both cases, the relationship between consumption and its driving factor is imperfect. But the data were arranged to have exactly the same degree of dispersion. This is shown by the CV(RMSE) value which is the root mean square deviation expressed as a percentage of the average consumption. R2 is 0.96 (so-called “good”) in one case but only 0.10 (“bad”) in the other. But why would we regard the right-hand model as worse than the left? If we were to use either model to predict expected consumption, the absolute error in the estimates would be the same.
By the way, if anyone ever asks how to get R2 = 1.0 the answer is simple: use only two data points. By definition, the two points will lie exactly on the best-fit line through them!
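The point about R2 and dispersion can be illustrated with a quick synthetic sketch (all numbers invented for the demonstration, not taken from the diagrams). Two datasets below share identical residuals and identical mean consumption, hence identical CV(RMSE), yet one scores a “good” R2 and the other a “bad” one, simply because its driving factor spans less of the variation:

```python
import math
import random

def fit_stats(xs, ys):
    """Least-squares straight line; returns R2 and CV(RMSE) as % of mean y."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    r2 = 1 - ss_res / ss_tot
    cv_rmse = 100 * math.sqrt(ss_res / n) / my
    return r2, cv_rmse

random.seed(1)
xs = list(range(20))
mx = sum(xs) / len(xs)
noise = [random.gauss(0, 5) for _ in xs]

# Same mean consumption, identical noise; only the driver's influence differs
strong = [200 + 10.0 * (x - mx) + e for x, e in zip(xs, noise)]
weak   = [200 +  0.5 * (x - mx) + e for x, e in zip(xs, noise)]

r2_strong, cv_strong = fit_stats(xs, strong)
r2_weak, cv_weak = fit_stats(xs, weak)
print(f"strong driver: R2 = {r2_strong:.2f}, CV(RMSE) = {cv_strong:.1f}%")
print(f"weak driver:   R2 = {r2_weak:.2f}, CV(RMSE) = {cv_weak:.1f}%")
```

Either model predicts expected consumption with the same absolute error; judging the second one untrustworthy on its R2 alone would be a mistake.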
Another common misconception is that a low value of R2 in the case of heating fuel signifies poor control of the building. This is not a safe assumption. Try this thought experiment. Suppose that a building’s fuel consumption is being monitored against locally-measured degree days. You can expect a linear relationship with a certain R2 value. Now suppose that the local weather monitoring fails and you switch to using published degree-day figures from a meteorological station 35km away. The error in the driving factor data caused by using remote weather observations will reduce R2 because the estimates of expected consumption are less accurate; more of the apparent variation in consumption will be attributable to error and less to the measured degree days. Does the reduced R2 signify worse control? No; the building’s performance hasn’t changed.
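The thought experiment is easy to check numerically. In this sketch (entirely synthetic numbers: an assumed 45 kWh per degree day gradient, a 500 kWh baseload, and an assumed error on the remote degree-day figures) the building behaves identically in both regressions; only the driving-factor data degrade:

```python
import random

def r_squared(xs, ys):
    """R2 of a simple linear regression, via the squared correlation."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy * sxy / (sxx * syy)

random.seed(42)
# A year of weekly local degree days, and fuel that genuinely responds to them
local_dd = [random.uniform(0, 80) for _ in range(52)]
fuel = [500 + 45 * d + random.gauss(0, 300) for d in local_dd]
# The same weeks as reported by a distant station: true weather plus error
remote_dd = [d + random.gauss(0, 15) for d in local_dd]

r2_local = r_squared(local_dd, fuel)
r2_remote = r_squared(remote_dd, fuel)
print(f"R2 against local degree days:  {r2_local:.2f}")
print(f"R2 against remote degree days: {r2_remote:.2f}")
```

R2 drops when the remote figures are used, even though the building, and its control, have not changed at all.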
Footnote: for a deeper, informative and highly readable treatment of this subject see this excellent paper by Mark Stetz.
When considering the consumption of fuel for space heating, the degree-day base temperature is the outside air temperature above which heating is not required, and the presumption is that when the outside air is below the base temperature, heat flow from the building will be proportional to the deficit in degrees. Similar considerations apply to cooling load, but for simplicity this article deals only with heating.
In UK practice, free published degree-day data have traditionally been calculated against a default base temperature of 15.5°C (60°F). However, this is unlikely to be truly reflective of modern buildings and the ready availability of degree-day data to alternative base temperatures makes it possible to be more accurate. But how does one identify the correct base temperature?
The first step is to understand the effect of getting the base temperature wrong. Perhaps the most common symptom is the negative intercept seen in Figure 1, which plots consumption against degree days. This is what most often alerts you to a problem:
It should be evident that in Figure 1 we are trying to fit a straight line to what is actually a curved characteristic. The shape of the curve depends on whether the base temperature was too low or too high, and Figure 2 shows the same consumptions plotted against degree days computed to three different base temperatures: one too high (as Figure 1), one too low, and one just right.
Notice in Figure 2 that the characteristics are only curved near the origin. They are parallel at their right-hand ends, that is to say, in weeks when the outside air temperature never rose above the base temperature. The gradients of the straight sections are all the same, including of course the case where the base temperature was appropriate. This is significant because, although in real life we only have the distorted view represented by Figure 1, we now know that the gradient of its straight section is equal to the true gradient of the correct line.
So let’s revert to our original scenario: the case where we had a single line where the base temperature was too high. Figure 3 shows that a projection of the straight segment of the line intersects the vertical axis at -1000 kWh per week, well below the true position, which from Figure 1 we can judge to be around 500 kWh per week. The gradient of the straight section, incidentally, is 45 kWh per degree day.
To correct the distortion we need to shift the line in Figure 3 to the left by a certain number of degree days so that it ends up looking like Figure 4 below. The change in intercept we are aiming for is 1,500 kWh (the difference between the apparent intercept of -1000 and the true intercept of 500*). We can work out how far left to move the line by dividing the required change in intercept by the gradient: 1500/45 = 33.3 degree days. Given that the degree-day figures are calculated over a 7-day interval, the required change in base temperature is 33.3/7 = 4.8 degrees.
Note that only the points in the straight section moved the whole distance to the left: in the curved sections, the further left the point originally sits, the less it moves. This can best be visualised by looking again at Figure 2.
In more general terms the base-temperature adjustment is given by (Ct − Ca)/(m × t) where:
Ct is the true intercept; Ca is the apparent intercept obtained by projecting the straight portion of the distorted characteristic; m is the gradient of that straight portion; and t is the observing-interval length in days.
* The intercept could be judged or estimated by a variety of methods: empirically, by averaging the consumption in non-heating weeks; by eye; or by fitting a curved regression line.
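As a sanity check, the formula and the worked example can be put into a few lines of code (the numbers are those used in the text: Ct = 500, Ca = -1000, m = 45 kWh per degree day, t = 7 days):

```python
def base_temp_adjustment(true_intercept, apparent_intercept, gradient, interval_days):
    """Base-temperature correction implied by a distorted regression line:
    (Ct - Ca) / (m * t)."""
    return (true_intercept - apparent_intercept) / (gradient * interval_days)

# Worked example from the text: weekly data, apparent intercept -1000 kWh,
# true intercept 500 kWh, gradient 45 kWh per degree day
delta = base_temp_adjustment(500, -1000, 45, 7)
print(round(delta, 1))  # 4.8 degrees
```

A positive result means the base temperature used was too high by that amount, as in the example.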
It’s an anonymised but accurate reconstruction of something I recently saw touted as an example of a ‘visual energy display’ suitable for a reception area. Apart from patently being an advertisement for an equipment supplier — name changed to protect the innocent (guilty?) — the only numerical information in the display is in small type against a background which makes it hard to read. Also, one might ask, “so what?”. There is no context. What proportion was 3.456 kWh? What were we aiming for? What is the trend?
There’s a bigger picture here: in energy reporting generally, system suppliers have descended into “content-lite” bling warfare (why do bar charts now have to bounce into view with a flourish?). And nearly always the displays are just passive and uncritical statements of quantities consumed. Anybody who wants to display energy information graphically should read Stephen Few’s book Information Dashboard Design. It is clear that almost no suppliers of energy monitoring systems have ever done so, but perhaps if their customers did, and became more discerning and demanding, we might see more useful information and less meaningless noise and clutter.
DATELINE 1 APRIL 2016: Endomagno Ltd has ordered a total product recall of its bolt-on fuel-treatment magnets after two serious incidents at customers’ premises.
Such magnets are commonly claimed to improve consumption by aligning the gas molecules, and the incidents appear to involve the alignment effect being so strong that the gas has actually crystallised in the burner. Why this has started to happen now is not clear (the product literature points out that the Romans used lode-stones to improve the heat output of hypocausts) but my theory is that it relates to the introduction of new microcrystalline neodymium magnets in what the company describes as “a certain configuration”. Chaos entanglement theory says that these may interact with quantum nanoparticles in the gas stream in unpredictable ways.
The product recall presents a significant logistical problem for Endomagno. Although the magnets are easy to attach using gaffer tape, they cannot be removed by the customer without invalidating the product’s Korean patent. This means sending a technician to every site and as a market leader in magnetic fuel treatment they have nearly seven users.
Endomagno’s marketing director, Frank Lee Beaugusse, told me that the company is urgently investigating two alternative technologies. The most promising is a unipolar magnet, which only has a north pole, but they are also testing more conventional magnets with east and west (rather than north and south) poles. Comparative evaluation and testing will be carried out by Laboratoires Garnier.
The striking thing about the plantroom panel switches above is that they lack the ‘HAND’ position that is normally provided to allow equipment to run manually, and which is all too frequently found in that condition (right).
If you genuinely need to be able to let people in the plantroom override the automatic control, then at least get your building management system to monitor the switch position to alert you. Otherwise you just end up with stuff running continuously that doesn’t need to.
There is a class of product that claims to save energy by enhancing the output of central heating radiators. These are usually small fan units, but one product I have seen is just a metal plate stuck on with double-sided tape which passively increases the radiator’s surface area. Then of course we have those additives which improve heat transfer on the water side.
These devices are commonly sold on the basis that you will save energy because the room “will heat up faster”. In principle, there might be some truth in this, but savings would only arise if you optimised the start time of the heating to take advantage of the shorter warm-up period.
How big is this saving or loss likely to be? The picture on the right shows a simulation of the diurnal temperature variation, mid-day to mid-day, in an intermittently-heated space.
This profile is reasonably representative of a building with moderate thermal performance.
When the heating goes off at A, the temperature falls rapidly at first and then progressively more slowly until B, where the heating comes on again and boosts the temperature back up to point C. This simulation shows the effect of (a) boosting the heater output by 50% and (b) optimising the start time, in which case the heating now comes on later, at B’, because the radiator can raise the temperature faster and still reach point C as before.
The height of the bars represents the difference between inside and outside temperature (which is held constant here to simplify the analysis). Because heat loss is proportional to the inside-outside temperature difference, the shaded area of the chart is a good proxy for daily heat loss and thus fuel consumption. The small triangular area CBB’ represents the daily energy saved by the change. It amounts in this case to 3.4% of the total, and this proportion will probably be broadly similar across a range of outside air temperatures: a little less in milder weather, and more when it is colder.
So if you would need to increase radiator output by half to save only around 3% on fuel, it is hard to see how the marginal increase in output from a stick-on booster is going to make a perceptible change. And remember, if you don’t optimise the start time, increasing the output will cause the room temperature to rise faster, achieving target temperature prematurely, leading not to a saving but to an equal, albeit negligible, loss.
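For readers who want to experiment, here is a rough numerical sketch of the simulation idea. The parameters (a 10-hour cooling time constant, heating off 5 hours after midday, setpoint restored 19 hours after, a 2 °C/h baseline warm-up rate) are assumptions chosen for illustration, not the values behind the published chart, but they yield a saving of the same few-percent order:

```python
import math

# Assumed illustrative parameters (not the article's actual simulation inputs)
TAU = 10.0      # building cooling time constant, hours
T_SET = 20.0    # inside-outside temperature difference at setpoint, deg C
T_OFF = 5.0     # heating switches off, hours after midday (point A)
T_OCC = 19.0    # setpoint must be restored by this time (point C)
STEP = 1 / 60   # one-minute time step, hours

def daily_loss(warmup_rate):
    """Proxy for daily heat loss: the integral of the inside-outside
    temperature difference over 24 h, with an optimised heating start time."""
    # Find the latest start time (point B) from which the heater can still
    # restore the setpoint by T_OCC
    t_start = T_OCC
    while t_start > T_OFF:
        temp_b = T_SET * math.exp(-(t_start - T_OFF) / TAU)  # free-cooling temp
        if temp_b + warmup_rate * (T_OCC - t_start) >= T_SET:
            break
        t_start -= STEP
    # Integrate the temperature-difference profile over the daily cycle
    loss, u = 0.0, 0.0
    while u < 24.0:
        if u < T_OFF or u >= T_OCC:
            temp = T_SET                                   # heating holds setpoint
        elif u < t_start:
            temp = T_SET * math.exp(-(u - T_OFF) / TAU)    # free cooling
        else:
            temp_b = T_SET * math.exp(-(t_start - T_OFF) / TAU)
            temp = min(T_SET, temp_b + warmup_rate * (u - t_start))  # warm-up
        loss += temp * STEP
        u += STEP
    return loss

base = daily_loss(2.0)     # baseline warm-up rate, deg C per hour
boosted = daily_loss(3.0)  # radiator output boosted by 50%
saving = (base - boosted) / base
print(f"saving from 50% output boost: {saving:.1%}")
```

Varying the assumed time constant and warm-up rates moves the figure around, but with plausible inputs the saving from a 50% output boost stays in the low single digits, consistent with the 3.4% quoted above.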
Is this a record? I photographed this outside light as an example of bad practice at Newent Community School in 1986 when I was Gloucestershire’s energy manager. I go past it every week on my way to the sports hall and in 30 years it has never been off during daylight hours. They’ve replaced the doors, windows and roof, and some trees have grown up between it and their new eco lab, whose roof-mounted wind turbine is just sufficient (on those rare occasions that it can be seen turning) to power the light above the door on the neighbouring science block:
Let’s be realistic: unnecessary lighting doesn’t waste much electricity. But unfortunately it is the most visible indicator of one’s commitment. It’s no good feeding children piffle about sustainability if you demonstrate a disregard for proper management of energy.