There are 65 million smart meters installed in buildings across the U.S., dutifully logging data that includes the time of day and kW demand. The U.S. Department of Energy’s Smart Energy Analytics Campaign recently explored the value of this smart meter data and calculated the energy efficiency savings hiding within it to be $4 billion. That’s enough money to send 8,000 people to Mars on Elon Musk’s Heart of Gold rockets.
While the DOE report makes for a great headline, turning the DOE's juicy savings analysis spreadsheets into lower operating costs requires serious work. In fact, this gap is explicit in the campaign's focus not on data itself, but on analytical tools.
Utility smart meter data feeds, such as PG&E's InterAct and Seattle City Light's MeterWatch, are valuable, but not everything a building needs. Smart meter data is necessary, but not sufficient. Why? What follows are five problems with raw data feeds that Gridium analytics solves.
1 | no weather normalizations
Weather is one of the dominant drivers of building energy use, and every building responds to outside conditions in its own way. Seemingly random fluctuations caused by the interplay between air temperature, humidity, cloud cover, and other factors mask other, subtler changes in your building. Weather normalization uses a statistical model to separate signal from noise by characterizing the contribution of weather conditions to building energy use. Once the weather component has been characterized mathematically, it can be subtracted out of the daily load curve, yielding a “weather-normalized” picture of building energy use that more clearly shows underlying trends.
With weather normalization, you can answer the question “What portion of my building’s energy use is due to weather, and what portion is due to other factors?”
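Here is a minimal sketch of the idea, assuming a simple degree-day regression rather than Gridium’s production model; the 65°F base temperature and the linear form are illustrative assumptions.

```python
# A minimal degree-day weather normalization sketch (illustrative, not
# Gridium's actual model): fit daily kWh against heating and cooling degree
# days, then subtract the fitted weather contribution.
import numpy as np

def weather_normalize(daily_kwh, avg_temp_f, base_temp_f=65.0):
    """Return weather-normalized daily kWh via a degree-day regression."""
    daily_kwh = np.asarray(daily_kwh, dtype=float)
    avg_temp_f = np.asarray(avg_temp_f, dtype=float)

    # Heating and cooling degree days for each day.
    hdd = np.maximum(base_temp_f - avg_temp_f, 0.0)
    cdd = np.maximum(avg_temp_f - base_temp_f, 0.0)

    # Ordinary least squares: kWh ~ intercept + b1*HDD + b2*CDD.
    X = np.column_stack([np.ones_like(hdd), hdd, cdd])
    coeffs, *_ = np.linalg.lstsq(X, daily_kwh, rcond=None)

    # Subtract only the weather-driven portion, keeping the base load.
    weather_component = X[:, 1:] @ coeffs[1:]
    return daily_kwh - weather_component

# Example: a hot midweek stretch inflates usage; normalization flattens it.
kwh  = [900, 950, 1100, 1250, 1300, 1050, 920]
temp = [ 62,  66,   75,   84,   88,   72,  64]
print(np.round(weather_normalize(kwh, temp), 1))
```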
2 | no rates, no bills
Not all kilowatt-hours are created equal. Utilities vary prices by time of day and season to encourage buildings to reduce energy use during peak grid load. Time-of-use rates are designed to reward ratepayers who can shift electricity use to off-peak periods. Because rates are a zero-sum game, those rewards are underwritten by ratepayers who can’t, or don’t, shift their use.
Grid peaks are expensive for utilities, which have to provision sufficient generation capacity to handle maximum grid load. To reflect these costs, utilities charge time-of-use rates that vary with time of day, day of week, and season of the year. For example, imagine two efficiency projects: one that reduces air conditioning load and another that reduces lighting use during off-hours. The air conditioning project will have its biggest impact during expensive peak periods. The lighting project will mostly offset cheap off-peak use. Both projects might be worthwhile, but blended average rates will cloud the ROI calculations.
With rate and bill data, you can see the actual operating cost impact of the air conditioning project, isolated from the billing-period length, weather, and utility rate factors that also drive bill variance.
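To see how blended rates distort the math, here is a hedged sketch that prices the two hypothetical retrofits above under a made-up time-of-use tariff; the rates, load shapes, and dollar figures are illustrative assumptions, not any utility’s actual tariff.

```python
# Two hypothetical projects save the same total kWh, but their value differs
# sharply once savings are priced at the rate in effect when they occur.
PEAK_RATE = 0.32      # $/kWh, assumed weekday afternoon price
OFF_PEAK_RATE = 0.12  # $/kWh, assumed night/weekend price

def annual_savings(peak_kwh_saved, off_peak_kwh_saved):
    """Value savings at the rate in effect when the kWh are actually avoided."""
    return peak_kwh_saved * PEAK_RATE + off_peak_kwh_saved * OFF_PEAK_RATE

# HVAC project: most savings land in the expensive peak window.
hvac = annual_savings(peak_kwh_saved=40_000, off_peak_kwh_saved=10_000)
# Lighting project: same total kWh, but mostly overnight.
lighting = annual_savings(peak_kwh_saved=5_000, off_peak_kwh_saved=45_000)

blended = 0.18  # $/kWh blended average rate that hides the difference
print(f"HVAC:     ${hvac:,.0f} vs blended estimate ${50_000 * blended:,.0f}")
print(f"Lighting: ${lighting:,.0f} vs blended estimate ${50_000 * blended:,.0f}")
```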
3 | no expectations
We love paging through load curves as much as any professional energy nerds, and after handling about 1 million weekly load curves, we know all too well the glassy-eyed sensation that’s so effective at glossing over important changes in a building’s energy use. Computers don’t get tired.
The only constant in buildings is that they change. Occupancy changes, equipment upgrade cycles pass, special events occur, and even space use fluctuates. How can you get accurate performance feedback amid all of these changes? The secret is a learning baseline model that adapts alongside your building, accounts for the changes, and learns the new normal as it emerges.
At the click of a button, analytics applications can highlight anomalous energy use.
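Here is a minimal sketch of what “learning the new normal” can look like, assuming a simple exponentially weighted expectation per interval-of-week rather than Gridium’s actual baseline model; the smoothing factor and threshold are illustrative.

```python
# An adaptive baseline sketch: keep an exponentially weighted average per
# interval-of-week so the expectation tracks the building as it changes,
# and flag readings that stray too far from it.
from collections import defaultdict

class LearningBaseline:
    def __init__(self, alpha=0.1, threshold=0.25):
        self.alpha = alpha          # how quickly the baseline forgets old behavior
        self.threshold = threshold  # flag deviations beyond +/-25% of expected
        self.expected = defaultdict(lambda: None)

    def update(self, interval_of_week, kw):
        """Feed one meter reading; return True if it looks anomalous."""
        prior = self.expected[interval_of_week]
        anomalous = prior is not None and abs(kw - prior) > self.threshold * prior
        # Learn the new normal regardless, so the baseline keeps adapting.
        self.expected[interval_of_week] = kw if prior is None else (
            self.alpha * kw + (1 - self.alpha) * prior)
        return anomalous

# Example: a meter that usually reads ~200 kW at this hour suddenly jumps.
baseline = LearningBaseline()
for reading in [200, 205, 198, 202, 310]:
    print(reading, "anomalous?", baseline.update(interval_of_week=42, kw=reading))
```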
4 | no diagnostics
There is a further distinction between how a building’s energy use compares to its typical use (for that time of day, day of the week, and weather) and how, and when, the building is actually drawing power. Fault detection diagnostics require two things: looking very carefully at the load curve and applying sophisticated mathematical models to it. A simple example is holiday use, although that’s easy enough to spot on a raw load curve. A tougher fault to spot with the naked eye is a subtle bump in baseload; given that buildings use more energy over the course of a year while shut down than while in use, the wasted dollars add up.
With analytics applications, you can easily identify hard-to-spot drifts in building startup and shutdown times.
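As one illustration of the kind of check involved (a sketch, not Gridium’s fault detection), a baseload bump can be caught by comparing each night’s minimum demand to a trailing median; the window size and 10% bump threshold here are assumptions.

```python
# Flag nights whose minimum kW jumps above the recent trailing median,
# the sort of subtle baseload creep that hides in a raw load curve.
import statistics

def baseload_bump_days(nightly_min_kw, window=14, bump_fraction=0.10):
    """Return indexes of nights whose minimum kW exceeds the trailing
    median by more than bump_fraction (10% by default)."""
    flagged = []
    for i in range(window, len(nightly_min_kw)):
        trailing = statistics.median(nightly_min_kw[i - window:i])
        if nightly_min_kw[i] > trailing * (1 + bump_fraction):
            flagged.append(i)
    return flagged

# Example: a stuck exhaust fan adds ~8 kW to every night after day 20.
nights = [60 + (i % 3) for i in range(20)] + [68 + (i % 3) for i in range(10)]
print(baseload_bump_days(nights))
```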
5 | no forecasts
Perhaps the most space-age differentiator between the value of smart meter data and the value of analytics is forecasting future energy demand. In Lawrence Berkeley National Laboratory’s test of automated predictive baseline models for measurement & verification of building electricity use, Gridium analytics came out on top. Since peak demand charges can drive 40% of your building’s summer utility costs, ignoring peak demand management is equivalent to ignoring nearly half of your bill.
With accurate and precise forecasts, you will know when your building is likely to set its monthly peak demand, allowing you to pay attention to that half of your bill while ignoring the days that don’t matter.
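Here is a minimal sketch of that triage, assuming a simple temperature-driven regression for tomorrow’s peak rather than Gridium’s forecasting model; the temperatures, demand values, and 95% margin are illustrative assumptions.

```python
# Regress historical daily peak kW on daily high temperature, predict
# tomorrow's peak from the weather forecast, and flag only the days likely
# to set a new monthly peak demand charge.
import numpy as np

def fit_peak_model(high_temps_f, daily_peaks_kw):
    """Least-squares fit of daily peak kW as a linear function of high temp."""
    X = np.column_stack([np.ones(len(high_temps_f)), high_temps_f])
    coeffs, *_ = np.linalg.lstsq(X, np.asarray(daily_peaks_kw, float), rcond=None)
    return coeffs  # [intercept, kW per degree F]

def likely_to_set_monthly_peak(coeffs, forecast_high_f, month_to_date_peak_kw,
                               margin=0.95):
    predicted_peak = coeffs[0] + coeffs[1] * forecast_high_f
    return predicted_peak, predicted_peak >= month_to_date_peak_kw * margin

# Example: hotter days have driven higher peaks so far this summer.
temps = [78, 82, 85, 90, 95, 88, 80]
peaks = [410, 430, 455, 490, 525, 470, 420]
model = fit_peak_model(temps, peaks)
for forecast_high in (79, 93):
    kw, manage = likely_to_set_monthly_peak(model, forecast_high, 525)
    print(f"forecast high {forecast_high}F -> ~{kw:.0f} kW peak, manage: {manage}")
```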
There is one last thing to consider: the difference between simple and easy. Let’s pick on PG&E’s InterAct. Simply logging in takes six mouse clicks and requires you to recall the right username and password, and to hope the post-it note with that info hasn’t fallen off the side of your monitor. Click a load curve from one of your Snapmeter emails, and you’re taken straight and securely into your analytics dashboard, where one-click diagnostics are waiting to help you find and shed OPEX.