A short history of demand charges (and how to trim yours)

Turn-of-the-century meters measured demand and enabled demand charges. Image courtesy of Antique Atlas

Demand charges have been with us for over 100 years. Take a spin through some technology that hasn't changed much since the Victorian Era, and find out how to save a bundle on your bills.

In many energy markets, commercial customers pay for both electricity use (kilowatt-hours) and demand (kilowatts). A simple analogy with water helps explain the difference: use is how much runs through the pipe, and demand is the size of the pipe. Demand is typically measured by the highest flow rate during a given billing month.
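To make the distinction concrete, here's a minimal sketch using hypothetical 15-minute interval readings (the interval values below are illustrative, not real meter data): use is the energy summed across all intervals, while demand is the highest average power in any single interval.

```python
# Hypothetical 15-minute smart-meter readings for part of a day,
# in kWh consumed per interval. (Values are illustrative only.)
interval_kwh = [12.0, 11.5, 14.0, 25.0, 24.5, 13.0, 12.5, 12.0]

HOURS_PER_INTERVAL = 0.25  # 15 minutes

# Use: total energy that flowed, in kWh -- what the energy charge bills.
use_kwh = sum(interval_kwh)

# Demand: highest average power in any single interval, in kW -- what
# the demand charge bills. One spike sets it for the whole month.
demand_kw = max(interval_kwh) / HOURS_PER_INTERVAL

print(f"Use:    {use_kwh:.1f} kWh")
print(f"Demand: {demand_kw:.1f} kW")
```

Note how the single 25 kWh interval sets a 100 kW demand, even though most intervals run at roughly half that: that one spike is what the demand charge prices for the entire billing month.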

We’ve been explaining demand charges to property managers in the Bay Area for a few months now, and we consistently run into puzzlement. Even to building professionals, the logic of one hot day driving 30 days of expenses is confusing. Demand charges have more than tripled in PG&E territory in the last 10 years, and they now drive 25-40% of the bill from a single day of usage. Imagine if you pulled up to the gas station and the price to fill your tank was based on your longest day of driving. Or if your internet price was based on the exact moment your kids were watching YouTube.

How did this all happen?

The early days of electricity were rife with experimentation, with many years passing between commercial service and the first meters (Edison’s first tariffs were based on the number of light bulbs you had!). As meters came into existence, it became clear that utilities needed to distinguish the fixed costs of distribution from the variable costs of production. Since electricity can’t be stored and interrupting supply causes big problems for everyone, users’ fluctuating demand necessitated a huge build-out of distribution capacity. Averaging out the costs of this distribution across all customers was potentially unfair, and punishing to certain industrial customers who might choose instead to build their own power plants.

The solution was the demand charge: a charge assessed once per billing month, designed to price the fixed cost of distribution. The picture at the top of the post is one of the first commercial demand meters, an early mercury-based electrolytic model, the Wright Demand Meter. This meter enabled the Chicago Edison company to implement demand tariffs in 1897:

Each monthly bill is based upon the customer’s actual maximum in that month, and during each of the six winter months the customer is charged the full rate of 20 cents per kw-hour for all electricity consumed monthly until the consumption reaches the equivalent of 45 hours’ use, or the maximum number of lamps lighted simultaneously, as indicated by the Wright demand meter; 10 cents per kw-hour being charged for all electricity consumed in excess of that amount. During the summer months the full rate is charged for light consumption until it reaches the equivalent of 15 hours’ use of the maximum number of lamps lighted simultaneously, and for all electricity in excess of that amount the rate of charge is 10 cents per kw-hour.

Clearly, the electric industry has excelled at confusing customers with its tariffs for over 100 years.

Image courtesy of Flickr user Sorian

A modern demand meter

The amazing thing is how little has changed. To the left we show a modern model with the spinning dials we’re all familiar with and, if you look closely, a larger demand dial.

The meter is read once a month, and the tariff is applied to the maximum demand recorded that month. On antique meters, tipping the meter up returned the liquids and reset the reading; on more modern meters, a lever is reset after the meter is read.

Yes, that’s right. We have smart meters. We have 15-minute readings, sometimes available in real time. But the tariff structure is dictated by a 100-year-old piece of metering technology that is presumed to be read once a month. We are literally trapped in our own energy history.

OK, enough history. Tell us how to save money.

Demand charges feel unfair to most folks. The good news is it’s easy to fight back with a three-step recipe:

  1. Model demand: All buildings move with the weather. Determine how yours responds to temperature with a statistical model.
  2. Track your daily peak and forecast the week ahead: Track your peak and use a weather forecast to predict when you might exceed it.
  3. Reduce demand on the top days: Take simple steps to curtail on the days a new peak is likely to be set. If you avoid setting a new peak, you save your organization ~$20 per kW! If you can curtail just 50 kW, that’s $1,000 in your pocket…per month!
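The three steps above can be sketched as a toy model. Everything here is illustrative: the temperature/peak history, the week-ahead forecast, and the billing-period peak are made-up numbers, and the $20/kW figure is the rough PG&E rate quoted above. A real program would fit on a season or more of interval data, but the recipe is the same.

```python
# Step 1 -- model demand vs. weather: fit daily peak kW as a linear
# function of daily high temperature (deg F) with ordinary least squares.
# (History pairs below are hypothetical.)
history = [(70, 310.0), (75, 335.0), (80, 362.0), (85, 390.0), (90, 415.0)]

n = len(history)
mean_t = sum(t for t, _ in history) / n
mean_kw = sum(kw for _, kw in history) / n
slope = (sum((t - mean_t) * (kw - mean_kw) for t, kw in history)
         / sum((t - mean_t) ** 2 for t, _ in history))
intercept = mean_kw - slope * mean_t

def predict_peak_kw(temp_f):
    """Predicted daily peak demand at a given daily high temperature."""
    return intercept + slope * temp_f

# Step 2 -- track the billing-period peak so far, and scan a week-ahead
# temperature forecast for days likely to set a new one.
peak_so_far_kw = 400.0                      # highest demand this month
forecast_f = [78, 82, 95, 88, 73, 70, 91]   # hypothetical week ahead

risky_days = [(day, predict_peak_kw(t))
              for day, t in enumerate(forecast_f)
              if predict_peak_kw(t) > peak_so_far_kw]

# Step 3 -- curtail on those days. Holding the line at the existing peak
# avoids the demand charge on the excess kW (~$20/kW per the post).
DEMAND_RATE = 20.0  # $ per kW per month
worst_kw = max(kw for _, kw in risky_days)
savings = (worst_kw - peak_so_far_kw) * DEMAND_RATE

print(f"Curtail on forecast days: {[d for d, _ in risky_days]}")
print(f"Avoided demand charge if the peak holds: ${savings:,.0f}/month")
```

The design choice worth noting: you only need to act on the handful of days the model flags, not every hot afternoon, because the demand charge is set by the single worst interval of the month.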

If you don’t want to build your own model, we have a simple service available in PG&E territory that does this for you for a low monthly price. Give us a call to chat about how it works: (650) 924-9917.

Feel free to add your questions in the comments. We’ll also cover how to implement a demand charge reduction program in an upcoming post.

About Tom Arnold

Tom Arnold is co-founder and CEO of Gridium. Prior to Gridium, Tom Arnold was the Vice President of Energy Efficiency at EnerNOC, and cofounder at TerraPass. Tom has an MBA from the Wharton School of Business at the University of Pennsylvania and a BA in Economics from Dartmouth College. When he isn't thinking about the future of buildings, he enjoys riding his bike and chasing after his two daughters.

