In my recent interview with AMD's director of commercial solutions Margaret Lewis, part of the conversation covered the area of benchmarks; most notably the question of what the industry standard benchmark should be when the technology inside a computer is designed to deliver a certain amount of performance in exchange for a specific amount of energy consumption. In the old days of benchmarking (and even looking at most benchmarks today), the focus has most often been on straight-out bang (raw performance over a given time frame) or bang for the buck (aka price/performance). But with today's sharply rising cost of energy (last week, Iran warned that oil would go to $200 per barrel as the result of any embargo placed on it), businesses are and should be looking at the impact of energy costs on their bottom line. For larger businesses that run lots of servers or datacenters, the energy costs could range from significant to devastating in what I think of as a potential triple whammy.
The first whammy is the sheer cost of electricity for your computers. In many cases, there are servers sitting around that are running 24 hours a day but that are underutilized which means that they're sort of like keeping the lights on in a room for the entire day when the total amount of time spent in the room (by humans) over any given 24 hour period is 15 minutes. Such servers are great candidates for consolidation using a virtualization technology like Xen.
In other cases, companies may have pursued a scale-out strategy where scaling up might have made more sense. Whereas scaling out involves adding systems to accommodate a growing load, scaling up involves adding more processors (or these days, more "cores") to an existing system. Most people believe the latter approach is the more energy-conservative one, but there are plenty of scale-out proponents who would probably argue otherwise. The lack of a widely accepted benchmark makes the argument difficult to reconcile.
The second whammy has a lot to do with the first. The more heat that a system gives off, the more you must find ways to keep those systems cool. It's one thing to air condition office space. It's an entirely different thing to air condition a room that's full of space heaters (servers, minicomputers, mainframes, and the rest of the equipment that goes with them). As the cost of energy goes up, not only does it cost a small fortune to run all of that iron, it costs another fortune to keep it cool. So, imagine if you started looking for ways to reduce the energy requirements for your systems (without sacrificing performance), which in turn would result in a reduction of heat output, which in turn would result in a reduction of air conditioning costs.
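To put a rough number on that compounding effect, here's a back-of-the-envelope sketch. Every figure in it is an assumption made for illustration (the 100 kW IT load, the 0.8 watts of cooling per watt of IT load, and the $0.10/kWh electricity rate are mine, not from the interview), but it shows how a cut in server power carries its cooling overhead along with it:

```python
# Illustration (assumed figures): every watt saved at the server also
# saves cooling energy, so the two whammies compound.

it_load_kw = 100.0       # assumed total IT load for a small datacenter
cooling_overhead = 0.8   # assumed cooling watts needed per IT watt
rate_per_kwh = 0.10      # assumed electricity price in dollars per kWh
hours_per_year = 24 * 365

def annual_cost(it_kw):
    """Yearly electricity bill for the IT load plus its cooling."""
    total_kw = it_kw * (1 + cooling_overhead)
    return total_kw * hours_per_year * rate_per_kwh

baseline = annual_cost(it_load_kw)
after_20pct_cut = annual_cost(it_load_kw * 0.8)  # 20% less IT power

print(f"Baseline:              ${baseline:,.0f}/yr")
print(f"After 20% IT-load cut: ${after_20pct_cut:,.0f}/yr")
print(f"Savings:               ${baseline - after_20pct_cut:,.0f}/yr")
```

Under these assumptions, a 20 percent cut in server power trims the total bill by 20 percent as well, because the cooling savings ride along automatically.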
The third whammy comes at you when your neighbors are doing the same thing you might be doing -- ignoring the energy consumption problem. If everybody turns a blind eye and just assumes they can continue to pay for the energy regardless of the cost, the net result is that your local power grid becomes overwhelmed during a heat wave and then suddenly, the cost is no longer a function of money. While your systems are browned or blacked out, the revenue to your business starts to slip through your fingers. Put another way, while your energy costs are going up, the money you were counting on to pay for them is going down. Not a good situation.
To address the triple whammy, all sorts of solution providers are beginning to talk the energy-talk. Without being specific (there are so many vendors talking about it), you've got chip vendors, system vendors (including the blade folks), consolidation vendors, provisioning vendors (as an offshoot, these guys focus on dynamically reallocating limited resources to keep systems at close to 100 percent utilization), grid providers, etc. all looking to convince you that their technologies are the key to keeping a lid on rising energy costs. The good news is that this is one of those rare situations where doing the right thing for your business is also doing the socially responsible thing for your world. But comparing solutions is difficult if not impossible to do. Although lowering your energy costs may start with consolidating your servers into fewer scale-up boxes, each of which includes energy-conscious chips, it really only starts there. It requires a bit more of a holistic approach and analysis of everything in the energy foodchain (which could include a complete re-architecture of your software too).
One nearly-holistic and certainly high-level approach to benchmarking that I kind of liked recently came from CraigsList CEO Jim Buckmaster. Although fellow ZDNet blogger Russell Shaw didn't cover what Buckmaster said from the energy point of view, he did quote the online classifieds exec as saying "We do worry about how to maximize page views for kilowatt hours ... We're up to 150,000 pages per kilowatt hour, and got out of a co-lo (co-located facility) because of that." Of course, page views per kilowatt hour isn't relevant when the application doesn't involve page views. But, I think it's an interesting place to start the conversation and in this day and age of rising energy costs, it's definitely a good conversation to have.
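For the curious, Buckmaster's 150,000 pages-per-kilowatt-hour figure is easy to invert into an energy cost per page view. This sketch adds nothing beyond arithmetic on the number in the quote:

```python
# Invert Buckmaster's quoted metric (150,000 page views per kWh)
# into energy consumed per single page view.

pages_per_kwh = 150_000

wh_per_page = 1000.0 / pages_per_kwh    # watt-hours per page view
joules_per_page = wh_per_page * 3600.0  # 1 Wh = 3600 joules

print(f"{wh_per_page * 1000:.3f} mWh per page view")   # about 6.667 mWh
print(f"{joules_per_page:.0f} joules per page view")   # 24 J
```

Roughly 24 joules per page served -- a concrete unit that makes "page views per kilowatt hour" feel less abstract.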
Finally, I have one last question if anybody knows the answer. Recently, while sitting in a local coffee shop with free WiFi, I pulled out the power pack for my notebook computer and plugged it in. The owner came over and told me to put the power pack away because other notebook users had been coming in and running up her electric bill. I offered to pay extra but she said no. Two things happened as a result. First, it was my second visit to the shop and between the two visits (the first of which was with my entire family), I spent about $45. Now, I simply won't be going back. Perhaps the owner is OK with the loss of business. It's not that I'm trying to make a statement by not going back. It's more about the battery in my notebook computer, which doesn't last very long. I can't see myself making the trip over to the coffee shop and setting up and everything if I'm only allowed to work for as long as my battery lasts (less than one hour when the WiFi and Bluetooth radios are on).
Second, I began to wonder just how much my notebook usage would have cost her anyway. The power pack for my notebook says its output is 19 volts / 4.74 amps. Does anybody know how much it would cost me to run this notebook computer per hour?
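Working just from the figures on the power pack, here's a worst-case estimate. Note the caveats: 19 V x 4.74 A is the pack's maximum rated output, so a notebook typically draws less than this in practice, and the $0.10/kWh electricity rate is an assumed placeholder -- substitute the shop's actual utility rate:

```python
# Worst-case hourly cost of running a notebook from its power pack.
# Voltage and current come from the pack's label; the electricity
# rate is an assumed figure, not the shop's actual rate.

volts = 19.0          # power pack rated output voltage
amps = 4.74           # power pack maximum rated output current
rate_per_kwh = 0.10   # assumed electricity price in dollars per kWh

max_watts = volts * amps             # maximum draw, about 90 W
kwh_per_hour = max_watts / 1000.0    # energy used in one hour at max draw
cost_per_hour = kwh_per_hour * rate_per_kwh

print(f"Maximum draw:    {max_watts:.1f} W")
print(f"Worst-case cost: ${cost_per_hour:.4f} per hour")
```

Under those assumptions the answer is under a penny an hour -- roughly $0.009 even at the pack's full rated draw, which the notebook rarely reaches.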