The numbers for IBM’s global energy profile are as impressive as you’d expect from a global tech company. Its 900 locations in 71 countries represent 84 million square feet, and its worldwide electricity consumption for 2016 was 3,695 gigawatt-hours.
One of the company’s ongoing goals is 3.5% annual energy conservation, explains Greg Peterson, the company’s manager of global energy and environment. He and his team are responsible for improving energy efficiency at IBM facilities. Over time, they’ve completed plenty of standard conservation projects. But replacing chillers and changing light fixtures only goes so far.
“You’ve got to come up with new ideas and technologies,” he says. “That’s what drove us to start looking at cognitive solutions and using more analytics.”
Peterson spoke about incorporating big data analysis into performance monitoring at the Environmental Leader Conference in June. We caught up with him recently to learn about how IBM uses cognitive systems to save money and energy in its buildings.
What does “big data” mean for you?
When I talk to the mainframe folks, they laugh because what I call “big data” is not at all big to them. It’s not even a blip on their radar screen. But it’s big data to us and for our systems, and it’s two-way communication most of the time.
How is IBM using that data to increase energy efficiency?
We’ve got three cutting-edge systems installed at many of our buildings. One, called Smarter Buildings, is an internally developed program we started working on about five years ago. We developed and tested it in-house, and it became a product for IBM.
It constantly monitors the equipment within your building (air handlers, chillers, boilers, whatever you want to monitor) and operates off a set of rules or analytics. For example, if you have an air-handling unit that is simultaneously heating and cooling, you’re wasting a lot of energy. In the summer, we’ve probably got a steam valve that isn’t all the way closed. A high-priority alert will go directly to a maintenance technician. Then we’ll get it fixed.
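To make that concrete, here is a minimal sketch of what such a rule might look like in Python. The telemetry field names, thresholds, and alert routing are hypothetical illustrations, not the actual Smarter Buildings rule set:

```python
# Minimal sketch of a rules-based check like the one described above.
# Field names, thresholds, and alert routing are hypothetical, not the
# actual Smarter Buildings rule set.

def check_simultaneous_heat_cool(reading):
    """Flag an air-handling unit that is heating and cooling at once,
    e.g. a steam valve that has not fully closed in summer."""
    heating = reading["steam_valve_pct"] > 5        # valve leaking through
    cooling = reading["chilled_water_valve_pct"] > 5
    if heating and cooling:
        return {
            "unit": reading["unit_id"],
            "priority": "high",
            "message": "AHU simultaneously heating and cooling; "
                       "check the steam valve.",
        }
    return None

# One telemetry sample from a hypothetical air handler
sample = {"unit_id": "AHU-3", "steam_valve_pct": 12, "chilled_water_valve_pct": 40}
alert = check_simultaneous_heat_cool(sample)
if alert:
    print(f"[{alert['priority'].upper()}] {alert['unit']}: {alert['message']}")
```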
We have [Smarter Buildings] installed in about 145 sites or campuses, representing just about half of our total energy usage. In its current state, that generates about 30,000 data points every 15 minutes. When we put building management systems in 10 or 20 years ago, we weren’t anticipating moving this kind of data. The nice thing about Smarter Buildings is that it’s an enterprise system in the cloud: you make one change to the rules and it applies across all the sites.
What difference has this system made?
Last year, within IBM, we had about $1.8 million in energy savings from Smarter Buildings alone. This year we have an energy-savings target of $2.4 million. There are also maintenance savings, because the alerts tend to point to the root cause of the problem.
We used to recommission our buildings on about a three-year cycle. We would go through the entire building’s mechanical system with a team of maintenance technicians and engineers, check everything, fix everything. We’d come back three years later and find there were a lot of things that needed repair again. By putting this system in, it’s like continuous commissioning. We’re finding — and repairing — the problems as they occur.
During the conference you talked about optimizing chilled water systems. How does that work?
Typically if you have a large facility, you have a chilled water system. That’s how you air condition the building. We spend a good bit of money around the world cooling our facilities. Roughly 40% of all our energy goes to cooling the buildings. Every chiller system is different. You can design two side by side and they’ll each operate a little differently.
We found a software product called Optimum Energy that monitors the entire chiller system and figures out in real time how to most efficiently produce that chilled water for a facility. It’s considering the weather, equipment, temperatures, past usage trends, building use. It’s also making changes to the equipment so the system becomes more efficient as time goes on.
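Conceptually, an optimizer like that runs a closed loop: read the plant’s current state, predict power use under candidate settings, and push the best setting back to the equipment. The sketch below is purely illustrative; the efficiency model, setpoint bounds, and inputs are invented placeholders, since Optimum Energy’s actual algorithms are proprietary:

```python
# Illustrative closed-loop chiller-plant optimization. The efficiency model
# and all numbers are invented placeholders, not Optimum Energy's algorithm.
from dataclasses import dataclass, replace

@dataclass
class PlantState:
    outdoor_temp_f: float    # weather input
    load_tons: float         # current cooling load
    chw_setpoint_f: float    # chilled-water supply setpoint

def predicted_kw(state: PlantState) -> float:
    """Placeholder model: chiller power grows with the temperature 'lift'.
    A real optimizer would learn this from the plant's past usage trends."""
    lift = max(state.outdoor_temp_f - state.chw_setpoint_f, 1.0)
    return state.load_tons * 0.6 * (lift / 50.0)

def optimize_setpoint(state: PlantState, lo=42.0, hi=50.0) -> float:
    """Try nearby setpoints within safe bounds; pick the lowest predicted kW."""
    candidates = [state.chw_setpoint_f + d for d in (-2, -1, 0, 1, 2)]
    candidates = [sp for sp in candidates if lo <= sp <= hi]
    return min(candidates,
               key=lambda sp: predicted_kw(replace(state, chw_setpoint_f=sp)))

state = PlantState(outdoor_temp_f=88.0, load_tons=450.0, chw_setpoint_f=44.0)
print("New chilled-water setpoint:", optimize_setpoint(state))
```

In a real plant the loop would repeat every few minutes, which is why two-way communication matters: the software both reads the sensors and writes new settings back to the equipment.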
There are, on average, about 400 points per site at roughly one-minute intervals, and it’s two-way communication. We have that at 12 sites right now and we’re probably going to add another eight shortly. We’re saving about $4.5 million a year, and we’re seeing a 25 to 35% energy reduction on our chiller plants.
What’s the third example of a system IBM has in place?
In a data center, you’ve got IT equipment on the raised floor, and computer room air conditioning (CRAC) units around the perimeter, sometimes even in the middle of the floor. They are fed by chilled water and produce cold air. That cold air is fed underneath the raised floor and then goes up through perforated tiles into the cold aisles to feed the IT equipment.
A product we purchased, the Vigilent Dynamic Cooling Management System, is similar to the chiller optimization. We put temperature sensors throughout the data center and controllers on the CRAC units, and all of that information feeds back to a central piece of cognitive software that controls the CRAC units.
If you’ve ever been to a data center, some areas are warmer and have more equipment than others. Typically you overcool the entire data center because there’s a hot spot somewhere; these data centers can run at 70 to 72 degrees. You’re wasting energy.
[The management system] puts cooling where it’s needed. In areas that require very little cooling, the CRACs can back down or even turn themselves off. [The system] is learning as it goes, figuring out which CRACs are supplying what areas of the data center. It’s not intuitive. You would think a CRAC would supply the area right around it, but it doesn’t necessarily work that way. [The system] creates influence maps showing where the CRACs are actually cooling. We see the cooling energy go way down — many times cut by over 50%. Those are big numbers. We’re implementing this in all of our data centers worldwide.
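A toy version of that influence-map learning might look like the following; the perturbation data and the threshold heuristic are invented for illustration and are not Vigilent’s actual method:

```python
# Toy sketch of learning a CRAC "influence map": which sensors each CRAC
# actually cools. The data and threshold heuristic are invented; Vigilent's
# actual learning method is proprietary.

# Temperature rises (degrees F) seen at three rack sensors after briefly
# backing each CRAC down: hypothetical perturbation data.
observations = {
    "CRAC-1": {"sensor_A": 2.1, "sensor_B": 0.2, "sensor_C": 1.8},
    "CRAC-2": {"sensor_A": 0.1, "sensor_B": 2.6, "sensor_C": 0.3},
}

def influence_map(obs, threshold=1.0):
    """A CRAC 'influences' a sensor if perturbing it warms that sensor."""
    return {crac: [s for s, dt in deltas.items() if dt > threshold]
            for crac, deltas in obs.items()}

print(influence_map(observations))
# -> {'CRAC-1': ['sensor_A', 'sensor_C'], 'CRAC-2': ['sensor_B']}
# As noted above, a CRAC's influence isn't necessarily the racks beside it.
```

With a map like that, the software can back down or shut off the CRACs whose zones need little cooling while keeping the hot spots covered.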
Is having these types of cognitive systems an important trend?
It seems to be, because nothing is static in our world. Take a data center: every day we’re moving equipment in and out. If you’re trying to adjust for that manually, it’s really hard. But when you have a system that’s cognitive, it self-adjusts. I think you’re going to see it in more of the systems that control buildings as time goes on.
Greg Peterson spoke about big data analytics at the 2017 Environmental Leader Conference in Denver.