Microchips are at the heart of the modern age, powering everything from datacenters to the PCs on employees’ desks. The sector is evolving steadily in the short term and may change dramatically in the relatively near future.
Intel, of course, is a giant in the datacenter business. Last week, The Motley Fool posted a story that focused on the company’s Data Center Group. It will, the story said, be Intel’s “primary growth engine going forward.” The DCG generated $16 billion in revenue and more than $7.8 billion in operating income in 2015, according to the piece. The arrow for the group is pointing upward, the story said:
The company said back in November of last year that, through 2019, it expects a roughly 15% compounded annual revenue growth rate for this business. This growth is expected to be driven by two major components: sales of processors and sales of non-processor components.
Central processing units will account for all but 3 percentage points of that 15 percent growth, the story said. The broader point is that just about everything in a data center starts with the chip.
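The cited figures imply a straightforward compound-growth calculation. As an illustrative sketch (only the $16 billion 2015 revenue and the roughly 15 percent compound annual growth rate come from the story; the projection itself is ours), the trajectory through 2019 would look like this:

```python
def project(base, rate, years):
    """Revenue projection for each year at a compound annual growth rate."""
    return [round(base * (1 + rate) ** n, 2) for n in range(years + 1)]

# $16B base in 2015, 15% CAGR, projected through 2019 (billions USD):
revenues = project(16.0, 0.15, 4)
print(revenues)  # [16.0, 18.4, 21.16, 24.33, 27.98]
```

At that rate, DCG revenue would roughly grow from $16 billion to about $28 billion over the four-year span.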
Energy efficiency is a key to the company’s datacenter strategy. Christine Boles, Intel’s Smart Building Solutions Director, told Energy Manager Today that the company is striving to cut datacenter costs:
An example of how an Intel chip can work to help reduce energy consumption can be seen in Intel-powered IT data centers that are fulfilling intensive data exchanges in organizations when Intel processor-based servers reduce server space, cooling costs, and energy usage by recovering data center heat and using it to heat the rest of the building. Advances in energy efficient IT equipment coupled with innovative consumption methods represent untapped opportunities for capturing data center operation savings.
The efficiency of computing equipment is important beyond the datacenter as well. After all, the demands of PCs, the mobile devices employees and visitors carry around and even networked soda machines in break rooms are an important factor in how much electricity building systems must provide. According to Boles, Intel is trying to reduce energy demands by, in essence, enabling devices to be designed with the capability of turning off elements that are not being used:
Intel power management technologies on the silicon can give software developers granular control over the system operation. For mobile applications, like handheld medical devices and ruggedized laptops, some power states save considerable power during periods of low activity and can be used to significantly extend battery runtime.
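The battery-runtime claim is easy to see with a simple blended-power model. The numbers below are hypothetical for illustration, not Intel specifications: the point is that when a device spends most of its time idle, dropping idle power from full draw to a deep low-power state multiplies runtime.

```python
def runtime_hours(battery_wh, active_w, idle_w, idle_fraction):
    """Battery runtime given a blend of active and idle power draw."""
    avg_w = active_w * (1 - idle_fraction) + idle_w * idle_fraction
    return battery_wh / avg_w

# Hypothetical handheld: 50 Wh battery, 10 W active draw, idle 70% of the time.
# Without power management, idle draw equals active draw:
print(runtime_hours(50, 10, 10, 0.7))   # 5.0 hours
# With a low-power state cutting idle draw to 0.5 W:
print(runtime_hours(50, 10, 0.5, 0.7))  # ~14.9 hours
```

Tripling the runtime without touching the battery is exactly the kind of gain that granular power-state control aims for.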
The road ahead for advanced computing is complex. Moore’s Law – formulated by Intel co-founder Gordon Moore in the 1960s – holds that the number of transistors on a chip will double roughly every two years. Experts say that the pace has slowed a bit but remains largely intact. However, the belief is that the ability to make chips ever more dense may be butting up against the limits of physics. Thus, new approaches are necessary.
Earlier this year, The Stack reported on comments by William Holt, Intel’s Executive Vice President and General Manager of its Technology and Manufacturing Group. The bottom line of the story is an important one for energy efficiency managers: The ending of Moore’s Law may require a switch to extremely esoteric approaches. These are still in the design phase, and their use will affect energy requirements, the story says:
What Holt has said suggests not just that Moore’s Law is coming to an end in practical terms, in that chip speeds can be expected to stall, but is actually likely to roll back in terms of performance, at least in the early years of semi-quantum-based chip production, with power consumption taking priority over what has been the fundamental impetus behind the development of computers in the last fifty years.
This would be a huge deal – albeit in the long term – for building managers. In general, computer-based building management functions don’t require intense processing capabilities. When heavy number crunching is necessary, it usually is sent up to the cloud. Holt is suggesting a world in which very fast on-site computing becomes a niche pursuit for a relatively small group. The rest of the computing world will use slower machines that run more efficiently.