Completing Nuclear Energy's Digital Transformation


As the nuclear power industry enters its second great era of growth, a new generation of plants will make electricity using the same fundamentals of physics first harnessed some 60 years ago. But the way these plants are designed, built and operated will be enormously different from their ancestors. Today’s computer-aided design systems can “assemble” the millions of parts of a nuclear plant entirely in silicon. Whole facilities can be “built” this way before the first shovel ever touches dirt, letting builders discover and avoid bugs before they become costly physical mistakes. Printed documentation is disappearing, too, as context-sensitive software delivers the right answer, faster, to operators.

Yet the legacy of the industry’s analog roots is still with us. It can be hard to imagine, but the first generation of commercial nuclear plants was conceived and built in a pre-computer age, when slide rules were used for calculations, sketches were done on graph paper and final plans came on blueprints. It’s a testament to the remarkable intelligence and dedication of the first corps of U.S. nuclear engineers that in a span of roughly 30 years, with scant computing power, they designed and built what is still today the world’s largest and most reliable fleet of commercial nuclear reactors.

For all their ingenuity, though, analog-era nuclear designers were never able to crack a problem that still dogs the industry today, that of information immobility. Back then, designing, building and documenting a single piece of equipment could easily result in scores of blueprints and stacks of three-ring binders. Any amendment to those designs created a cascade of changes that rippled through the paper documents, requiring veritable armies of specialists to swap new information into the archives. Errors and omissions were inevitable.

Information immobility can even lead to the gradual, unintentional erosion of knowledge. When the designer-builders of that era handed over the keys of a new facility to its owner-operators, transferring the plant’s “operating manual” literally meant backing up a truck and unloading tons of documents. Often those materials went unused, though, as technicians learned practices on the job. It’s no wonder that, in the fleet-wide audits that followed the accident at Three Mile Island in 1979, many facilities had to spend heavily to reconcile their documentation with the actual processes at their plants.

Today, these problems are much diminished, but by no means eliminated. Indeed, while there has been no new nuclear construction in the U.S. for some 30 years, design and operations software has been advancing continuously in the automotive, shipbuilding and aerospace industries, often subsidized by the government. In state-of-the-art maintenance systems being developed by the military, for example, when a technician looks at a piece of equipment, visual recognition software can identify the gear and present relevant data and instructions on a wearable display. The rendering the software calls up may have been created years earlier by a design engineer and later enhanced with new data, such as supplier details, material properties or service records. The nuclear industry is poised to benefit from this investment by borrowing the best of these innovations.

To benefit from these advances, though, the nuclear industry has to overcome multiple challenges. Even as industry players learn how to retrofit these technologies into existing plants, they also need to adapt and transplant their best practices into tomorrow’s new reactors. From China to the Middle East and Latin America, more than 60 greenfield plants are under construction in 15 countries, with many more in planning stages.

For new projects, the opportunity exists to build information architectures from the ground up that take advantage of the sorts of cutting-edge technologies described above. Indeed, one advantage of creating an adaptable software foundation at the outset is that developing advanced applications later costs far less than it does in facilities that never fully digitize and integrate their operations. Such a move can also lower long-term upgrade costs. Consider that the lifecycle of a plant’s info tech—the mix of software and hardware used to manage a facility—is five to 15 years. Over the typical 60-year lifespan of a nuclear plant, that works out to anywhere from four to a dozen info-tech refresh cycles.

Analog-era plants can see big benefits too. With U.S. reactors leading the way, many of the world’s more than 430 existing nuclear power plants will go through relicensing to extend their operations by another 10, 20 or 30 years. These steps open the door to substantial upgrades to a nuclear facility’s physical plant and its digital nervous system.

U.S. nuclear sites have shown the fruits of such efforts. Thanks in part to a steady commitment to process optimization and software investment, U.S. operators have improved operations to levels of safety and output unequaled elsewhere. In the early 1970s, the industry’s average capacity factor—how much energy the fleet produces as a share of its maximum potential—was less than 50 percent. Last year, it exceeded 90 percent.
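For readers who want the metric made concrete, here is a minimal sketch of the capacity-factor arithmetic. The 1,000 MW reactor rating and its annual output below are hypothetical figures chosen only to illustrate the calculation; they are not data from this article.

```python
# Minimal sketch of the capacity-factor calculation described above.
# The reactor rating and annual output are hypothetical, illustrative values.

HOURS_PER_YEAR = 8760  # 365 days * 24 hours


def capacity_factor(energy_generated_mwh: float,
                    rated_capacity_mw: float,
                    hours: float = HOURS_PER_YEAR) -> float:
    """Energy actually produced as a share of the maximum possible output."""
    return energy_generated_mwh / (rated_capacity_mw * hours)


if __name__ == "__main__":
    # A hypothetical 1,000 MW unit that generated 7.9 million MWh in a year.
    cf = capacity_factor(energy_generated_mwh=7_900_000, rated_capacity_mw=1_000)
    print(f"Capacity factor: {cf:.1%}")  # roughly 90 percent
```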

These are metrics China and other new builders are eager to match. As they forge ahead, new plants have the potential to reduce risk in every respect—from design and construction costs to operations and public safety—by adapting the best, long-developed practices in the U.S. and Europe. The next generation of plants also has the opportunity to go one better, by building in digital intelligence that deeply links the design, construction and operations of plants in ways no earlier plants have done. The potential rewards are higher safety, lower construction costs now and lower operating costs into the future.

Neil Gerber is a Global Power Generation Solution Executive in IBM's Energy & Utilities Industry.

Environment + Energy Leader