Without nuclear testing, the task of certifying America's stockpile falls to computer simulation. Modeling the thermodynamics of a nuclear blast requires millions of variables, and the experiments that anchor those models must comply with the nuclear test moratorium, arms limitation treaties, and the budgetary and political realities that go with them. In response, laboratories such as Los Alamos are changing their approach to simulating large, complicated problems, the Department of Energy is encouraging the recognition of verification and validation as a scientific discipline in its own right, and five universities in the Predictive Science Academic Alliance Program are training scientists and engineers in the new field of "predictive science."
The Cold War is over, but nuclear deterrence is hardly a thing of the past. Although the United States has not detonated a nuclear weapon since 1992, the country still maintains a vast stockpile of them.
Assuring the safety and reliability of those weapons is the job of the National Nuclear Security Administration. NNSA oversees the U.S. Stockpile Stewardship Program, which certifies that the weapons will work if they are ever needed to defend the nation, and also assures that they are safe.
NNSA, part of the Department of Energy, has tasked three national laboratories—Los Alamos, Sandia, and Lawrence Livermore—with analyzing the weapons to predict their performance, safety, and reliability. They do that through vast computer simulations involving the physics of a nuclear detonation, validated using a wide array of fundamental science experiments.
Simulating the thermodynamics of a nuclear blast requires millions of variables. The events being modeled contain stresses and shocks that cannot be produced by any ordinary means, heat like nothing on earth, and millions of nanosecond time steps.
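The reason for those nanosecond time steps can be sketched with a back-of-the-envelope calculation. Explicit shock-physics codes are only numerically stable when the time step is shorter than the time a sound wave needs to cross one mesh cell (the CFL condition). The mesh spacing, sound speed, and safety factor below are illustrative assumptions, not figures from any weapons code:

```python
# Rough illustration (all numbers assumed) of why explicit shock-physics
# simulations end up taking nanosecond time steps.

cell_size = 1.0e-5        # 10-micron mesh spacing, assumed
sound_speed = 5.0e3       # ~5 km/s in shocked metal, assumed
cfl_number = 0.5          # typical stability safety factor

# CFL condition: dt must be less than the cell-crossing time of a sound wave.
dt = cfl_number * cell_size / sound_speed
print(f"stable time step: {dt:.1e} s")   # prints "stable time step: 1.0e-09 s"

total_time = 1.0e-3       # simulate one millisecond of physics, assumed
steps = total_time / dt
print(f"steps needed: {steps:.0f}")      # prints "steps needed: 1000000"
```

Refining the mesh or simulating longer events only multiplies the step count, which is why these runs demand some of the world's largest supercomputers.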
Although the simulations are challenging, the role of simulations continues to increase significantly in the absence of nuclear testing. As Scott Doebling, of Los Alamos National Laboratory, has pointed out, “Simulations of this scale and complexity are filled with uncertainties. The details of what actually happens in thermonuclear devices during an explosion are still the subject of considerable study among scientists.”
Doebling has extensive experience in simulation, verification, and validation, much of it through work on nuclear weapons. He is a group leader in the Computational Physics Division at Los Alamos. He also chairs ASME's V&V 10 Verification and Validation Standards Subcommittee.
Because of the complexity and uncertainties in these simulations, leading engineering organizations such as Los Alamos are changing their approach to simulating large, complicated problems such as the thermodynamics of nuclear explosions. The new approaches to these simulations are vital to the Stockpile Stewardship Program for America's nuclear weapons.
Good simulations depend heavily on validation with experimental data. The Stockpile Stewardship Program's challenge is that the experimental data from even the most heavily instrumented bomb tests is incomplete. “Most of the scientists working with that data today did not participate first-hand in the planning and execution of these experiments,” Doebling said. “In many cases today's scientists and engineers are working with experimental data that is older than they are.”
According to Doebling, as the ranks of subject-matter experts with first-hand experience of physical nuclear tests decline, the scientific value of the data in these experiments is more difficult to extract. “As the years pass, extracting good information gets harder and harder,” Doebling said. “Even so, every weapon performance simulation is validated to a large degree with data gleaned from nuclear tests.”
Some of the experimental data still in use dates back to the 1950s, before almost any digital computing, and there is even legacy information in use that comes from the Manhattan Project in World War II. For nearly 20 years, the only nuclear tests have been virtual recreations of explosions inside a supercomputer.
Confidence and Consequence
Stockpile Stewardship is groundbreaking science with a big budget—about $7.6 billion for fiscal year 2011 including components, production, and engineering—and big risks. Fully appreciating the risks, NNSA set up the Advanced Simulation and Computing program in 1995, just three years after all nuclear testing was halted. Coupled with a vigorous experimental science program, ASC designs, procures, and operates some of the world's largest supercomputers while also developing, implementing, validating, and maintaining software to enable simulation of nuclear weapons and conduct leading-edge scientific research. The ASC budget request for fiscal 2011 is about $600 million.
ASC reflects an increasing emphasis on simulations in the nuclear weapons program. With a great deal of support, new analyses are continually being developed at Los Alamos, Livermore, and Sandia. Doebling pointed out that “analyses are done the fundamental way: A hypothesis is developed and simulations are used to test it, along with the experimental data that we do have.”
The 1992 nuclear test moratorium aside, there are additional dimensions to the challenges surrounding experimental data. “In every experiment, whether it's part of an old bomb test or a new non-nuclear experiment, we are still discovering new things that show us that our current computational models are incomplete,” Doebling said. “That is why good experimental data is still the gold standard.”
What is of most interest today are the interactions among physical phenomena. These interactions have always been key to understanding the behavior of complex engineered systems.
“So now a big question in stockpile stewardship becomes: Do we have a sufficiently high level of confidence in the simulation to rely on it to inform our assessment of risk?” Doebling said. “Is the simulation predictive when we, as a nation, don’t want to or can’t afford to do full-scale tests to validate?”
Along with level of confidence, there are questions about the level of consequence. As he puts it, “How well do we understand what might happen and how would we deal with those consequences?” In stockpile stewardship, experiments have to comply with the nuclear test moratorium, arms limitation treaties, and budgetary and political realities that go with them.
So the issue is “how to determine how much confidence we can have,” Doebling said. “What is the methodology to determine that? What are the consequences of uncertainty, or of an error? Ultimately, the question in simulations—any of them—comes down to what to believe and what not to believe.”
Scientists are not, however, without resources to answer these questions and the underlying challenges of experimental data. Two of the most powerful tools are uncertainty quantification and verification and validation, especially when UQ and V&V are used together.
“UQ deals with what we as scientists, analysts, and engineers are unsure of in building a simulation to solve a problem,” Doebling explained. “UQ gives us a way to credibly assign quantitative values to what we are uncertain about.” V&V is one of the more effective tools “to organize information about the comparative metrics, uncertainty analysis, and numerical approximations that are fundamental to computer simulations,” he said. V&V does this in the context of real-world physics and experiments (whatever is available), and then correlates with the analysis being built into the simulation.
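Neither the weapons codes nor their inputs are public, but the basic mechanics of UQ can be sketched with a toy model: treat uncertain inputs as probability distributions, propagate them through the simulation by Monte Carlo sampling, and report statistics on the output. The stand-in physics model, parameter ranges, and sample count here are illustrative assumptions only:

```python
import random
import statistics

def simulate(yield_strength, density):
    """Toy stand-in for an expensive physics code:
    returns a scalar response quantity of interest."""
    return yield_strength / density * 100.0

# Uncertain inputs described as probability distributions
# (all distributions and units below are assumed for illustration).
N = 10_000
random.seed(0)
outputs = []
for _ in range(N):
    ys = random.gauss(mu=350.0, sigma=15.0)   # yield strength, MPa, assumed
    rho = random.uniform(7.7, 7.9)            # density, g/cm^3, assumed
    outputs.append(simulate(ys, rho))

# The spread of the outputs is the quantified uncertainty in the prediction.
mean = statistics.fmean(outputs)
stdev = statistics.stdev(outputs)
print(f"response: {mean:.1f} +/- {stdev:.1f}")
```

In practice each sample of a real weapons simulation costs hours on a supercomputer, so much of UQ research concerns doing this with far fewer, smarter samples; the brute-force loop above just shows the idea.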
According to Doebling, “It's a scientifically motivated way to ensure that a simulation actually addresses the original problem, and that the analysis will be accurate.”
Scientists who work with V&V and UQ say they are powerful conceptual tools to peer deeply into simulations, to see which physical phenomena are being analyzed (and which are not), and how the analysis is to be done. That makes V&V and UQ crucial in ensuring the validity of simulations and analyses.
“What we are trying to define is how to do very complex simulations effectively,” Doebling said. “Especially for uncertainty analysis, what methods should we use? What validation metrics should we choose for simulation and experiment? We are constantly pushing the science to become more predictive.”
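One simple example of the kind of validation metric Doebling describes is a normalized discrepancy: the gap between simulation and experiment measured against the combined uncertainty of both. This is a generic illustration, not a metric attributed to Los Alamos; the numbers are assumed:

```python
import math

def validation_metric(sim_value, sim_unc, exp_value, exp_unc):
    """Normalized discrepancy: |simulation - experiment| divided by the
    root-sum-square of the two uncertainties. Values near or below 1
    suggest agreement within the stated uncertainties."""
    combined = math.sqrt(sim_unc**2 + exp_unc**2)
    return abs(sim_value - exp_value) / combined

# Illustrative numbers only: a predicted quantity of 102 +/- 3
# against a measured 98.5 +/- 2.
m = validation_metric(sim_value=102.0, sim_unc=3.0,
                      exp_value=98.5, exp_unc=2.0)
print(f"normalized discrepancy: {m:.2f}")  # prints "normalized discrepancy: 0.97"
```

Choosing which quantities to compare this way, and what threshold constitutes adequate agreement, is exactly the kind of methodological question the field is still working out.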
ASME subcommittee V&V 10 has developed a standard (V&V 10-2006) approved by the American National Standards Institute. This standard addresses computational solid mechanics, which in practice typically uses finite element analysis. As formalized by ASME, V&V 10-2006 is one of the first standards to deal with the size, scope, and complexity of simulations required by 21st-century engineering.
The DOE is encouraging the recognition of verification and validation as a scientific discipline in and of itself. In the Predictive Science Academic Alliance Program, five universities—California Institute of Technology, University of Michigan, Purdue University, Stanford University and the University of Texas-Austin—conduct research to support the NNSA's stockpile stewardship mission. That mission includes training scientists and engineers in the new field of “predictive science.”
For the PSAAP university partners, predictive science cuts across the boundaries between traditional physics and engineering disciplines. According to Doebling, students in the program “are being trained to think across the academic and professional disciplines rather than becoming more and more highly specialized in a single field.”
At Los Alamos, “We are greatly cheered when scientists, especially newly hired technical staff, question a simulation. They ask, ‘What parts of the physics are we missing here? What are we not seeing?’ ”
According to Doebling, it is by raising questions like these and figuring out methodologies to discern the answers that the Stockpile Stewardship Program advances.
When the U.S. Department of Energy created the Predictive Science Academic Alliance Program with five leading universities, it was looking to do more than virtually test nuclear weapons. The goal of the program is to focus on “predictive science” based on verified and validated large-scale simulations that can predict properties of complex systems, particularly where experimental tests of the physics are not feasible.
Predictive science is applicable to a wide range of analyses that require extremely large, complicated models, such as biological systems, global economics, nuclear weapons effects, climate modeling, and manufacturing.
The PSAAP universities are conducting such leading edge modeling and analysis as:
The California Institute of Technology in Pasadena is studying the high energy-density dynamic response of materials in impacts of metallic projectiles and targets at velocities up to 10 kilometers per second.
The University of Michigan at Ann Arbor is developing a software framework for radiation hydrodynamics in shock waves.
Purdue University in West Lafayette, Ind., is simulating microelectromechanical systems technologies, from contact physics and membrane response to multiscale modeling of damping, and integrating it all into a coherent simulation system.
Stanford University in Stanford, Calif., is working on simulating air-breathing hypersonic vehicles and is studying failure modes such as shock waves and error estimation.
The University of Texas in Austin is modeling the re-entry of space vehicles into the atmosphere, which involves such phenomena as aerothermochemistry, thermal radiation, turbulence, and the response of complex materials to extreme conditions.