November 2015
Problems
Recent innovations in science and technology have provided the impetus for a new paradigm in scientific discovery: new sensing technologies allow us to probe nature with unprecedented accuracy, while new computing resources allow us to account for great complexity in physical behavior. These newfound capabilities have heightened society's expectations of the scientific process, requiring it to provide foresight and thus mitigate risks associated with ever-increasing complexity. Of recent interest to us are risks associated with carbon sequestration, next-generation (GEN IV) nuclear reactors, complex networks, and advanced materials. In these and other applications, a common challenge is to delineate the scope of predicted behaviors that can be certified given our current knowledge, while at the same time determining the worth of additional information, in the form of numerical resolution, experimental evidence, model complexity, or combinations thereof.
Approach
We adopt probability theory as the logic of science, and rely on its mathematical structure to develop a methodology that permits the consistent description and characterization of discrepancies between predictions and observations. These discrepancies are associated with numerics, experiments, and models. In particular, we take advantage of the functional analytic structure that can be endowed on probabilistic objects, thus providing us with a geometric perspective in a space whose so-called stochastic dimension reflects the complexity of the underlying phenomenon. We rely on polynomial chaos decompositions to effect orthogonal projections and approximations in this space. The curse of dimensionality is a ubiquitous challenge in our work, and we address it by relying on a mixture of model reduction approaches, ranging from information-theoretic to multiscale and statistical mechanics methods.
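As a minimal illustration of the projection step described above (not the production codes used in this research), the Python sketch below builds a one-dimensional Hermite polynomial chaos surrogate by Gauss-Hermite quadrature. The response function, expansion order, and quadrature level are illustrative assumptions.

```python
# A minimal 1-D polynomial chaos sketch (probabilists' Hermite basis),
# illustrating orthogonal projection onto the chaos basis.
import numpy as np
from numpy.polynomial import hermite_e as He
from math import factorial, sqrt, pi

def pc_coefficients(f, order, n_quad=40):
    """Project f(xi), xi ~ N(0,1), onto He_0, ..., He_order."""
    x, w = He.hermegauss(n_quad)          # Gauss-Hermite (probabilists') rule
    w = w / sqrt(2.0 * pi)                # weights now integrate against the Gaussian density
    coeffs = []
    for n in range(order + 1):
        Hen = He.hermeval(x, [0] * n + [1])             # He_n at the quadrature nodes
        c_n = np.sum(w * f(x) * Hen) / factorial(n)     # <f, He_n> / <He_n, He_n>
        coeffs.append(c_n)
    return np.array(coeffs)

# Hypothetical smooth response f(xi) = exp(0.3 * xi), used only for illustration
c = pc_coefficients(lambda xi: np.exp(0.3 * xi), order=4)
xi = np.random.default_rng(0).standard_normal(100_000)
surrogate = sum(c[n] * He.hermeval(xi, [0] * n + [1]) for n in range(len(c)))
print(c)                                      # chaos coefficients
print(surrogate.mean(), np.exp(0.3**2 / 2))   # surrogate mean vs. exact E[exp(0.3*xi)]
```

In higher stochastic dimension the basis grows combinatorially, which is where the model reduction strategies mentioned above come into play.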
Findings
By developing sampling distributions for the coefficients of the polynomial chaos expansions, construed as sample statistics, we tag some of the stochastic dimensions as reflecting epistemic uncertainty and fold them into the probabilistic structure already in place. This has provided us with meaningful error bounds around the predicted probability of failure for various systems, which is of significance to policy and decision-making. In addition, we have been able to determine the cost of increasing the confidence in these predictions, providing owners and decision makers with a solid foundation for valuing performance-based designs.
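The sketch below illustrates, under simplifying assumptions, how treating the chaos coefficients as sample statistics yields epistemic error bounds on a predicted failure probability. The data set, failure threshold, and first-order (Gaussian) chaos model are hypothetical, and a bootstrap stands in for the asymptotic sampling distribution developed in publication 2 below.

```python
# Hedged sketch: bootstrap a small data set, re-estimate a first-order Hermite
# chaos model y ~ c0 + c1*xi (coefficients taken as sample mean and standard
# deviation), and report the epistemic spread in the estimated failure probability.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
data = rng.lognormal(mean=0.0, sigma=0.25, size=50)   # hypothetical response observations
threshold = 1.8                                       # hypothetical failure threshold

def failure_prob(sample):
    c0, c1 = sample.mean(), sample.std(ddof=1)        # chaos coefficients as sample statistics
    # Under the first-order (Gaussian) chaos model, P(y > threshold) is a normal tail.
    return norm.sf(threshold, loc=c0, scale=c1)

boot = np.array([failure_prob(rng.choice(data, size=data.size, replace=True))
                 for _ in range(2000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"P_f estimate: {failure_prob(data):.4f}, 95% epistemic band: [{lo:.4f}, {hi:.4f}]")
```

The width of the reported band is the quantity that can be traded against the cost of additional data or numerical resolution.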
Impact
Our research develops the concepts, methods, and tools necessary to transition from computational science to prediction science. Through suitable methods for uncertainty quantification and management, the promise of a computational laboratory providing an adequate surrogate for physical reality may finally be at hand. This will be a critical enabling technology that greatly accelerates the transition from concept to market across many applications, while at the same time reducing lifecycle cost and facilitating the process of scientific discovery.
Selected Publications
1. Saad, G. and Ghanem, R. (2009) "Characterization of reservoir simulation models using a polynomial chaos-based ensemble Kalman filter", Water Resources Research, 45, Art. Num. W04417.
2. Das, S. and Ghanem, R. (2008) "Asymptotic sampling distribution for chaos representations of data", SIAM J. Comp. Sci., 30(5), 2207-2234.
3. Doostan, A., Ghanem, R., and Red-Horse, J. (2008) "A probabilistic construction of model validation", Comp. Meth. App. Mech. Eng., 197(29-32), 2585-2595.
4. Shi, J. and Ghanem, R. (2007) "A stochastic nonlocal model for materials with multiscale behavior", Int'l J. Computational MultiScale Eng., 4(4), 501-519.
5. Moon, S.J., Ghanem, R., and Kevrekidis, I. (2006) "Coarse graining the dynamics of coupled oscillators", Physical Review Letters, 96(14), Art. Num. 144101.