Researchers in the worldwide race toward miniaturization, nanoscience, molecular modeling of drugs and biological systems, advanced materials, and other applications must all grapple with events at atomistic or molecular levels, and they have run into a formidable roadblock: the "Tyranny of Scales." The term refers to the modeling of physical events that operate across large ranges of scale - 12 orders of magnitude in time, as in the modeling of protein folding, or 10 orders of magnitude in space, as in the design of advanced materials. At those ranges, conventional methods are rendered useless.

The ICES Multiscale Modeling Group (MmG) has developed a general approach to multiscale modeling that copes with the tyranny of scales through so-called "Goal-Oriented Algorithms." In these algorithms, a specific quantity of interest to be predicted at a specific scale is first identified. The algorithms then adaptively add information from various scales to the simulation in order to systematically control the errors in that quantity of interest. The group has developed a mathematical theory to estimate the discretization and modeling errors that each scale of the phenomena contributes to the quantity of interest.

One application area within the MmG, where these algorithms are being tested and further refined, involves the modeling and simulation of nano-manufacturing of semiconductors through a process called "Step and Flash Imprint Lithography." This work is supported by the U.S. Department of Energy Program in Applied Mathematics.

A second area of active research within the MmG is the development of mathematical theory, algorithms, and computational tools for calibration, verification and validation, and uncertainty quantification of complex biological systems. This work addresses age-old, fundamental questions of science: can we be sure that our computational models really predict events that will happen under given conditions in the future, and how certain are we of the actual outcomes of such predictions? These questions are at the heart of what is now called predictive science, in which the totality of what is known about a physical event - the mathematical model, its parameters, initial conditions, and data from experimental observations - is processed to yield predictions with quantified levels of uncertainty. The foundations of the methods used in this group are Bayesian statistics, a subject born more than two centuries ago in the writings of Thomas Bayes but only now recognized as a unifying theory in predictive science.
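As a toy illustration of the Bayesian viewpoint just described, the sketch below calibrates a single model parameter from noisy observations and reports the prediction together with its uncertainty. All names and numbers here are illustrative assumptions, not the MmG's actual models or tools; the posterior is evaluated on a simple grid.

```python
# Toy Bayesian calibration: infer a model parameter theta from noisy
# observations y_i = theta + noise, and report a prediction with
# quantified uncertainty. Grid-based posterior; purely illustrative.
import math

def bayesian_calibrate(observations, noise_std=0.5,
                       prior_mean=0.0, prior_std=2.0, grid_n=2001):
    """Return the posterior mean and standard deviation of theta."""
    lo = prior_mean - 5 * prior_std
    hi = prior_mean + 5 * prior_std
    thetas = [lo + (hi - lo) * i / (grid_n - 1) for i in range(grid_n)]

    def log_prior(t):          # Gaussian prior belief about theta
        return -0.5 * ((t - prior_mean) / prior_std) ** 2

    def log_likelihood(t):     # Gaussian measurement noise
        return sum(-0.5 * ((y - t) / noise_std) ** 2 for y in observations)

    log_post = [log_prior(t) + log_likelihood(t) for t in thetas]
    m = max(log_post)
    w = [math.exp(lp - m) for lp in log_post]   # unnormalized posterior
    z = sum(w)
    mean = sum(t * wi for t, wi in zip(thetas, w)) / z
    var = sum((t - mean) ** 2 * wi for t, wi in zip(thetas, w)) / z
    return mean, math.sqrt(var)

mean, std = bayesian_calibrate([1.1, 0.9, 1.3, 1.0])
print(f"theta = {mean:.2f} +/- {std:.2f}")
```

With these (made-up) observations the posterior concentrates near the sample mean, and the reported standard deviation is exactly the "quantified level of uncertainty" that predictive science demands of a forecast.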
Developing predictive models of multiscale phenomena is one of the most challenging open problems in computational science and is an area under active study by the MmG.
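The goal-oriented strategy described earlier - control the error in one chosen quantity of interest by refining only where local contributions to that error are largest - can be sketched with a much-simplified one-dimensional analogue. The adaptive Simpson quadrature below is an assumption-laden stand-in for the MmG's algorithms, not their actual method: the quantity of interest is an integral, and a local error indicator decides where to refine.

```python
# Much-simplified analogue of goal-oriented adaptivity: refine only
# where the local contribution to the error in the quantity of
# interest Q (here, Q = integral of f) exceeds its share of the
# tolerance. Illustrative sketch, not the MmG's actual algorithms.
import math

def qoi_adaptive_integrate(f, a, b, tol=1e-8):
    """Adaptive Simpson rule driven by local error indicators."""
    def simpson(x0, x2):
        x1 = 0.5 * (x0 + x2)
        return (x2 - x0) / 6 * (f(x0) + 4 * f(x1) + f(x2))

    def recurse(x0, x2, whole, tol):
        x1 = 0.5 * (x0 + x2)
        left, right = simpson(x0, x1), simpson(x1, x2)
        err = (left + right - whole) / 15   # local error indicator
        if abs(err) <= tol:                 # contribution to Q is small
            return left + right + err
        # refine only the subintervals that still pollute the QoI
        return (recurse(x0, x1, left, tol / 2)
                + recurse(x1, x2, right, tol / 2))

    return recurse(a, b, simpson(a, b), tol)

Q = qoi_adaptive_integrate(math.sin, 0.0, math.pi)  # exact value is 2
print(Q)
```

The design choice mirrors the idea in the text: effort is not spread uniformly across the domain (or across scales) but concentrated exactly where it measurably improves the predicted quantity of interest.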

*The University of Texas MD Anderson Cancer Center*, Houston, Texas
*The University of Texas at San Antonio*, San Antonio, Texas
*Eindhoven University of Technology*, Eindhoven, Netherlands

© 2012 Institute for Computational Engineering and Sciences