Small sample sensitivity analysis techniques for computer models, with an application to risk assessment

Ronald L. Iman, W. J. Conover

Research output: Contribution to journal › Article › peer-review

590 Scopus citations

Abstract

As modeling efforts expand to a broader spectrum of areas, the amount of computer time required to exercise the corresponding computer codes has become quite costly (several hours for a single run is not uncommon). This costly process can be directly tied to the complexity of the modeling and to the large number of input variables (often numbering in the hundreds). Further, the complexity of the modeling (usually involving systems of differential equations) makes the relationships among the input variables not mathematically tractable. In this setting it is desired to perform sensitivity studies of the input-output relationships. Hence, a judicious selection procedure for the choice of values of input variables is required. Latin hypercube sampling has been shown to work well on this type of problem. However, a variety of situations require that decisions and judgments be made in the face of uncertainty. The source of this uncertainty may be lack of knowledge about probability distributions associated with input variables, or about different hypothesized future conditions, or may be present as a result of different strategies associated with a decision-making process. In this paper a generalization of Latin hypercube sampling is given that allows these areas to be investigated without making additional computer runs. In particular, it is shown how weights associated with Latin hypercube input vectors may be changed to reflect different probability distribution assumptions on key input variables and yet provide an unbiased estimate of the cumulative distribution function of the output variable. This allows different distribution assumptions on input variables to be studied without additional computer runs and without fitting a response surface. In addition, these same weights can be used in a modified nonparametric Friedman test to compare treatments. Sample size requirements needed to apply the results of the work are also considered. The procedures presented in this paper are illustrated using a model associated with the risk assessment of geologic disposal of radioactive waste.
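
The abstract's central idea, an experimental design over the inputs whose runs can later be reweighted under different distributional assumptions, can be sketched compactly. The snippet below is illustrative only: the Latin hypercube construction is standard, but the reweighting shown is a generic self-normalized likelihood-ratio scheme rather than the strata-based weights derived in the paper, and every distribution, function name, and toy model in it is an assumption made for this example.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def latin_hypercube(n_runs, n_vars):
    """Return an (n_runs, n_vars) array in (0, 1): one point per equal-probability
    stratum for each variable, with strata paired at random across variables."""
    strata = np.tile(np.arange(n_runs), (n_vars, 1))   # shape (n_vars, n_runs)
    strata = rng.permuted(strata, axis=1).T            # independent shuffle per variable
    return (strata + rng.random((n_runs, n_vars))) / n_runs

n_runs = 50
u = latin_hypercube(n_runs, 2)

# Base-case input distributions (assumed for illustration only).
x1 = stats.norm(0.0, 1.0).ppf(u[:, 0])       # hypothetical input variable 1
x2 = stats.uniform(0.0, 2.0).ppf(u[:, 1])    # hypothetical input variable 2

def expensive_model(x1, x2):
    """Stand-in for a costly computer code; the real code would take hours per run."""
    return np.exp(0.5 * x1) + x2 ** 2

y = expensive_model(x1, x2)                  # the expensive runs are made once

# Alternative assumption on x1 (mean shifted to 0.5): reweight the existing
# runs with normalized likelihood ratios instead of rerunning the model.
w = stats.norm(0.5, 1.0).pdf(x1) / stats.norm(0.0, 1.0).pdf(x1)
w = w / w.sum()

def weighted_cdf(threshold):
    """Estimated P(Y <= threshold) under the alternative input assumption."""
    return float(np.sum(w[y <= threshold]))

print(weighted_cdf(2.0))
```

The reweighting step mirrors the abstract's claim: once the expensive runs exist, the estimated cumulative distribution function of the output can be revisited under a different assumption on an input variable without additional model evaluations and without fitting a response surface.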

Original language: English
Pages (from-to): 1749-1842
Number of pages: 94
Journal: Communications in Statistics - Theory and Methods
Volume: 9
Issue number: 17
DOIs
State: Published - Jan 1 1980

Keywords

  • Friedman test
  • Latin hypercube sampling
  • estimation risk with uncertainty
  • multiple comparisons
  • radioactive waste depository
  • risk assessment
  • sensitivity analysis of computer models
