Understanding Decision-Support Effectiveness: A Computer Simulation Approach

Jeffrey E. Kottemann, Kathleen M. Boyer-Wright, Joel F. Kincaid, Fred D. Davis

Research output: Contribution to journal › Article › peer-review



The interplay between decision making and decision-support tools has proven puzzling for many years. One of the most popular decision-support tools, what-if analysis, is no exception: decades of empirical studies have found positive, negative, and null effects. In this paper, we contrast the marginal-analysis decision-making strategy enabled by what-if analysis with the anchoring-and-adjustment strategies prevalent among unaided decision makers. Using an aggregate production planning decision task, we develop a Monte Carlo simulation to model 1000 independent what-if decision-making episodes across a wide range of conditions. The results mirror, and help explain, seemingly contradictory findings across multiple prior experiments. This paper thus formalizes a simulation approach that expands the scope of previous findings on unaided versus what-if-aided decision making and shows that relative performance is quite sensitive to task conditions. In this light, the performance differences reported in past research are to be expected. While our analysis involves a single task context, the larger and more important point is that, even within a single task context, performance differences between unaided and aided decision making are emergent.
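The contrast the abstract draws can be illustrated with a small Monte Carlo sketch. The code below is not the authors' model; it is a hypothetical toy version of the setup they describe: a cost-minimizing production decision under noisy demand, where an "unaided" decision maker anchors on the previous figure and adjusts partway toward the forecast, while a "what-if" decision maker evaluates candidate production levels against the forecast and picks the cheapest. All function names, cost parameters, and distributions are illustrative assumptions.

```python
import random

def cost(production, demand, holding=1.0, shortage=3.0):
    # Asymmetric cost: leftover units incur holding cost,
    # unmet demand incurs a (higher) shortage cost.
    gap = production - demand
    return holding * gap if gap >= 0 else shortage * -gap

def unaided(anchor, forecast, adjust=0.5):
    # Anchoring and adjustment: move only part of the way
    # from the previous decision (anchor) toward the forecast.
    return anchor + adjust * (forecast - anchor)

def what_if(forecast, candidates):
    # What-if / marginal analysis: try each candidate level
    # against the forecast and keep the cheapest one.
    return min(candidates, key=lambda p: cost(p, forecast))

def simulate(episodes=1000, seed=0):
    rng = random.Random(seed)
    candidates = [i * 5.0 for i in range(41)]  # levels 0, 5, ..., 200
    totals = {"unaided": 0.0, "what_if": 0.0}
    anchor = 100.0
    for _ in range(episodes):
        demand = rng.gauss(100.0, 20.0)
        forecast = demand + rng.gauss(0.0, 10.0)  # noisy forecast
        totals["unaided"] += cost(unaided(anchor, forecast), demand)
        totals["what_if"] += cost(what_if(forecast, candidates), demand)
        anchor = forecast  # last forecast becomes the next anchor
    return {k: v / episodes for k, v in totals.items()}

print(simulate())
```

Which strategy averages lower cost in such a sketch depends on the forecast noise, the cost asymmetry, and the adjustment rate, which is consistent with the paper's point that relative performance is sensitive to task conditions.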

Original language: English
Pages (from-to): 57-65
Number of pages: 9
Journal: IEEE Transactions on Systems, Man, and Cybernetics Part A: Systems and Humans
Issue number: 1
State: Published - Jan 2009


  • Decision making
  • decision-support systems (DSSs)
  • simulation


