The Cost of Modeling (via Skype)

Wednesday, June 24, 2015: 12:45 p.m.-1:15 p.m.
For several decades groundwater modelers have argued over the appropriate use of models, specifically whether models should be as complex as the field data and modeling software allow, or whether there are merits to using simpler models. It was soon recognized that ever-increasing model complexity (more parameters) does not necessarily lead to an improved representation of reality or to better predictive capability. Obviously, too little complexity (an oversimplified model) can also degrade predictive capability. Consequently, concepts like “model complexity control” and “optimal model complexity” now appear frequently in the literature on this topic. In all of these discussions we hear the scientist searching for the best possible model of a real-world system, or at least the best predictive model for that system. Yet most computer modeling is performed by consultants responding to the needs of their clients, and they must do so quickly and cheaply in order to be competitive. Under these circumstances they do not search for the best model that current technology can offer, but for the cheapest adequate model: adequate for answering the questions posed by the client. This cost-benefit consideration is rarely included in discussions about the appropriate level of model complexity. Including cost in the equation invites a step-wise modeling approach whereby the job is considered done as soon as the questions are answered, not when a “best” or “optimal” model is realized.
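
The step-wise, cost-aware approach described above can be pictured as a simple loop: start with the cheapest model, add complexity only while the client's question remains unanswered, and stop as soon as it is. The sketch below is not from the presentation; all names (build_model, predict_with_uncertainty, question_tolerance) are hypothetical placeholders, not a real groundwater-modeling API.

```python
def stepwise_model(question_tolerance, max_parameters,
                   build_model, predict_with_uncertainty):
    """Return the cheapest model whose predictive uncertainty is small
    enough to answer the client's question (or the most complex one tried).

    build_model(n) and predict_with_uncertainty(model) are assumed to be
    supplied by the modeler; they stand in for whatever modeling code is used.
    """
    n_parameters = 1
    while n_parameters <= max_parameters:
        model = build_model(n_parameters)                 # calibrate a model at this complexity
        prediction, uncertainty = predict_with_uncertainty(model)
        if uncertainty <= question_tolerance:             # question answered: stop here,
            break                                         # no search for a "best" model
        n_parameters += 1                                 # otherwise spend more on added complexity
    return model, prediction, uncertainty
```

The stopping rule encodes the abstract's point: the job is done when the client's question is answered at acceptable uncertainty, not when some notion of optimal complexity has been reached.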
Presenter:
Hendrik Haitjema