
Hydrologic and Water Quality Models: Use, Calibration, and Validation

D. N. Moriasi, B. N. Wilson, K. R. Douglas-Mankin, J. G. Arnold, P. H. Gowda


Published in Transactions of the ASABE Vol. 55(4): 1241-1247 (© 2012 American Society of Agricultural and Biological Engineers).

Submitted for review in June 2012 as manuscript number SW 9812; approved for publication by the Soil & Water Division of ASABE in August 2012.

The authors are Daniel N. Moriasi, ASABE Member, Hydrologist, USDA-ARS Grazinglands Research Laboratory, El Reno, Oklahoma; Bruce N. Wilson, ASABE Fellow, Professor, Department of Biosystems and Agricultural Engineering, University of Minnesota, St. Paul, Minnesota; Kyle R. Douglas-Mankin, ASABE Member, Professor, Department of Biological and Agricultural Engineering, Kansas State University, Manhattan, Kansas; Jeffrey G. Arnold, ASABE Fellow, Agricultural Engineer, USDA-ARS Grassland Soil and Water Research Laboratory, Temple, Texas; and Prasanna H. Gowda, ASABE Member, Agricultural Engineer, USDA-ARS Southern Plains Conservation and Production Research Laboratory, Bushland, Texas. Corresponding author: Daniel N. Moriasi, USDA-ARS Grazinglands Research Laboratory, 7207 W. Cheyenne Street, El Reno, OK 73036; phone: 405-262-5291; e-mail: daniel.moriasi@ars.usda.gov.


Abstract. To provide a common background and platform for consensual development of calibration and validation guidelines, model developers and/or expert users of commonly used hydrologic and water quality models worldwide were invited to write technical articles recommending calibration and validation procedures specific to their models. This article introduces a special collection of 22 research articles that present and discuss calibration and validation concepts in detail for 25 hydrologic and water quality models. The main objective of this introductory article is to introduce and summarize key aspects of the hydrologic and water quality models presented in this collection. The models range from field to watershed scales for simulating hydrology, sediment, nutrients, bacteria, and pesticides at temporal scales varying from hourly to annually. Individually, the articles provide model practitioners with detailed, model-specific guidance on model calibration, validation, and use. Collectively, the articles in this collection present a consistent framework of information that will facilitate development of a proposed set of ASABE model calibration and validation guidelines.

Keywords. ASABE, Calibration, Guidelines, Hydrologic models, Hydrology, Validation, Water quality, Watershed.

Hydrologic and water quality (H/WQ) models are increasingly used to evaluate the impacts of climate, land use, and land and crop management practices on the quantity and quality of land and water resources. Calibration and validation of these models are necessary before using them in research and/or real-world applications. No universally accepted procedures or guidelines for calibration and validation currently exist in the literature. However, there are numerous viewpoints among model developers and model practitioners as to how calibration and validation should be implemented and reported to assist the peer-review process and to withstand legal scrutiny (Refsgaard and Storm, 1995; Refsgaard and Storm, 1996; Refsgaard, 1997; Santhi et al., 2001; Jakeman et al., 2006; Moriasi et al., 2007; Engel et al., 2007; Bennett et al., 2010).

Numerous issues related to calibration and validation of H/WQ models have been discussed by researchers. Topics include philosophical frameworks for calibration and validation (Beven and Binley, 1992; Beven, 1993), statistical and graphical model performance evaluation methods (Loague and Green, 1991; ASCE, 1993; Legates and McCabe, 1999), general procedures for calibration and validation (Donigian et al. 1983; Santhi et al., 2001; Donigian, 2002; White and Chaubey, 2005; Engel et al., 2007; Moriasi et al., 2007), autocalibration (Beven, 1993; Gupta et al., 1998, 1999; van Griensven and Bauwens, 2003; Abbaspour et al., 2007), incorporation of uncertainty analyses in model simulations (Beven and Binley, 1992; Beven, 1993; Shirmohammadi et al., 2006; Harmel and Smith, 2007; Harmel et al., 2010), and guidance on model performance criteria (Refsgaard and Henriksen, 2004; Engel et al., 2007; Moriasi et al., 2007; Harmel et al., 2010). Even with this large body of literature on model calibration and validation, it is difficult to compare modeling results from different studies because there are no universally accepted guidelines, and users utilize different calibration and validation methods.
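
For reference, two of the most widely used statistics in these performance evaluations are the Nash-Sutcliffe efficiency (NSE) and percent bias (PBIAS). The definitions below follow the standard forms in the literature (e.g., Nash and Sutcliffe, 1970; Moriasi et al., 2007) rather than any single article in this collection; for n paired observed values O_i and simulated values S_i, with observed mean Ō:

\[
\mathrm{NSE} = 1 - \frac{\sum_{i=1}^{n}\left(O_i - S_i\right)^2}{\sum_{i=1}^{n}\left(O_i - \bar{O}\right)^2},
\qquad
\mathrm{PBIAS} = 100\,\frac{\sum_{i=1}^{n}\left(O_i - S_i\right)}{\sum_{i=1}^{n} O_i}
\]

NSE ranges from negative infinity to 1, with 1 indicating a perfect match between simulated and observed values; with this sign convention, positive PBIAS values indicate model underestimation and negative values indicate overestimation (Moriasi et al., 2007).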

The acceptance of guidelines for model calibration and validation provides many specific advantages to the modeling community, including more consistent implementation and reporting of calibration and validation and easier comparison of results among studies.

In 2010, two subcommittees (Process and Communication) were established by ASABE with the goal of developing modeling guidelines. To provide a common background and platform for consensus building, model developers and/or expert users of commonly used H/WQ models were invited to write technical articles on recommended calibration and validation procedures for their specific models. These recommended procedures are captured in this special collection. The articles not only set the stage for developing appropriate model calibration and validation guidelines but are also invaluable for the proper application of the chosen models and the reporting of results. The objective of this introductory article is to introduce and summarize key aspects of the H/WQ models presented in this special collection.

Summary of Hydrologic and Water Quality Models

There are 22 research articles in this special collection, covering 25 models (table 1). Each model is introduced with a description of the purpose for which it was developed and its recommended spatial and temporal scales. For each model, the authors also provide information on the developmental history of their model(s), research and/or real-world applications, availability of source code, and technical user support. The recommended calibration and validation methods include discussion of data screening and of ideal or minimum acceptable calibration and validation results. A case study is provided to demonstrate the application of the calibration and validation recommendations. Finally, the strengths and weaknesses of each model, as well as directions for future development, are discussed.

Tables 1 through 3 summarize important information for each of the models. Specifically, table 1 presents the processes (variables) simulated and the spatial and temporal scales for the models included, which vary in spatial scale from field to watershed and in the watershed components represented (e.g., hydrology, sediment, nutrient, and pesticide components). Of the 25 models represented, 20 models simulate both hydrology and water quality (sediment, nutrients, pesticides, etc.); two models simulate hydrology, heat transfer, and solute transport (HYDRUS, Šimunek et al., 2012; VS2DI, Healy and Essaid, 2012); and one each simulates only hydrology (DRAINMOD, Skaggs et al., 2012), solute transport in soils and groundwater (STANMOD, van Genuchten et al., 2012), and hydrology and heat transfer (SHAW, Flerchinger et al., 2012). MT3DMS (Zheng et al., 2012) is the only model focusing solely on groundwater. There are six field-scale models and six watershed-scale models; the rest simulate either at the point scale or cover ranges of scales, from point to field, plot to field, or plot to watershed. The temporal scales range from minutes to decades.

Table 2 presents information regarding whether or not each model currently has open source code, contains a GIS interface, and has available user support. Sixteen of the 25 models have open source code. For the HYDRUS (Šimunek et al., 2012) models, the HYDRUS-1D code is publicly available, whereas the code for HYDRUS (2D/3D) is distributed commercially for a nominal fee. Eighteen models, ranging from soil-column to watershed scale, have a GIS interface to help with input preparation and data manipulation during the calibration and validation process. Most of the models currently provide some form of user support. The support types include theoretical documentation, user’s manuals, GIS and Windows interface manuals, developer’s manuals, e-mail newsletters, website user groups, application guides, tutorial manuals, and workshop training.

Finally, table 3 presents information regarding calibration and validation strategies and the model performance evaluation methods demonstrated in the case studies presented in each article. All the models discussed in this collection require calibration in one form or another, as demonstrated by the case studies. Calibration procedures vary among models, with some supporting manual calibration alone and others allowing both manual and automated calibration. Most of these models also support and recommend model validation, with a split-sample strategy as the most common method. MT3DMS (Zheng et al., 2012) is the only model that does not include model validation because the authors state that “others have argued that, at least philosophically, a groundwater model, like any scientific hypothesis, cannot be validated in the absolute sense and thus the term ‘model validation’ should be avoided (Konikow and Bredehoeft, 1992).” Most models in this collection utilize both graphical and statistical methods to evaluate model performance. The graphical methods used include time series plots, scatter plots, cumulative frequency distributions, and contour maps. Some of the statistics used include root mean square error, Nash-Sutcliffe efficiency (Nash and Sutcliffe, 1970), index of agreement, percent error, mean absolute error, correlation coefficient, mean error, absolute mean error, relative error, relative bias, standard error of estimate, coefficient of model-fit efficiency, Kolmogorov-Smirnov test, coefficient of determination, model efficiency, normalized root mean square error, root mean square difference, minimum value of the nonlinear weighted objective function, percent bias, root mean square error to standard deviation ratio, 95% confidence interval to account for uncertainty, means, and standard deviation. Detailed definitions of these statistics can be obtained from the model-specific articles and elsewhere (e.g., Legates and McCabe, 1999; Moriasi et al., 2007). A few models provide performance ratings, including BASINS/HSPF (Duda et al., 2012), DRAINMOD (Skaggs et al., 2012), EPIC and APEX (Wang et al., 2012), HYDRUS (Šimunek et al., 2012), and SWAT (Arnold et al., 2012).
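
To illustrate how a few of these statistics are computed in practice, the short sketch below (in Python with NumPy; the data values, function name, and 60/40 split fraction are hypothetical and are not taken from any article in this collection) evaluates NSE, root mean square error, percent bias, and the coefficient of determination separately for the calibration and validation portions of a record, mirroring the split-sample strategy noted above.

```python
# Illustrative sketch only (not from any article in this collection): goodness-of-fit
# statistics for paired observed/simulated series, applied to a simple split-sample
# layout (first part of the record for calibration, remainder for validation).
import numpy as np

def gof_stats(obs, sim):
    """Return common goodness-of-fit statistics for paired observed/simulated arrays."""
    obs, sim = np.asarray(obs, dtype=float), np.asarray(sim, dtype=float)
    resid = obs - sim
    nse = 1.0 - np.sum(resid**2) / np.sum((obs - obs.mean())**2)  # Nash-Sutcliffe efficiency
    rmse = np.sqrt(np.mean(resid**2))                             # root mean square error
    pbias = 100.0 * resid.sum() / obs.sum()                       # percent bias (+ = underestimation)
    r = np.corrcoef(obs, sim)[0, 1]                               # correlation coefficient
    return {"NSE": nse, "RMSE": rmse, "PBIAS": pbias, "R2": r**2}

# Split-sample strategy: e.g., first 60% of the record for calibration, the rest for validation.
obs = np.array([12.0, 8.5, 15.2, 9.8, 11.4, 7.9, 13.6, 10.1, 9.3, 14.8])   # observed (hypothetical)
sim = np.array([11.1, 9.0, 14.0, 10.5, 10.8, 8.4, 12.9, 9.6, 10.0, 13.5])  # simulated (hypothetical)
split = int(0.6 * len(obs))
print("calibration:", gof_stats(obs[:split], sim[:split]))
print("validation: ", gof_stats(obs[split:], sim[split:]))
```

In a real application, the observed and simulated series would come from measured data and model output at the location of interest, and the resulting values would be judged against model-specific performance criteria such as those cited above.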

Future Work

The next steps in the development of the model calibration and validation guidelines will be determined by the Process and Communication subcommittee members. These steps may include, but are not limited to, synthesizing the model-specific recommendations in this collection into a proposed set of ASABE calibration and validation guidelines.

Table 1. Summary of simulated processes (variables) and spatial and temporal scales for H/WQ models in this collection.

Model | Simulated Processes (Variables) | Spatial Scale | Temporal Scale | Reference
ADAPT | Hydrology, erosion, nutrients, pesticides, subsurface tile drainage | Field | Daily | Gowda et al., 2012
BASINS/HSPF | Hydrology, snowmelt, pollutant loadings, erosion, fate and transport | Watershed | Daily | Duda et al., 2012
CREAMS/GLEAMS | Hydrology, erosion, pesticides, sediments, nutrients, plant growth | Field | Daily | Knisel and Douglas-Mankin, 2012
CoupModel | Hydrology, nitrogen, carbon, plant growth, heat, tracer, chloride | User defined | Minutes to years | Jansson, 2012
Daisy | Water, snowmelt, carbon cycle, energy balance, nitrogen cycle, crop production, pesticides | One to several fields | Minutes to daily | Hansen et al., 2012
DRAINMOD | Hydrology: water table depth, tile flow, surface runoff, depth of irrigation water applied, wetland hydrology. Plant growth: crop yield | Point to watershed | Hourly and daily | Skaggs et al., 2012
EPIC and APEX | Hydrology: surface runoff, streamflow, tile flow. Plant growth: erosion, sediments, nutrients, pesticides | EPIC: field; APEX: field to watershed | Daily to annual | Wang et al., 2012
HYDRUS | Water flow, solute transport, heat transfer, carbon dioxide | Column to field | Minutes to years | Šimunek et al., 2012
MACRO | Macropore flow, pesticides | One-dimensional flow; field | Minutes to decades of simulations | Jarvis and Larsbo, 2012
KINEROS/AGWA | Runoff, erosion, sediments | Plot to watershed | Event | Goodrich et al., 2012
MIKE-SHE | Surface and subsurface water dynamics, interception, evapotranspiration, overland flow, channel flow, unsaturated flow, saturated zone flow, water levels; surface and groundwater quality | Watershed | Seconds to daily | Jaber and Shukla, 2012
MT3DMS | Multispecies solute transport, groundwater | Plot to watershed | Hourly to daily | Zheng et al., 2012
RZWQM | Hydrology, plant growth, nutrients, pesticides | Plot to field | Hourly to daily | Ma et al., 2012
SHAW | Hydrology, heat transfer | Point scale | Hourly to daily | Flerchinger et al., 2012
STANMOD | Solute transport in soils and groundwater | One- and multi-dimensional transport; laboratory and field | Events | van Genuchten et al., 2012
SWAT | Hydrology, plant growth, sediments, nutrients, pesticides | Basin | Daily | Arnold et al., 2012
SWIM | Water and solute movement | Field section to field | Days to annual | Huth et al., 2012
TOUGH2 | Multiphase, multicomponent fluids in porous and fractured geologic media | No inherent limitation: pore-scale to reservoir | Generally short time steps used to solve differential equations | Finsterle et al., 2012
VS2DI | Water, solute, heat transport | No inherent limitation; point to watershed | Seconds to decades | Healy and Essaid, 2012
WAM | Hydrology, sediments, nutrients | Watershed | Daily, monthly | Bottcher et al., 2012
WARMF | Hydrology, sediments, nutrients, acid mine, carbon, bacteria | Watershed | Daily | Herr and Chen, 2012
WEPP | Hydrology, soil erosion | Hillslope and small watershed | Single storm to hundreds of years | Flanagan et al., 2012

Table 2. Access to code, presence of GIS interface, and availability of user support for H/WQ models in this collection.

Model | Open Source Code | GIS Interface | User Support Provided | Reference
ADAPT | Yes | No | Little support available | Gowda et al., 2012
BASINS/HSPF | No | BASINS: yes; HSPF: no | Yes, HSPF user’s manual and application guide | Duda et al., 2012
CREAMS/GLEAMS | Yes, available at: www.tifton.uga.edu/sewrl/Gleams/gleams_y2k_update.htm | Yes | Yes, available at: www.tifton.uga.edu/sewrl/gleams/gleams_y2k_update.htm | Knisel and Douglas-Mankin, 2012
CoupModel | No | Yes, available at: www2.lwr.kth.se/CoupModel/NetHelp/default.htm | Yes, user group from KTH has an interactive forum for users; informal courses and tutorials are available from KTH | Jansson, 2012
Daisy | Yes, available at: http://code.google.com/p/daisy-model/ | No | Yes, website with supporting information and potential assistance available at: http://code.google.com/p/daisy-model/ | Hansen et al., 2012
DRAINMOD | No, but provided to researchers by contacting developers at: www.bae.ncsu.edu/soil_water/drainmod/index.html | Yes, available at: www.bae.ncsu.edu/soil_water/drainmod/index.html | Yes, user’s guide published by USDA-NRCS; incorporated in model software | Skaggs et al., 2012
EPIC and APEX | Yes | Yes | Yes | Wang et al., 2012
HYDRUS | HYDRUS-1D available at: www.pc-progress.com/en/Default.aspx?hydrus-2d; HYDRUS (2D/3D) distributed commercially for a nominal fee | Yes, available at: www.pc-progress.com/en/Default.aspx?hydrus-2d | Yes, available at: www.pc-progress.com/en/Default.aspx?hydrus-2d | Šimunek et al., 2012
KINEROS/AGWA | Yes, available at: www.tucson.ars.ag.gov/kineros/ | Yes, available at: www.tucson.ars.ag.gov/agwa/ | Yes, available at: www.tucson.ars.ag.gov/kineros | Goodrich et al., 2012
MACRO | No | No | Yes, as allowed by time and resource constraints | Jarvis and Larsbo, 2012
MIKE-SHE | No | Yes, available at: www.mikebydhi.com | Yes, documentation at the Danish Hydraulic Institute; support available at: www.mikebydhi.com | Jaber and Shukla, 2012
MT3DMS | Yes | Yes | Yes | Zheng et al., 2012
RZWQM | Yes, upon request: rzwqmsupport@ars.usda.gov | No | Yes, upon request from: rzwqmsupport@ars.usda.gov | Ma et al., 2012
SHAW | Yes, available at: ftp.nwrc.ars.usda.gov/public/ShawModel/ | Yes | Yes, upon request from: gerald.flerchinger@ars.usda.gov | Flerchinger et al., 2012
STANMOD | Yes | No | Web-based with manuals | van Genuchten et al., 2012
SWAT | Yes, available at: http://swatmodel.tamu.edu/ | Yes, available at: http://swatmodel.tamu.edu/ | Yes, theoretical documentation, user’s manual, ArcSWAT and MapWindow interface manuals, developer’s manual, and e-mail newsletter available at: http://swatmodel.tamu.edu/; there are other user groups worldwide | Arnold et al., 2012
SWIM | Yes, available at: www.apsim.info | Yes (APSIM user interface) | Yes, available at: www.apsim.info | Huth et al., 2012
TOUGH2 | No, source code copyrighted by University of California | No | Website with supporting documents available at: http://esd.lbl.gov/files/research/projects/tough/documentation/TOUGH2_V2_Users_Guide.pdf | Finsterle et al., 2012
VS2DI | Yes, available at: http://water.usgs.gov/software/ground_water.html | Yes, available at: http://water.usgs.gov/software/ground_water.html | Yes, available at: http://water.usgs.gov/software/ground_water.html | Healy and Essaid, 2012
WAM | No, but source code can be provided to model reviewers, university students and faculty, and other collaborators | Yes | Yes, available at: www.swet.com/WAM.htm | Bottcher et al., 2012
WARMF | No, maintained by Systech Water Resources, Inc. | Yes | Yes, available at: www.epa.gov/Athens/wwqtsc | Herr and Chen, 2012
WEPP | Yes | Yes | Yes | Flanagan et al., 2012

Table 3. Summary of calibration and validation approaches and performance evaluation methods and criteria as suggested by the developers and/or expert users of the H/WQ models in this collection.

Model | Calibration Approach | Validation Approach | Suggested Performance Evaluation Methods and Criteria | Reference
ADAPT | Partition flows; compare annual water and nutrient budgets to measured or literature-reported values; sensitivity coefficient to identify most important parameters | Divide record into two equal periods, or use alternative years covering extremes in both calibration and validation | Statistical: root mean square error, Nash-Sutcliffe efficiency, index of agreement, percent error, mean absolute error, correlation coefficient. Graphical: 1:1, time series | Gowda et al., 2012
BASINS/HSPF | Iterative procedure of parameter evaluation and refinement | Split sample | Statistical: mean error, absolute mean error, relative error, relative bias, standard error of estimate, linear correlation coefficient, coefficient of model-fit efficiency, Kolmogorov-Smirnov test. Graphical: time series, scatter, cumulative frequency distribution. Performance criteria: provided in article | Duda et al., 2012
CREAMS/GLEAMS | Manual fine-tuning to achieve best comparison between simulated and observed data; stepwise procedure: hydrology, sediment, nutrients/pesticides | Split sample, adjacent watershed | Statistical: index of agreement, coefficient of determination, Nash-Sutcliffe efficiency, relative error. Graphical: time series | Knisel and Douglas-Mankin, 2012
CoupModel | Simple stepwise systematic procedure, use of double mass plot technique; manual and automated (GLUE) | Split sample | Statistical: coefficient of determination, Nash-Sutcliffe efficiency. Graphical: time series | Jansson, 2012
Daisy | Defines objective functions; stepwise procedure: bioclimate parameters, vegetation and field management parameters, soil parameters | Split sample | Statistical: comparison between observed and predicted means and standard deviations, model efficiency, root mean square error, index of agreement | Hansen et al., 2012
DRAINMOD | Possible to determine inputs without calibration; calibration recommended for some inputs | Split sample | Statistical: mean absolute error, coefficient of determination, Nash-Sutcliffe efficiency. Graphical: time series. Performance criteria: provided in article | Skaggs et al., 2012
EPIC and APEX | Determine calibration parameters; manual and automated; multi-site (if data available) | Split sample | Statistical: coefficient of determination, Nash-Sutcliffe efficiency, root mean square error, percent bias, objective functions, autocorrelation, cross-correlation, nonparametric tests, t-test. Graphical: 1:1, time series, bar. Performance criteria: provided in article | Wang et al., 2012
HYDRUS | Simple gradient-based local optimization approach (based on the Marquardt-Levenberg method) or automated | Split sample | Statistical: coefficient of determination, objective functions. Graphical: time series | Šimunek et al., 2012
KINEROS/AGWA | Simple manual to complex automated calibration (GLUE); stepwise, multi-scale calibrations recommended | Split sample, adjacent watershed | Statistical: Nash-Sutcliffe efficiency. Graphical: time series | Goodrich et al., 2012
MACRO | Forward, sequential, and iterative procedure most common; also Monte Carlo methods | Focus on individual processes | Statistical: root mean square error, Nash-Sutcliffe coefficient | Jarvis and Larsbo, 2012
MIKE-SHE | Warm-up period; manual and automated (AUTOCAL, GLUE) | Split sample | Statistical: root mean square error, index of agreement. Graphical: time series | Jaber and Shukla, 2012
MT3DMS | Manual and automated; variance-covariance matrix and resulting uncertainty in predictions can be quantified using linear and nonlinear confidence intervals and Bayesian credible intervals | No | Statistical: mean of weighted residuals, variance of weighted residuals, linear correlation coefficient. Graphical: contour maps of heads and concentrations, time series. Performance criteria: depends on application | Zheng et al., 2012
RZWQM | Manual | Split sample | Statistical: normalized root mean square error. Graphical: time series | Ma et al., 2012
SHAW | With/without calibration; sensitivity analysis; manual calibration or automated (PEST); stepwise, trial and error, optimization algorithm | Split sample | Statistical: root mean square difference. Graphical: time series | Flerchinger et al., 2012
STANMOD | Weighted nonlinear least squares method | No | Statistical: minimum value of nonlinear, weighted objective function. Graphical: time series | van Genuchten et al., 2012
SWAT | Systematic process: hydrology, sediments, nutrients, pesticides (including budgets); manual and automated | Split sample, adjacent watershed | Statistical: coefficient of determination, Nash-Sutcliffe efficiency, root mean square error, percent bias, objective functions, autocorrelation, cross-correlation, nonparametric tests, t-test. Graphical: time series | Arnold et al., 2012
SWIM | Manual, water balance | Split sample, adjacent watershed | Statistical: Nash-Sutcliffe efficiency, root mean square error to standard deviation ratio, mean error, mean absolute error, 95% confidence interval. Graphical: time series | Huth et al., 2012
TOUGH2 | Weighted least squares objective function with several minimization algorithms | Uncertainty analysis based on linear or first-order second-moment error | Statistical: minimum value of objective function | Finsterle et al., 2012
VS2DI | Manual; parameter-estimation programs (e.g., PEST) | Split sample | Statistical: weighted correlation coefficient. Graphical: time series | Healy and Essaid, 2012
WAM | Process-based: source cell nutrient load and flow generation, cell-to-stream routing, and in-stream routing | Split sample | Statistical: Nash-Sutcliffe coefficient, root mean square error. Graphical: hydrographs | Bottcher et al., 2012
WARMF | Systematically adjust model input parameters within normal ranges to match simulated results to observed data, beginning with flow; manual and automated | Split sample | Statistical: relative error, absolute error. Graphical: time series | Herr and Chen, 2012
WEPP | Stepwise procedure: hydrology, erosion | Split sample, adjacent watershed | Statistical: means, standard deviation, root mean square error, percent bias, Nash-Sutcliffe coefficient, relative root mean square error | Flanagan et al., 2012

Acknowledgements

This article introduces the ASABE 2012 Special Collection “Model Use, Calibration, and Validation” in this issue of Transactions of the ASABE. The authors would like to thank all the model developers and/or expert model users who submitted articles for their great contribution to the model calibration and validation development efforts. In addition, the authors are grateful to all the ASABE model calibration and validation guidelines development committee members for their invaluable ideas and help with the review process of all articles in this collection. The committee members include: Aaron Mittelstet, Adel Shirmohammadi, Aleksey Sheshukov, Aisha Sexton, Pouyan Nejadhashemi, Bahram Gharabaghi, Brian Benham, Bruce Wilson (Project Coordinator), Claire Baffaut, Colleen Rossi, Daniel Moriasi (Process Subcommittee Chair), Daren Harmel (Communication Subcommittee Chair), Devandra Amatya, Dharmendra Saraswat, Elizabeth Trybula, Gene Yagow, Indrajeet Chaubey, Jairo Hernandez, Jane Frankenberger, Jeff Arnold, Jorge Guzman, Kati Migliaccio, Kyle Douglas-Mankin, Laj Ahuja, Liwang Ma, Ma. Librada Chu, Manoj Jha, Margaret Gitau, Mary Leigh Wolfe, Mike Smolen, Mike White, Patti Smith, Prasad Daggupati, Prasanna Gowda, Puneet Srivastava, Rafael Muñoz-Carpena, Ramesh Rudra, Rebecca Zeckoski, Rob Malone, Rohith Gali, Sanjay Shukla, Shiv Prasher, Srinivasalu Ale, Suresh Sharma, Vinayak Shedekar, and Yongping Yuan.

REFERENCES

Abbaspour, K. C., M. Vejdani, and S. Haghighat. 2007. SWAT-CUP: Calibration and uncertainty programs for SWAT. In Proc. Intl. Congress on Modelling and Simulation (MODSIM07), 1603-1609. L. Oxley and D. Kulasiri, eds. Canberra, Australia: Modelling and Simulation Society of Australia and New Zealand.

Arnold, J. G., D. N. Moriasi, P. W. Gassman, K. C. Abbaspour, M. J. White, R. Srinivasan, C. Santhi, R. D. Harmel, A. van Griensven, M. W. Van Liew, N. Kannan, and M. K. Jha. 2012. SWAT: Model use, calibration, and validation. Trans. ASABE 55(4): 1494-1508.

ASCE. 1993. Criteria for evaluation of watershed models. J. Irrig. Drainage Eng. 119(3): 429-442.

Bennett, N. D., B. F. W. Croke, A. J. Jakeman, L. T. H. Newham, and J. P. Norton. 2010. Performance evaluation of environmental models. In Proc. Intl. Congress on Environmental Modelling and Software (iEMSs 2010), 1703-1711. D. A. Swayne, W. Yang, A. A. Voinov, A. Rizzoli, and T. Filatova, eds. Manno, Switzerland: International Environmental Modelling and Software Society.

Beven, K. 1993. Prophecy, reality, and uncertainty in distributed hydrological modeling. Adv. Water Resour. 16(1): 41-51.

Beven, K. J., and A. M. Binley. 1992. The future of distributed models: Model calibration and uncertainty prediction. Hydrol. Proc. 6(3): 279-298.

Bottcher, A. B., B. J. Whiteley, A. I. James, and J. H. Hiscock. 2012. Watershed assessment model (WAM): Model use, calibration, and validation. Trans. ASABE 55(4): 1367-1383.

Donigian, A. S., Jr. 2002. Watershed model calibration and validation: The HSPF experience. In Proc. WEF Natl. TMDL Science and Policy, 44-73. Alexandria, Va.: Water Environment Federation.

Donigian, A. S., J. C. Imhoff, and B. R. Bicknell. 1983. Predicting water quality resulting from agricultural nonpoint-source pollution via simulation: HSPF. In Agricultural Management and Water Quality, 200-249. Ames, Iowa: Iowa State University Press.

Douglas-Mankin, K. R., R. Srinivasan, and J. G. Arnold. 2010. Soil and Water Assessment Tool (SWAT) model: Current development and applications. Trans. ASABE 53(5): 1423-1431.

Duda, P. B., P. R. Hummel, A. S. Donigian Jr., and J. C. Imhoff. 2012. BASINS/HSPF: Model use, calibration, and validation. Trans. ASABE 55(4): 1523-1547.

Engel, B., D. Storm, M. White, J. Arnold, and M. Arabi. 2007. A hydrologic/water quality model application protocol. J. American Water Resour. Assoc. 43(5): 1223-1236.

Finsterle, S., M. B. Kowalsky, and K. Pruess. 2012. TOUGH: Model use, calibration, and validation. Trans. ASABE 55(4): 1275-1290.

Flanagan, D. C., J. R. Frankenberger, and J. C. Ascough II. 2012. WEPP: Model use, calibration, and validation. Trans. ASABE 55(4): 1463-1477.

Flerchinger, G. N., T. G. Caldwell, J. Cho, and S. Hardegree. 2012. Simultaneous Heat and Water (SHAW): Model use, calibration, and validation. Trans. ASABE 55(4): 1395-1411.

Goodrich, D. C., I. S. Burns, C. L. Unkrich, D. J. Semmens, D. P. Guertin, M. Hernandez, S. Yatheendradas, J. R. Kennedy, and L. R. Levick. 2012. KINEROS2/AGWA: Model use, calibration, and validation. Trans. ASABE 55(4): 1561-1574.

Gowda, P. H., D. J. Mulla, E. D. Desmond, A. D. Ward, and D. N. Moriasi. 2012. ADAPT: Model use, calibration, and validation. Trans. ASABE 55(4): 1345-1352.

Gupta, H. V., S. Sorooshian, and P. O. Yapo. 1998. Toward improved calibration of hydrologic models: Multiple and noncommensurable measures of information. Water Resour. Res. 34(4): 751-763.

Gupta, H. V., S. Sorooshian, and P. O. Yapo. 1999. Status of automatic calibration for hydrologic models: Comparison with multilevel expert calibration. J. Hydrol. Eng. 4(2): 135-143.

Hansen, S., P. Abrahamsen, C. T. Petersen, and M. Styczen. 2012. Daisy: Model use, calibration, and validation. Trans. ASABE 55(4): 1315-1333.

Harmel, R. D., and P. K. Smith. 2007. Consideration of measurement uncertainty in the evaluation of goodness-of-fit in hydrologic and water quality modeling. J. Hydrol. 337(3-4): 326-336.

Harmel, R. D., P. K. Smith, and K. L. Migliaccio. 2010. Modifying goodness-of-fit indicators to incorporate both measurement and model uncertainty in model calibration and validation. Trans. ASABE 53(1): 55-63.

Healy, R. W., and H. I. Essaid. 2012. VS2DI: Model use, calibration, and validation. Trans. ASABE 55(4): 1249-1260.

Herr, J. W., and C. W. Chen. 2012. WARMF: Model use, calibration, and validation. Trans. ASABE 55(4): 1385-1394.

Huth, N. I., K. L. Bristow, and K. Verburg. 2012. SWIM3: Model use, calibration, and validation. Trans. ASABE 55(4): 1303-1313.

Jaber, F. H., and S. Shukla. 2012. MIKE-SHE: Model use, calibration, and validation. Trans. ASABE 55(4): 1479-1489.

Jakeman, A. J., R. A. Letcher, and J. P. Norton. 2006. Ten iterative steps in development and evaluation of environmental models. Environ. Modelling Software 21(5): 602-614.

Jansson, P.-E. 2012. CoupModel: Model use, calibration, and validation. Trans. ASABE 55(4): 1335-1344.

Jarvis, N., and M. Larsbo. 2012. MACRO (v5.2): Model use, calibration, and validation. Trans. ASABE 55(4): 1413-1423.

Knisel, W. G., and K. R. Douglas-Mankin. 2012. CREAMS/GLEAMS: Model use, calibration, and validation. Trans. ASABE 55(4): 1291-1302.

Konikow, L. F., and J. D. Bredehoeft. 1992. Ground-water models cannot be validated. Adv. Water Resour. 15(1): 75-83.

Legates, D. R., and G. J. McCabe. 1999. Evaluating the use of “goodness-of-fit” measures in hydrologic and hydroclimatic model validation. Water Resour. Res. 35(1): 233-241.

Loague, K., and R. E. Green. 1991. Statistical and graphical methods for evaluating solute transport models: Overview and application. J. Contam. Hydrol. 7(1-2): 261-283.

Ma, L., L. R. Ahuja, B. T. Nolan, R. W. Malone, T. J. Trout, and Z. Qi. 2012. Root Zone Water Quality Model (RZWQM2): Model use, calibration, and validation. Trans. ASABE 55(4): 1425-1446.

Moriasi, D. N., J. G. Arnold, M. W. Van Liew, R. L. Bingner, R. D. Harmel, and T. L. Veith. 2007. Model evaluation guidelines for systematic quantification of accuracy in watershed simulations. Trans. ASABE 50(3): 885-900.

Nash, J. E., and J. V. Sutcliffe. 1970. River flow forecasting through conceptual models: Part 1. A discussion of principles. J. Hydrol. 10(3): 282-290.

Refsgaard, J. C. 1997. Parameterization, calibration, and validation of distributed hydrological models. J. Hydrol. 198(1-4): 69-97.

Refsgaard, J. C., and B. Storm. 1995. MIKE SHE. In Computer Models of Watershed Hydrology, 809-846. V. Singh, ed. Highlands Ranch, Colo.: Water Resources Publications.

Refsgaard, J. C., and B. Storm. 1996. Construction, calibration, and validation of hydrological models. In Distributed Hydrologic Modeling, 41-54. M. B. Abbot and J. C. Refsgaard, eds. Dordrecht, The Netherlands: Kluwer Academic Publishers.

Refsgaard, J. C., and H. J. Henriksen. 2004. Modelling guidelines: Terminology and guiding principles. Adv. Water Resour. 27(1): 71-82.

Santhi, C., J. G. Arnold, J. R. Williams, W. A. Dugas, R. Srinivasan, and L. M. Hauck. 2001. Validation of the SWAT model on a large river basin with point and nonpoint sources. J. American Water Resour. Assoc. 37(5): 1169-1188.

Shirmohammadi, A., I. Chaubey, R. D. Harmel, D. D. Bosch, R. Muñoz-Carpena, C. Dharmasri, A. Sexton, M. Arabi, M. L. Wolfe, J. Frankenberger, C. Graff, and T. M. Sohrabi. 2006. Uncertainty in TMDL models. Trans. ASABE 49(4): 1033-1049.

Šimunek, J., M. Th. van Genuchten, and M. Šejna. 2012. HYDRUS: Model use, calibration, and validation. Trans. ASABE 55(4): 1261-1274.

Skaggs, R. W., M. A. Youssef, and G. M. Chescheir. 2012. DRAINMOD: Model use, calibration, and validation. Trans. ASABE 55(4): 1509-1522.

Tuppad, P., K. R. Douglas-Mankin, T. Lee, R. Srinivasan, and J. G. Arnold. 2011. Soil and water assessment tool (SWAT) hydrologic/water quality model: Extended capability and wider adoption. Trans. ASABE 54(5): 1677-1684.

van Genuchten, M. Th., J. Šimunek, F. J. Leij, N. Toride, and M. Šejna. 2012. STANMOD: Model use, calibration, and validation. Trans. ASABE 55(4): 1353-1366.

van Griensven, A., and W. Bauwens. 2003. Multi-objective autocalibration for semidistributed water quality models. Water Resour. Res. 39(12): 1348-1356.

Wang, X., J. R. Williams, P. W. Gassman, C. Baffaut, R. C. Izaurralde, J. Jeong, and J. R. Kiniry. 2012. EPIC and APEX: Model use, calibration, and validation. Trans. ASABE 55(4): 1447-1462.

White, K. L., and I. Chaubey. 2005. Sensitivity analysis, calibration, and validations for a multisite and multivariable SWAT model. J. American Water Resour. Assoc. 41(5): 1077-1089.

Zheng, C., M. C. Hill, G. Cao, and R. Ma. 2012. MT3DMS: Model use, calibration, and validation. Trans. ASABE 55(4): 1549-1559.