Verification and validation are two essential steps in any simulation study. In verification we ask whether the computer model was implemented correctly. In validation we ask whether the model is an accurate representation of reality or, in other words, whether the predicted performance is sufficiently close to the actual system's performance.

Verification

A sufficiently complex model almost certainly contains errors.

  • It’s unlikely you will find them all.
  • The main goal of this phase is to find and fix as many as possible.
  • A key issue is ownership: if you created the model, it is hard to find the errors in your own code.

With that in mind, these are some useful methods for verifying DES models:

  1. Make initial estimates and check outputs for reasonableness (see the sketch after this list).
  2. View the animation.
  3. Trace the model.
  4. Debug interactively (use the watches and breakpoints available in your software).
  5. Run the model under varying conditions.
  6. Document the model.
  7. Ask someone to walk through the model with you.
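
For method 1, a useful tactic is to configure the model as a system with a known analytical answer and compare. Below is a minimal sketch in Python, assuming the SimPy library; the arrival and service rates are illustrative choices, not values from any particular system:

    import random
    import simpy

    LAM, MU = 0.8, 1.0   # illustrative M/M/1 arrival and service rates
    waits = []

    def customer(env, server):
        arrive = env.now
        with server.request() as req:
            yield req                                   # queue for the server
            waits.append(env.now - arrive)              # record time in queue
            yield env.timeout(random.expovariate(MU))   # service time

    def source(env, server):
        while True:
            yield env.timeout(random.expovariate(LAM))  # interarrival time
            env.process(customer(env, server))

    random.seed(42)
    env = simpy.Environment()
    server = simpy.Resource(env, capacity=1)
    env.process(source(env, server))
    env.run(until=50_000)

    # Analytical M/M/1 mean wait in queue: Wq = lam / (mu * (mu - lam)) = 4.0
    print(f"simulated Wq = {sum(waits) / len(waits):.2f}, expected about 4.00")

A large gap between the simulated and analytical values points to an implementation error before any validation effort begins.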

 

Validation

When to validate?

  • When simulating a new system design:
    • Only limited validation is possible before the system exists.
    • It is better to validate after the system is built, because real performance data then become available for comparison.
  • When simulating changes to an existing system:
    • Validate the model for the system "as is".
    • Calibrate as needed.
    • Implement changes in calibrated model.

How can validation be done?

  • Independently (by an outside party).
    • Best for large scale simulation models.
  • Internally (by simulation team).
    • Better to have the final user(s) involved, as this aids model credibility.
  • Use both objective and subjective measures.
  • Timing:
    • Concurrently with model development is the best approach.
    • But validate again after the model is completed.

What to validate?

  • The concept (conceptual model validity):
    • Are the assumptions correct?
    • Is the conceptual model reasonable for its intended purpose?
  • The model:
    • Do our experiments show alignment between the computerized model and the conceptual model?
  • The data:
    • Are the data adequate and correct?
  • The operation:
    • Are the results sufficiently accurate for the intended purpose of the model?

Errors!!
Project management errors, data errors, model logic errors, and experimentation errors may all cause your model to behave unexpectedly. Make sure you consider them all.
Examples of data validity errors:

  • Using a mean instead of a distribution (see the sketch after this list).
  • Using a distribution when variation has known causes.
  • Improper modeling of machine failures:
    • time-based vs. count-based failures; elapsed time vs. run time.
    • adjusting process time instead of modeling failures explicitly.
  • Assuming independence when unjustified.
    • e.g., order data (number of items, quantity).
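
The first point above is worth quantifying. By the Pollaczek-Khinchine formula, mean queueing delay depends on the variance of the service time, not only its mean, so feeding the model a constant "average" time understates congestion. A small worked example with illustrative rates:

    lam, mu = 0.8, 1.0                        # illustrative arrival/service rates
    rho = lam / mu
    # Pollaczek-Khinchine: Wq = lam * E[S^2] / (2 * (1 - rho))
    es2_const = (1 / mu) ** 2                 # constant service: E[S^2] = E[S]^2
    es2_exp = 2 / mu ** 2                     # exponential service: E[S^2] = 2/mu^2
    print(lam * es2_const / (2 * (1 - rho)))  # 2.0 -- mean only
    print(lam * es2_exp / (2 * (1 - rho)))    # 4.0 -- full distribution

Replacing the exponential distribution with its mean halves the predicted wait, a data validity error that no amount of careful coding will catch.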

Examples of logic errors:

  • Viewpoint (see the sketch after this list):
    • Active entity, passive server: use a "push" approach.
    • Active server: more appropriate for "pull".
  • Control logic:
    • if dispersed throughout the model, subtle errors creep in.
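
The two viewpoints look like this in process-oriented code. A minimal sketch, again assuming SimPy, with made-up names and service times:

    import simpy

    # Push: the entity is the active process and seizes a passive server.
    def job_push(env, name, server):
        with server.request() as req:
            yield req
            yield env.timeout(1.0)                    # service time
            print(f"{env.now:.1f} push: {name} done")

    # Pull: the server is the active process and draws work from a buffer.
    def server_pull(env, buffer):
        while True:
            item = yield buffer.get()
            yield env.timeout(1.0)
            print(f"{env.now:.1f} pull: {item} done")

    env = simpy.Environment()
    server = simpy.Resource(env, capacity=1)
    buffer = simpy.Store(env)
    for i in range(2):
        env.process(job_push(env, f"job{i}", server))
        buffer.put(f"item{i}")
    env.process(server_pull(env, buffer))
    env.run()

Mixing the two viewpoints, or scattering the control logic across many such processes, is where the subtle errors mentioned above tend to appear.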

Approach to validation

  • Prior to developing the model, make an agreement with the model sponsors:
    • Specify minimum set of validation techniques to be used.
    • Specify the amount of accuracy required for output variables.
  • Test the model for face validity in each model iteration.
    • Does the model appear reasonable to model users and to other knowledgeable people?
  • Test it over a range of input parameters.
  • Compare model predictions (a statistical comparison is sketched after this list) to:
    • past performance of the actual system.
    • a baseline model representing an existing system.
  • For a new system, compare implemented model behavior to assumptions and specifications.
  • Develop validation documentation and include it in the simulation model's documentation.
  • If the model is to be used over a period of time, develop a schedule for periodic review of the model validity.
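
As an objective measure for the comparison steps above, replicated model outputs can be tested against observed system data. A minimal sketch, assuming SciPy is available; the numbers are invented for illustration:

    from scipy import stats

    observed = [41.2, 39.8, 43.1, 40.5, 42.0]    # throughput measured on the real system
    simulated = [40.1, 42.3, 41.7, 39.9, 41.0]   # one value per independent replication

    # Two-sample t-test: a small p-value suggests the model and the system
    # differ on this output by more than sampling noise.
    t_stat, p_value = stats.ttest_ind(observed, simulated)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

Whether a statistically detectable difference matters is settled by the accuracy requirements agreed with the model sponsors at the start of the project.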