
TECHNIQUES OF INCREASING MODEL VALIDITY AND CREDIBILITY

i. Make use of all existing information, including conversations with subject-matter
experts (SMEs), observations of the system, existing theory, relevant results from similar
simulation studies, and the modeler's experience and intuition.
ii. Interact with the manager on a regular basis; this is essential to maintain the manager's
interest and involvement as well as a continued focus on the nature of the problem.
iii. Maintain a written assumptions document and perform a structured walk-through of the
document with both SMEs and managers, correcting any missing or invalid
assumptions.
iv. Use quantitative techniques to validate model components, for example by performing a
sensitivity analysis to determine which factors significantly impact performance (a minimal
sketch of such an analysis is given after this list).
v. Validate output from the overall simulation model through:

Comparison with expert opinion: the model has face validity if its output is consistent with the
perceived behavior of the system.

Comparison with an existing system, for example by means of a Turing test: people knowledgeable
about the system are shown output data from both the model and the real system and asked to tell
them apart; if they cannot reliably identify which output came from the model, the model passes
the test. Calibration of the model is also applicable, whereby parameters are tweaked until the
model output agrees with the system output, using one data set for calibration and another
independent data set for validation. Statistical procedures can be used as well to compare model
and system output (a minimal sketch of calibration followed by a statistical comparison is given
below).
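
As a minimal sketch of the sensitivity analysis mentioned in item iv, the following Python fragment assumes a hypothetical simulate() function (a toy single-server queue standing in for the real model) and perturbs each input factor by ±10% to see how strongly the output responds; the factor names and values are illustrative only.

```python
import random

def simulate(arrival_rate, service_rate, n_customers=1000, seed=0):
    """Hypothetical stand-in for the real model: a single-server queue
    that returns the average customer waiting time."""
    rng = random.Random(seed)
    clock = finish = total_wait = 0.0
    for _ in range(n_customers):
        clock += rng.expovariate(arrival_rate)          # next arrival time
        start = max(clock, finish)                      # wait if the server is busy
        total_wait += start - clock
        finish = start + rng.expovariate(service_rate)  # departure time
    return total_wait / n_customers

# One-factor-at-a-time sensitivity analysis: perturb each input factor by +/-10%
# and observe how strongly the model output responds.
baseline = {"arrival_rate": 0.8, "service_rate": 1.0}
base_output = simulate(**baseline)
for factor, value in baseline.items():
    for delta in (-0.10, 0.10):
        scenario = dict(baseline, **{factor: value * (1 + delta)})
        change = simulate(**scenario) - base_output
        print(f"{factor} {delta:+.0%}: output changes by {change:+.3f}")
```

Factors whose perturbation produces a large change in the output are the ones whose assumptions most need validating.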
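
The next sketch illustrates calibration against one data set followed by a statistical comparison against an independent data set, as described above. The simulate() function, the candidate parameter values and the data sets are all hypothetical, and Welch's two-sample t-test (via SciPy) is used here only as one possible statistical procedure.

```python
import random
from statistics import mean
from scipy import stats   # used here for a two-sample Welch t-test

def simulate(service_rate, seed, arrival_rate=0.8, n_customers=1000):
    """Hypothetical stand-in for the real model: returns the average waiting
    time of a single-server queue for one replication."""
    rng = random.Random(seed)
    clock = finish = total_wait = 0.0
    for _ in range(n_customers):
        clock += rng.expovariate(arrival_rate)
        start = max(clock, finish)
        total_wait += start - clock
        finish = start + rng.expovariate(service_rate)
    return total_wait / n_customers

# Illustrative real-system measurements, split into two independent sets.
calibration_data = [3.1, 2.8, 3.5, 3.0, 3.3, 2.9, 3.2, 3.4]
validation_data  = [3.0, 3.2, 2.7, 3.4, 3.1, 3.3, 2.9, 3.0]

# Calibration: tweak the service-rate parameter until the model's mean output
# agrees with the calibration data set.
candidates = [0.95, 1.00, 1.05, 1.10, 1.15]
target = mean(calibration_data)
best = min(candidates,
           key=lambda s: abs(mean(simulate(s, seed) for seed in range(10)) - target))

# Validation: compare fresh model replications against the independent
# validation data set with a statistical procedure (Welch's t-test).
model_reps = [simulate(best, seed) for seed in range(10, 20)]
t_stat, p_value = stats.ttest_ind(model_reps, validation_data, equal_var=False)
print(f"calibrated service_rate = {best}, t = {t_stat:.2f}, p = {p_value:.3f}")
```

A large p-value gives no evidence of disagreement between model and system output; it does not by itself prove the model valid.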

TECHNIQUES OF DEBUGGING

Trace-based debugging.

Trace-based debugging is predicated on the concept of breakpoints. A breakpoint is a pausing or
stopping point added to the program so that the state of program execution up to that point can be
examined. After rectifying the current bug, the developer usually sets the next breakpoint and
repeats the same process until all the bugs are corrected.
Trace-based debugging has four sub-techniques: Trace debugging (TD), Omniscient debugging
(OD), Algorithmic debugging (AD) and Hybrid debugging (HD). All of these sub-techniques are
based on the same concept of program slicing.

Code file → Insert breakpoints → Run separate slices → Locate error → Fix error
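
As an illustration of the breakpoint workflow above, the following minimal Python sketch pauses execution with the standard pdb module so the program state can be examined at that point; the buggy running_average() function is hypothetical.

```python
import pdb

def running_average(values):
    """Hypothetical buggy function: it is meant to return the mean of `values`."""
    total = 0
    for v in values:
        total += v
    # Breakpoint: execution pauses here so the program state (total, values)
    # can be examined in the pdb prompt before the suspect statement runs.
    pdb.set_trace()                     # or simply: breakpoint()
    return total / (len(values) - 1)    # off-by-one bug located by inspection

if __name__ == "__main__":
    print(running_average([2, 4, 6]))   # expected 4.0, the buggy code prints 6.0
```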

Spectrum-based debugging

In spectrum-based debugging, the debugging process is done by monitoring the statements included
in a particular execution trace. This is achieved by using the program spectrum to identify the
active parts of the program during its run. The program spectrum is a collection of runtime
information that gives a view of the dynamic behavior of the program; it includes flags
corresponding to different parts of the program. These spectra are used to pinpoint the exact
sections of the code that run for specific or abstract inputs.

Code file → Execute → Collect runtime data → Report behavior
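
The following minimal sketch illustrates the idea with hypothetical coverage data: each test run contributes the set of lines it executed (its spectrum), and the Ochiai suspiciousness formula, used here as one common choice, ranks lines that appear mostly in failing runs.

```python
from math import sqrt

# Hypothetical program spectra: for each test run, the set of line numbers that
# were executed (the coverage flags) and whether the run passed.
spectra = [
    ({1, 2, 3, 5}, True),    # passing run
    ({1, 2, 4, 5}, True),    # passing run
    ({1, 3, 4, 5}, False),   # failing run
    ({1, 3, 5},    False),   # failing run
]

total_failed = sum(1 for _, passed in spectra if not passed)
all_lines = set().union(*(coverage for coverage, _ in spectra))

def suspiciousness(line):
    """Ochiai score: lines executed mostly by failing runs rank highest."""
    ef = sum(1 for cov, passed in spectra if line in cov and not passed)
    ep = sum(1 for cov, passed in spectra if line in cov and passed)
    return ef / sqrt(total_failed * (ef + ep)) if ef else 0.0

for line in sorted(all_lines, key=suspiciousness, reverse=True):
    print(f"line {line}: suspiciousness {suspiciousness(line):.2f}")
```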

Delta debugging

Delta debugging automatically minimizes failing test cases. It takes test cases that may cause the
error and prepares an error report; from that report, minimal test cases are selected based on
their high probability of reproducing the error. These minimal test cases regenerate the same
error and thus help the developer locate the bug behind it.

Test cases with possible error → Code file → Execute → Error report
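
The following minimal Python sketch illustrates the idea of delta debugging with a hypothetical fails() oracle; it is a simplified greedy reduction rather than the full ddmin algorithm, but it shows how a failing test case is shrunk while the error keeps reproducing.

```python
def fails(test_input):
    """Hypothetical test oracle: the bug is triggered whenever the input
    contains both 'b' and 'g' (a stand-in for a real error check)."""
    return "b" in test_input and "g" in test_input

def minimize(test_input):
    """Simplified delta debugging: repeatedly try dropping chunks of the
    failing input and keep any smaller version that still fails."""
    chunk = len(test_input) // 2
    while chunk >= 1:
        reduced = False
        i = 0
        while i < len(test_input):
            candidate = test_input[:i] + test_input[i + chunk:]
            if candidate and fails(candidate):
                test_input = candidate   # smaller input still reproduces the error
                reduced = True
            else:
                i += chunk               # this chunk is needed; move past it
        if not reduced:
            chunk //= 2                  # retry with finer granularity
    return test_input

print(minimize("abcdefgh"))              # prints a minimal failing case: "bg"
```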
