2 - Advances in Software Inspections
INTRODUCTION
[Figure: development schedule spanning design, coding, and testing, with inspections, ending at ship.]
Fig. 1.
What's in a Name?
In contrast to inspections, walkthrus, which can range
anywhere from cursory peer reviews to inspections, do
not usually follow a repeatable process or collect data
(as inspections do), and hence the process cannot
be reasonably studied and improved. Consequently, their
defect detection efficiencies are usually quite variable and,
when studied, were found to be much lower than those of
inspections [2], [3]. However, the name "walkthru" (or
"walkthrough") has a place, for in some management and
national cultures it is more desirable than the term
"inspection" and, in fact, the walkthrus in some of these
situations are identical to formal inspections. (In almost
all instances, however, the author's experience has been
that the term walkthru has been accurately applied to the
less efficient method; which process is actually in use can
be readily determined by examining whether a formally
defined development process with exit criteria is in effect,
and by applying the criteria in [2, Table 5] to the activity.
In addition, initiating walkthrus as a migration path to
inspections has led to a lot of frustration in many
organizations because once they start with the informal, they
seem to have much more difficulty moving to the formal
process than do those that introduce inspections from the
start. And, programmers involved in inspections are
usually more pleased with the results. In fact, their major
complaints are generally to do with things that detract
from inspection quality.) What is important is that the
same results should not be expected of walkthrus as is
required of inspections, unless a close scrutiny proves the
process and conduct of the "walkthru" is identical to that
required for inspections. Therefore, although walkthrus
do serve very useful though limited functions, they are
not discussed further in this paper.
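The measurability argument above can be made concrete with a small sketch: because a formal inspection process records how many defects were found at inspection time versus how many escaped to later phases, its defect-detection efficiency becomes a computable quantity. The function name and the figures below are illustrative assumptions, not data from the paper.

```python
# Hypothetical sketch of the kind of measurement a formal inspection
# process enables. The numbers are illustrative, not from the paper.

def detection_efficiency(found_by_inspection: int, found_later: int) -> float:
    """Fraction of all known defects caught at inspection time."""
    total = found_by_inspection + found_later
    return found_by_inspection / total if total else 0.0

# Example: 38 defects found in inspections, 12 escaped to test.
eff = detection_efficiency(38, 12)
print(f"{eff:.0%}")  # 76%
```

A walkthru that collects no such data offers nothing to compute, which is precisely why its efficiency cannot be studied or improved in the same way.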
[Figure: branches for the moderator, programmers, and managers, with education and approved/certified materials covering local project needs (e.g., exit/entry criteria), contributing to inspection quality.]
Fig. 2. Fishbone diagram of contributors to inspection quality.
REFERENCES
[1] L. H. Fenton, "Response to the SHARE software service task force
report," IBM Corp., Kingston, NY, Mar. 6, 1984.
[2] M. E. Fagan, "Design and code inspections to reduce errors in program development," IBM Syst. J., vol. 15, no. 3, 1976.
[3] IBM Technical Newsletter GN20-3814, Base Publication GC20-2000-
0, Aug. 15, 1978.
[4] T. D. Crossman, "Inspection teams, are they worth it?" in Proc. 2nd
Nat. Symp. EDP Quality Assurance, Chicago, IL, Mar. 24-26, 1982.
[5] R. R. Larson, "Test plan and test case inspection specification," IBM
Corp., Tech. Rep. TR21.585, Apr. 4, 1975.
[6] T. D. Crossman, "Some experiences in the use of inspection teams
in application development," in Proc. Applicat. Develop. Symp.,
Monterey, CA, 1979.
[7] G. D. Brown and D. H. Sefton, "The micro vs. the applications
logjam," Datamation, Jan. 1984.
[8] J. H. Morrissey and L. S.-Y. Wu, "Software engineering: An economical perspective," in Proc. IEEE Conf. Software Eng., Munich, West Germany, Sept. 14-19, 1979.
[9] B. Boehm et al., Characteristics of Software Quality. New York:
American Elsevier, 1978.
[10] F. O. Buck, "Indicators of quality inspections," IBM Corp., Tech.
Rep. IBM TR21.802, Sept. 1981.
[11] K. Ishikawa, Guide to Quality Control. Tokyo, Japan: Asian Productivity Organization, 1982.