
UNIT – 1

Conventional Software Management: The Waterfall Model, Conventional Software Management
Performance. Evolution of Software Economics: Software Economics, Pragmatic Software Cost
Estimation.

The best thing about software is its flexibility: it can be programmed to do almost anything. The
worst thing about software is also its flexibility. The “almost anything” characteristic has made it
difficult to plan, monitor, and control software development. This unpredictability is the basis of
what has been referred to for the past 30 years as the “software crisis.”

In the mid-1990s, three important analyses of the state of the software engineering industry
were performed. All three reached the same general conclusion: the success rate for
software projects is very low. Their findings can be summarized as follows.

1. Software development is still highly unpredictable. Only about 10% of software
projects are delivered successfully on time and within their initial budget and
schedule estimates.
2. The management discipline is more of a discriminator in success or failure than
are technology advances.
3. The level of software scrap and rework is indicative of an immature process.

These three analyses provide a good introduction to the magnitude of the
software problem and the current norms for conventional software project
management performance. Most software engineering texts present the waterfall
model as the source of the conventional software process.

1.1 The Waterfall Model


In theory: In 1970, Winston Royce (father of Walker Royce) presented a paper titled
“Managing the Development of Large Scale Software Systems” at IEEE WESCON.

It made three primary points:


1. There are two essential steps common to the development of computer programs:
analysis and coding.
2. In order to manage and control all of the intellectual freedom associated with
software development, one must introduce several other “overhead” steps, including
system requirements definition, software requirements definition, program design,
and testing. These steps supplement the analysis and coding steps.
Fig. 1.1 shows the basic programming steps and the large-scale system approach.

SPM UNIT - I
3. The basic framework described in the waterfall model is risky and invites
failure. The testing phase, which occurs at the end of the development cycle, is the
first event for which timing, storage, input/output transfers, etc. are experienced as
distinguished from analyzed. The resulting design changes are likely to be so
disruptive that the software requirements upon which the design is based are likely
to be violated. Either the requirements must be modified or a substantial design
change is warranted.

Item 1, which is seemingly trivial, will be expanded later into the separation of the
engineering stage from the production stage. To eliminate most of the development
risks alluded to in item 3, Royce described five improvements to the basic waterfall
model in his paper.

1. “Program design” comes first.

The first step toward a fix is to insert a preliminary program design phase between the
software requirements generation phase and the analysis phase. By this technique,
the program designer assures that the software will not fail because of storage,
timing, and data flux. As analysis proceeds in the succeeding phase, the program
designer must impose on the analyst the storage, timing, and operational constraints
in such a way that he senses the consequences. If the total resources to be applied are
insufficient, or if the embryonic operational design is wrong, it will be recognized at
this early stage, and the iteration with requirements and preliminary design can be
redone before final design, coding, and test commence. This program design
procedure is implemented in the following steps:
Begin the design process with program designers, not analysts or programmers.
Design, define, and allocate the data processing modes even at the risk of being
wrong: allocate processing functions, design the database, allocate execution
time, define interfaces and processing modes with the operating system, describe
input and output processing, and define preliminary operating procedures.
Write an overview document that is understandable, informative, and current, so
that every worker on the project can gain an elemental understanding of the
system.
Today we use the term “architecture-first” development rather than program design.

2. Document the design.

Development efforts required huge amounts of documentation: manuals for
everything.

• User manuals, operation manuals, program maintenance manuals, staff
user manuals, test manuals, etc.

Why do we need so much documentation?
1. Each designer MUST communicate with various stakeholders: interface
designers, managers, customers, testers, and developers.
2. During the early phases, the documentation IS the design.
3. The real monetary value of the documentation is to support later modifications
by a separate test team, maintenance team, and operations personnel who are
not software literate.

Today, visual modeling provides much of this documentation.

3. Do it twice.

History argues that the delivered version is really version #2.

In version 1, the major problems and alternatives are addressed.
In version 2, version 1 is refined and the major requirements are implemented.

Version 1 is often austere; version 2 addresses its shortcomings.

Today this approach is seen as a precursor to architecture-first development: the
initial engineering is done first, forming the basis for iterative development and
early risk resolution.

4. Plan, control, and monitor testing.

The largest consumer of project resources (manpower, computing time, management
judgment, etc.) is the test phase. It is the phase of greatest risk in terms of cost and
schedule: it occurs last, when alternatives are least available and expenses are at a
maximum. It is also typically the phase that is shortchanged the most.

To do:

1. Employ a team of test specialists who were not responsible for the original design.
2. Employ visual inspections (code reviews and other technical reviews) to spot
obvious errors: interface mistakes, wrong addresses, missing factors, dropped
minus signs, etc.
3. Test every logic path.
4. Employ a final checkout on the target computer.

Today: plan, control, and monitor testing.

5. Involve the customer.

Involve the customer in requirements definition, preliminary software review, and
preliminary program design (critical design review briefings).

Today: involving the customer and all stakeholders is critical to overall project
success. Demonstrate increments, solicit feedback, embrace change, and develop the
software cyclically and iteratively. Address risk early.

Overall appraisal of the waterfall model:

o Criticism of the waterfall model is misplaced.
o The theory is fine.
o The practice is what was poor.

The waterfall model in practice:

Projects destined for trouble frequently exhibit the following symptoms:

1. Protracted integration and late design breakage
2. Late risk resolution
3. Requirements-driven functional decomposition
4. Adversarial stakeholder relationships
5. Focus on documents and review meetings

1. Protracted integration and late design breakage:

The software was compilable and executable, but it was not necessarily complete,
compliant, or up to specification. The typical development of a software project that
used the waterfall model followed this sequence:

 Early paper designs and thorough briefings
 Commitment to code very late in the cycle
 Integration nightmares due to unforeseen implementation and interface issues
 Heavy budget and schedule pressure to get the system working
 Late “shoe-horning” of non-optimal fixes, with no time for redesign
 A very fragile, unmaintainable product delivered late

The following figure shows the progress profile of a conventional software project.

In the conventional process, testing consumed 40% or more of life-cycle resources. The
following table shows a typical profile of cost expenditure across the spectrum of
software activities.

Activity                    Cost
Management                    5%
Requirements                  5%
Design                       10%
Code and Unit Testing        30%
Integration and Testing      40%
Deployment                    5%
Environment                   5%
Total                       100%
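As a quick arithmetic check, the percentages in this profile can be applied to a total project budget. A minimal Python sketch; the $1,000,000 total is an invented figure for illustration:

```python
# Typical conventional life-cycle cost profile (percent of total effort).
COST_PROFILE = {
    "Management": 5,
    "Requirements": 5,
    "Design": 10,
    "Code and Unit Testing": 30,
    "Integration and Testing": 40,
    "Deployment": 5,
    "Environment": 5,
}

def allocate_budget(total, profile=COST_PROFILE):
    """Split a total budget across activities according to the percentage profile."""
    assert sum(profile.values()) == 100, "profile must total 100%"
    return {activity: total * pct / 100 for activity, pct in profile.items()}

budget = allocate_budget(1_000_000)
# Integration and testing alone consume 40% of the total budget.
print(budget["Integration and Testing"])  # 400000.0
```

Seen this way, integration and testing consume as much as design plus code-and-unit-test combined, which is exactly the late-breakage symptom described above.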

2. Late risk resolution:

A serious issue associated with the waterfall model was the lack of early risk
resolution. The following figure shows a typical risk profile for conventional
waterfall projects. It includes four distinct periods of risk exposure, where risk is
defined as the probability of missing a cost, schedule, feature, or quality goal.

Early in the life cycle, as the requirements were being specified, the actual risk
exposure was highly unpredictable.
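The risk definition above (the probability of missing a cost, schedule, feature, or quality goal) is commonly quantified as risk exposure: probability times the size of the loss. A minimal sketch; all risk names, probabilities, and loss figures below are invented for illustration:

```python
def risk_exposure(probability, loss):
    """Risk exposure = probability of missing a goal * size of the resulting loss."""
    return probability * loss

# Hypothetical risks for a waterfall-style project (figures are illustrative).
risks = [
    ("late integration breakage", 0.6, 100_000),
    ("requirements churn",        0.4,  50_000),
    ("schedule slip",             0.5,  80_000),
]

# Ranking by exposure shows which risks deserve early resolution.
for name, p, loss in sorted(risks, key=lambda r: -risk_exposure(r[1], r[2])):
    print(f"{name}: {risk_exposure(p, loss):,.0f}")
```

The point of the conventional-process critique is that this ranking was effectively unknown until late in the cycle, when the expensive risks had already materialized.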

3. Requirements-driven functional decomposition

 Traditionally, software development processes have been requirements-driven.
 Developers assumed that the requirements specification was complete, clear,
necessary, feasible, and constant. This is rarely the case.
 All too often, too much time was spent treating all requirements equally rather
than focusing on the critical ones.
 Much time was spent documenting topics (traceability, testability, etc.) that were
later made obsolete as the driving requirements and the design understanding
evolved.

Another property of the conventional approach is that the requirements were
specified in a functional manner. The following figure shows requirements-driven
functional decomposition.

4. Adversarial stakeholder relationships

The conventional process tended to produce adversarial stakeholder relationships:

 Requirements and designs were misunderstood because documentation was
written in English prose full of business jargon.
 Paper transmission of requirements was the only method used.
 There was no real modeling: no universally agreed languages with common
notations, no GUIs, and few ready-made network components; most systems
were custom built.
 Reviews (management reviews and technical reviews alike) produced subjective
opinions rather than objective evidence.

The following sequence of events was typical for most contractual software:

1. The contractor prepared a draft contract-deliverable document that constituted
an intermediate artifact and delivered it to the customer for approval (usually
after interviews, questionnaires, and meetings).
2. The customer was expected to provide comments (typically within 15 to 30 days).
3. The contractor incorporated these comments and submitted (typically within
15 to 30 days) a final version for approval.

5. Focus on documents and review meetings

The conventional process focused on producing various documents that attempted to
describe the software product, with insufficient focus on producing tangible
increments of the product itself.

Results of conventional software design reviews:

1. A big briefing to a diverse audience.
Result: only a small percentage of the audience understands the software.
Briefings and documents expose few of the important assets and risks of complex
software.

2. A design that appears to be compliant.
There is no tangible evidence of compliance, and compliance with ambiguous
requirements is of little value.

3. Coverage of requirements (typically hundreds).
Only a few (tens) are in reality the real design drivers, but many are presented.
Dealing with all requirements dilutes the focus on the critical drivers.

4. A design considered “innocent until proven guilty.”
The design is always guilty: design flaws are exposed later in the life cycle.

1.2 Conventional Software Management Performance

Barry Boehm’s “industrial software metrics top-ten list” for software project
management performance:

1. Finding and fixing a software problem after delivery costs 100 times more than
finding and fixing the problem in the early design phases.

2. You can compress software development schedules by up to 25% of nominal, but no more.

3. For every $1 you spend on development, you will spend $2 on maintenance.

4. Software development and maintenance costs are primarily a function of the
number of source lines of code.

5. Variations among people account for the biggest differences in software productivity.

6. The overall ratio of software to hardware costs is still growing: in 1955 it was
15:85; in 1985 it was 85:15.

7. Only about 15% of software development effort is devoted to programming.

8. Software systems and products typically cost three times as much per SLOC as
individual software programs. Software-system products (that is, systems of
systems) cost nine times as much.

9. Walkthroughs catch 60% of the errors.

10. 80% of the contribution comes from 20% of the contributors.
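Item 1 can be made concrete: with a 100x cost multiplier, shifting even a handful of defects from post-delivery detection to early design detection changes the rework bill dramatically. A small sketch; the defect counts and the $1,000 unit cost are invented figures for illustration:

```python
# Relative cost to fix a defect by detection phase (item 1: delivery ~ 100x design).
FIX_COST_MULTIPLIER = {"design": 1, "delivery": 100}

def rework_cost(defects_by_phase, unit_cost=1_000):
    """Total rework cost, where unit_cost is the cost of one design-phase fix."""
    return sum(FIX_COST_MULTIPLIER[phase] * count * unit_cost
               for phase, count in defects_by_phase.items())

# Twenty defects total in both scenarios; only the detection phase differs.
late_heavy  = rework_cost({"design": 10, "delivery": 10})
early_heavy = rework_cost({"design": 19, "delivery": 1})
print(late_heavy, early_heavy)  # 1010000 119000
```

Catching just nine more defects early cuts rework cost by roughly a factor of eight in this toy scenario, which is the economic argument for early risk resolution.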

2.1 Evolution of Software Economics

Most software cost models can be abstracted into a function of five basic parameters:

1. Size  2. Process  3. Personnel  4. Environment  5. Required Quality

1. Size: the size of the end product, typically quantified in terms of the number of
source lines of code (SLOC) or the number of function points (FPs).

2. Process: the process used to produce the end product, in particular its ability to
avoid non-value-adding activities such as rework, bureaucratic delays, and
communication overhead.

3. Personnel: the capabilities of the software engineering personnel, and particularly
their experience with the computer science issues and the application domain
issues of the project.

4. Environment: the tools and techniques available to support efficient software
development and to automate the process.

5. Required Quality: the required quality of the product, including its features,
performance, reliability, and adaptability.

The relationship among these five parameters and the estimated cost can be written as:

Effort = (Personnel) x (Environment) x (Quality) x (Size ^ Process)
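Read as a model, the equation says that effort scales linearly with the personnel, environment, and quality parameters, and exponentially with size, where the process parameter is the exponent. A minimal sketch; all parameter values below are invented for illustration:

```python
def estimated_effort(personnel, environment, quality, size, process):
    """Effort = (Personnel) * (Environment) * (Quality) * (Size ** Process).
    Size is in KSLOC; the other parameters are dimensionless multipliers."""
    return personnel * environment * quality * (size ** process)

# Illustrative values only: a 100-KSLOC product under two process exponents.
immature = estimated_effort(1.0, 1.0, 1.0, size=100, process=1.2)
mature   = estimated_effort(1.0, 1.0, 1.0, size=100, process=1.05)
print(round(immature), round(mature))  # 251 126
```

The comparison shows why process improvement matters most on large projects: lowering the exponent from 1.2 to 1.05 halves the estimated effort at 100 KSLOC, and the gap widens as size grows.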

Three generations of software development

The following figure shows three generations of basic technology advancement in
tools, components, and processes.

The three generations of software development are defined as follows:

1. Conventional: in the 1960s and 1970s, organizations practiced craftsmanship,
using custom tools, custom processes, and virtually all custom components built in
primitive languages. Project performance was highly unpredictable, in that cost,
schedule, and quality objectives were almost always underachieved.

2. Transition: in the 1980s and 1990s, organizations practiced software engineering,
using more repeatable processes and off-the-shelf tools, with mostly (>70%) custom
components built in higher-level languages. Some components (about 30%) were
available as commercial products, including the operating system, DBMS,
networking, and graphical user interfaces.

3. Modern practices: in the 2000s and later, organizations practice software
production, using managed and measured processes, integrated automated
environments, and mostly (about 70%) off-the-shelf components, with only about
30% of the components needing to be custom built.

ROI IN DIFFERENT DOMAINS:

Organizations are achieving better economies of scale in successive technology eras,
with very large projects (systems of systems), long-lived products, and lines of
business comprising multiple similar projects. The following figure shows how a
return on investment (ROI) profile can be achieved across various domains.

2.2 PRAGMATIC SOFTWARE COST ESTIMATION

The predominant cost estimation debate: which model or tool?

 Which cost estimation model to use
 Whether to measure software size in source lines of code or function points

What constitutes a good estimate?

A good estimate has the following attributes:

 It is conceived and supported by the project manager, architecture team,
development team, and test team accountable for performing the work.
 It is accepted by all stakeholders as ambitious but realizable.
 It is based on a well-defined software cost model with a credible basis.
 It is based on a database of relevant project experience that includes similar
processes, technologies, environments, quality requirements, and people.
 It is defined in enough detail so that its key risk areas are understood and the
probability of success is objectively assessed.

Cost estimation models: COCOMO

 Developed by Barry Boehm, 1981
 Royce: a “huge benefit”
 A well-documented cost estimation model
 Three modes (organic, semidetached, embedded)
 A five-phase basic life cycle
 Evolution: COCOMO II

Basic COCOMO is good for rough order-of-magnitude estimates of software costs, but
its accuracy is necessarily limited because it lacks factors to account for differences in
hardware constraints, personnel quality and experience, use of modern tools and
techniques, and other project attributes known to have a significant influence on costs.
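For concreteness, Basic COCOMO estimates effort in person-months as a * (KLOC)^b, with coefficients per mode taken from Boehm's 1981 book. A minimal sketch; the 32-KLOC project size is an arbitrary example:

```python
# Basic COCOMO (Boehm, 1981): effort in person-months = a * KLOC ** b.
MODES = {
    "organic":      (2.4, 1.05),  # small teams, familiar problem domains
    "semidetached": (3.0, 1.12),  # intermediate size and experience
    "embedded":     (3.6, 1.20),  # tight hardware/operational constraints
}

def basic_cocomo_effort(kloc, mode):
    """Estimated effort in person-months for a project of the given size and mode."""
    a, b = MODES[mode]
    return a * kloc ** b

# The same 32-KLOC product costs very different amounts depending on its mode.
for mode in MODES:
    print(mode, round(basic_cocomo_effort(32, mode), 1))
```

Note how the mode changes both the multiplier and the exponent: the same 32-KLOC system is roughly 2.5 times as expensive in embedded mode as in organic mode, which is exactly the kind of project-attribute sensitivity the paragraph above says Basic COCOMO otherwise lacks.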

Measuring software size: source lines of code versus function points

Source Lines of Code

 Advocated by Barry Boehm
 Easy to automate and instrument
 A more ambiguous measure due to language advances and reusable components
 A more useful and precise measurement basis for various metrics perspectives

Function Points

 Advocated by Capers Jones
 Independent of technology
 A more accurate estimator in the early phases
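As a rough illustration of the function point approach, the unadjusted function point count is a weighted sum over five element types. The sketch below uses the standard average-complexity weights from function point analysis; the element counts for the sample application are invented:

```python
# Standard average-complexity weights from function point analysis.
FP_WEIGHTS = {
    "external_inputs": 4,
    "external_outputs": 5,
    "external_inquiries": 4,
    "internal_files": 10,
    "external_interfaces": 7,
}

def unadjusted_fp(counts):
    """Sum each element count times its complexity weight."""
    return sum(FP_WEIGHTS[kind] * n for kind, n in counts.items())

# Invented counts for a small business application.
counts = {
    "external_inputs": 10,
    "external_outputs": 8,
    "external_inquiries": 6,
    "internal_files": 5,
    "external_interfaces": 2,
}
print(unadjusted_fp(counts))  # 168
```

Because the count is taken from the external behavior of the system rather than from its source text, it can be produced from a requirements specification, which is why function points estimate better in the early phases than SLOC does.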

