
Software Estimation

Tutorial

Presented by Dongwon Kang

ⓒ KAIST SE LAB 2008


Contents
 Introduction
 Overview of the estimation process
 Sizing methods
 Source lines of code
 Function point
 Use case point
 Effort estimation methods
 Algorithmic models
 Expert judgment
 Analogy
 Best practices for estimation



Introduction (1/3)
 What is software estimation?
 Predicting the resources required for a software development
process
• Effort / cost / schedule
 Why do we estimate?
 No estimate, no plan and control
 Crucial to go/no-go decisions on a software project
 Essential to establish a budget for a software project
 …



Introduction (2/3)
 Difficulties in software estimation
 The average project exceeded its budget by 90% and its schedule by 120%
• Standish Group study of 8,380 projects (1994)
 55% of projects exceeded their budgets
• Study of 24 companies that developed large distributed systems (1994)



Introduction (3/3)
 Why is estimation so inaccurate?
 Uncertainty is unavoidable when predicting the future
• Requirements – often unstable and vague
• Design – preferences of architecture and design may vary
• Implementation – language and development environment
• Testing – degree of extensiveness
• Deployment - user acceptance
• Personnel - experience & expertise
• Technology - Multi-platform or not
 Finding all the factors affecting a project is almost impossible
 Most factors are hard to quantify
 We need systematic approaches to help reduce the inaccuracy of
estimates!



Overview of the estimation process (1/2)
 Total estimation cycle (figure)
• BEFORE: define project requirements (products, process) and the work breakdown structure; identify and evaluate risks using environmental and business factors; estimate cost and schedule based on staff skill and availability, producing the project estimate, budget, and project plan
• DURING: perform the planned activities; compare planned and actual values to obtain status and the estimate to complete; re-estimate cost and schedule when requirements, design, or the environment change
• AFTER: use the close-out report and the organization's historical data to calibrate estimation models, update procedures and checklists, and improve the process
Source: R. Stutzke, “Estimating Software-intensive Systems”
Overview of the estimation process (2/2)
 Technical view using various techniques (figure)
• Estimating project size: from the project characteristics, size is estimated as SLOC, Function Points, or Use Case Points
• Estimating effort: productivity and history data are applied to the size estimate using algorithmic models, analogy, expert judgment (drawing on experts), etc., yielding the estimated effort
• Estimation may proceed top-down or bottom-up


Software Sizing Methods
 Estimating Source Lines of Code
 Function Point
 Use Case Point



Estimating Source Lines of Code
 Characteristics
 Traditional method to measure software size
 Developer's view of the software
 How to estimate?
 Experience
 Previous / existing system size
 Breaking system into pieces (bottom up)
 Problems with SLOC estimation
 Hard to reflect the complexity of the software
 Hard to account for environmental factors (language, etc.)
“Measuring programming progress by
lines of code is like measuring aircraft
building progress by weight”
Bill Gates
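A minimal sketch of the bottom-up sizing idea mentioned above (breaking the system into pieces): per-component SLOC ranges are summed into a system-level range. The component names and figures are hypothetical, not taken from the tutorial.

```python
# Minimal sketch: bottom-up SLOC estimation by summing per-component ranges.
# Component names and figures are hypothetical examples.
components = {
    "user_interface": (2_000, 4_000),   # (low, high) SLOC guesses
    "business_logic": (5_000, 9_000),
    "data_access":    (1_500, 3_000),
}

low_total = sum(low for low, _ in components.values())
high_total = sum(high for _, high in components.values())

print(f"Estimated size: {low_total:,} - {high_total:,} SLOC")
```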
Function Point (1/4)
 Characteristics
 User (functional) view of software
• Calculated from system requirements
• Counts attributes of the planned system
– Inputs, outputs, inquiries, internal logical files, external interface files

(Figure: attributes counted across the external system boundary - external inputs, external outputs, inquiries, internal files, and external interface files)

 Standard is maintained by IFPUG


 In Korea, FP is used as a pricing standard for software development



Function Point (2/4)
 Attributes
 External Inputs (EI)
• Control or business information coming from outside the system boundary
– Ex) user input / sensor data
 External Outputs (EO)
• Data derived by calculation or by applying algorithms and sent outside
– Ex) graphics, reports
 External Inquiry (EQ)
• A process that retrieves data from an ILF or EIF, involving no calculation
 Internal Logical Files (ILF)
• Data that reside within the application boundary
– Ex) database, master file
 External Interface Files (EIF)
• Data that reside outside the application and are maintained by another
application, used for reference purposes only
– Ex) help messages, reference data
Function Point (3/4)
 Calculating unadjusted function point (UFP)
Low Avg. High
External Input __ x 3 __ x 4 __ x 6
External Output __ x 4 __ x 5 __ x 7
Internal Logical File __ x 7 __ x 10 __ x 15
External Interface File __ x 5 __ x 7 __ x 10
External Inquiry __ x 3 __ x 4 __ x 6
(Weights are given by the number of data types, number of fields, etc.)

UFP = Σ (i = 1..5) Σ (j = 1..3) w_ij × x_ij,  where x_ij is the count of attribute type i rated at complexity level j and w_ij is its weight

 Calculating adjusted function point (AFP)

AFP = UFP × VAF,  where VAF = 0.65 + 0.01 × Σ (i = 1..14) r_i

(VAF considers characteristics of the environment and the complexity of the product)
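To make the two formulas above concrete, here is a minimal sketch of the UFP/AFP calculation. The weights follow the low/average/high table above; the attribute counts and the 14 VAF ratings are made-up examples.

```python
# Minimal sketch of the FP calculation described above.
# Weights follow the low/average/high table; counts and ratings are hypothetical.
WEIGHTS = {                    # (low, average, high)
    "EI":  (3, 4, 6),
    "EO":  (4, 5, 7),
    "EQ":  (3, 4, 6),
    "ILF": (7, 10, 15),
    "EIF": (5, 7, 10),
}

def unadjusted_fp(counts):
    """counts: {attribute: (n_low, n_avg, n_high)} -> UFP = sum of w_ij * x_ij."""
    return sum(
        n * w
        for attr, ns in counts.items()
        for n, w in zip(ns, WEIGHTS[attr])
    )

def adjusted_fp(ufp, ratings):
    """ratings: the 14 general system characteristics, each rated 0..5."""
    vaf = 0.65 + 0.01 * sum(ratings)   # VAF = 0.65 + 0.01 * sum(r_i)
    return ufp * vaf

# Hypothetical example
counts = {"EI": (5, 2, 1), "EO": (3, 2, 0), "EQ": (4, 1, 0),
          "ILF": (2, 1, 0), "EIF": (1, 0, 0)}
ratings = [3, 2, 4, 1, 3, 5, 2, 3, 1, 0, 2, 3, 1, 2]   # one rating per VAF factor
ufp = unadjusted_fp(counts)
print(ufp, adjusted_fp(ufp, ratings))
```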



Function Point (4/4)
 Value adjustment factor (VAF)
• Each of the 14 factors is rated from 0 (no influence) to 5 (very influential)
1. Data communications
2. Distributed functions
3. Performance
4. Heavily used configuration
5. Transaction rate
6. Online data entry
7. End user efficiency
8. Online update
9. Complex processing
10. Reusability
11. Installation ease
12. Operational ease
13. Multiple sites
14. Facilitates change



Use Case Point (1/2)
 Characteristics
 Sizing based on use-cases
• Intuitive to stakeholders and project team
• Traceable for controlling projects
 Derived from function points
 Consider actors, use-cases, technical factors and environmental
factors



Use Case Point (2/2)
 Calculating Unadjusted Use Case Point (UUCP)
 UUCP = UAW + UUCW
• Unadjusted Actor Weights (UAW) = ∑(# actors × weight factor)
– Weight is decided by the complexity of communication with the actor
• Unadjusted Use Case Weights (UUCW) = ∑(# use cases × weight factor)
– Weight is decided by the number of transactions

 Calculating adjusted Use Case Point (UCP)
 UCP = UUCP × TCF × EF
• Technical Complexity Factor (TCF) = 0.6 + 0.01 × TFactor
– Derived from Function Points
• Environmental Factor (EF) = 1.4 + (-0.03 × EFactor)
– Based on interview data from Objectory
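A minimal sketch of the UCP calculation above. The simple/average/complex weight values follow the commonly cited UCP tables; the counts and factor ratings are assumptions for illustration.

```python
# Minimal sketch of the UCP calculation described above.
# Weight values for simple/average/complex actors and use cases follow the
# commonly cited UCP tables; the counts and factor ratings are hypothetical.
ACTOR_WEIGHTS = {"simple": 1, "average": 2, "complex": 3}
USE_CASE_WEIGHTS = {"simple": 5, "average": 10, "complex": 15}

def ucp(actor_counts, use_case_counts, t_factor, e_factor):
    uaw = sum(n * ACTOR_WEIGHTS[k] for k, n in actor_counts.items())
    uucw = sum(n * USE_CASE_WEIGHTS[k] for k, n in use_case_counts.items())
    uucp = uaw + uucw
    tcf = 0.6 + 0.01 * t_factor          # Technical Complexity Factor
    ef = 1.4 - 0.03 * e_factor           # Environmental Factor
    return uucp * tcf * ef

# Hypothetical example: 3 simple and 2 complex actors, 10 average use cases
print(ucp({"simple": 3, "average": 0, "complex": 2},
          {"simple": 0, "average": 10, "complex": 0},
          t_factor=30, e_factor=20))
```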



Comparison of sizing methods
 Viewpoint: SLOC - developer; Function Point - user (functionality); Use Case Point - user (functionality)
 Collecting phase: SLOC - early; Function Point - after basic requirements analysis; Use Case Point - after identifying use cases
 Understandability of concepts: SLOC - easy; Function Point - not intuitive and difficult to communicate; Use Case Point - relatively easy
 Ease of collecting data from previous projects: SLOC - easy to automate; Function Point - difficult to automate; Use Case Point - requires use-case documents
 Subjectivity of estimation: SLOC - high; Function Point - relatively objective, but still easy to bias according to the estimator; Use Case Point - subjective creation of use cases
 Expertise needed for estimation: SLOC - experience needed; Function Point - specialized training needed; Use Case Point - none
 Limitations: SLOC - unable to reflect complexity, dependent on languages, not suitable for web or IDE projects; Function Point - hard to reflect internal complexity, time consuming, hard to apply to corrective maintenance projects; Use Case Point - granularity of use cases may vary according to analysts


Estimation Method
 Techniques for estimation
 Algorithmic models
 Expert judgment
 Analogy
 Top-down and bottom-up estimation



Algorithmic models (1/3)
 Characteristics
 Use mathematical formulae for estimation
• Derived from analysis of historical data
• Regression is widely used for algorithmic models
 Calibration using local history data is strongly required
• Project environments vary according to the characteristics of each
organization
• Information about a new project's environment needs to be incorporated
into the model (see the calibration sketch below)
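Because the slide stresses regression and local calibration, here is a minimal sketch of calibrating the constants of an Effort = A × Size^B model to an organization's historical data by least squares on the log-transformed model. The data points are hypothetical.

```python
# Minimal sketch: calibrating Effort = A * Size^B to local historical data
# via least squares on the log-transformed model. Data points are hypothetical.
import numpy as np

size_kloc = np.array([12.0, 30.0, 55.0, 80.0, 120.0])     # past project sizes
effort_pm = np.array([35.0, 95.0, 180.0, 270.0, 430.0])   # past efforts (person-months)

# log(Effort) = log(A) + B * log(Size) -> ordinary linear regression
B, logA = np.polyfit(np.log(size_kloc), np.log(effort_pm), 1)
A = np.exp(logA)

def estimate_effort(size_kloc_new):
    return A * size_kloc_new ** B

print(f"A={A:.2f}, B={B:.2f}, estimate for 60 KLOC: {estimate_effort(60):.1f} PM")
```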



Algorithmic models (2/3)
 Example - COCOMO II
 Application composition model
• Used when software is prototyped or composed from existing parts
• Utilizing Object Point, similar to Function Point, and productivity
information
– Counting the number of screens, reports and 3GL modules
 Early design model
• Used when requirements are available but design has not yet started
 Post-architecture model
• Used once the system architecture has been designed and more
information about the system is available
 Reuse model
• Used to compute the effort of integrating reusable components



Algorithmic models (3/3)
 Example - COCOMO II (Cont’d)
 Early design model and post-architecture model utilize a formula of the
form “Effort = A × Size^B × M”
• For sizing, SLOC or FP is used
• A: organization-dependent constant
• B: scale factors
– Factors affecting the effort in an exponential manner
• M: effort multipliers
– Grouped into four categories: product factors, platform factors,
personnel factors, and project factors
– The early design model supports 7 multipliers, while the post-architecture
model supports 17, according to the amount of information available
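A minimal sketch of the "Effort = A × Size^B × M" form described above. The values of A, B, and the multipliers below are illustrative assumptions, not the calibrated COCOMO II constants.

```python
# Minimal sketch of the COCOMO-II-style effort equation Effort = A * Size^B * M.
# A, B, and the multiplier values are illustrative, not the calibrated constants.
from math import prod

def cocomo_style_effort(size_ksloc, A=2.9, B=1.10, multipliers=()):
    """Effort in person-months; M is the product of the effort multipliers."""
    M = prod(multipliers) if multipliers else 1.0
    return A * (size_ksloc ** B) * M

# Hypothetical example: 50 KSLOC, two cost drivers rated slightly above nominal
print(f"{cocomo_style_effort(50, multipliers=(1.10, 1.05)):.1f} person-months")
```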



Expert judgment
 Characteristics
 Utilize personal expertise and experience in estimation
 One of the most commonly used methods for estimation
• Requires no preparation such as data or models
 Example - Wideband Delphi
 1. Coordinator explains the project
 2. Coordinator calls a group meeting to discuss estimation issues
 3. Experts fill out estimation forms anonymously
 4. Coordinator distributes a summary of the estimates
 5. Coordinator calls a group meeting to discuss points where their
estimates varied widely
 6. Experts fill out forms again anonymously, and steps 4 to 6 are
repeated for as many rounds as appropriate



Analogy
 Characteristics
 Effort is estimated by comparing the project to a similar project in
the same application domain
 Estimation can be done either at the total project level or at a
subsystem level
 Process
 1. Selection of analogous projects
 2. Assessment of similarities and differences
 3. Adjustment of the estimate using the differences
 4. Consideration of any special cases
 5. Creation of the estimate (source: http://www.ecfc.u-net.com); a minimal sketch follows below
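As an illustration of steps 1-3, the sketch below selects the most similar past projects using a simple distance over a few normalized project characteristics and adjusts their efforts by relative size. The feature set, distance measure, and data are assumptions for illustration only, not part of the original method description.

```python
# Minimal sketch of analogy-based estimation: find the most similar past
# projects and adjust their effort by relative size. All data are hypothetical.
from math import sqrt

history = [  # past projects: size (FP), team experience (0-1), effort (person-months)
    {"name": "billing",   "size": 320, "experience": 0.7, "effort": 48},
    {"name": "inventory", "size": 450, "experience": 0.5, "effort": 80},
    {"name": "portal",    "size": 200, "experience": 0.9, "effort": 25},
]

def distance(a, b):
    # crude similarity over normalized size and team experience
    return sqrt(((a["size"] - b["size"]) / 500) ** 2 +
                (a["experience"] - b["experience"]) ** 2)

def estimate_by_analogy(new_project, k=2):
    nearest = sorted(history, key=lambda p: distance(p, new_project))[:k]
    # adjust each analogue's effort by the ratio of project sizes, then average
    adjusted = [p["effort"] * new_project["size"] / p["size"] for p in nearest]
    return sum(adjusted) / len(adjusted)

new_project = {"size": 380, "experience": 0.6}
print(f"{estimate_by_analogy(new_project):.1f} person-months")
```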



Comparison of estimation techniques
 Approach: Algorithmic models - building a statistical model; Expert judgment - depending on experts' opinion; Analogy - measuring similarity and adjusting
 Need for data: Algorithmic models - yes (for calibration); Expert judgment - no; Analogy - yes
 Need for experts: Algorithmic models - low; Expert judgment - high; Analogy - low
 Strong points: Algorithmic models - objective, repeatable, analyzable formula, suitable for sensitivity analysis; Expert judgment - relatively cheap estimation, accurate if experts have direct experience of similar systems; Analogy - based on actual experience on a project
 Weak points: Algorithmic models - weak against exceptional circumstances, calibrated to the past rather than the future; Expert judgment - no better than the participants, the result may be biased; Analogy - requires accurate details on many past projects, does not work if no similar project exists, hard to define the similarity between projects


Top-down and bottom-up estimation (1/4)
 Any of the above approaches may be used top-down or
bottom-up
 Top-down
 Starts at the system level and assesses the overall system
functionality and how it is delivered through sub-systems
 Does not consider details at first
• Usable without knowledge of the architecture and the components
• Can underestimate the cost of solving difficult low-level technical
problems



Top-down and bottom-up estimation (2/4)
 Bottom-up
 Starts at the component level and estimates the effort required for
each component
 Usable when the architecture of the system is known and
components identified
 Accurate if the system has been designed in detail
 May underestimate the costs of system level activities such as
integration and documentation



Top-down and bottom-up estimation (3/4)
 Process-based estimation
 Most commonly-used technique for project estimation
 Project is broken down into a relatively small set of tasks and the
effort required to accomplish each task is estimated
• A form of bottom-up estimation
 Process
• Begins with outlining software functions obtained from the project
scope
• Once functions and activities are identified, the planner estimates the
effort (person-months) required to accomplish each activity per
function
• Average labor rates are then applied to the estimated efforts
(i.e. cost per unit effort - may vary per task)
• Cost and effort for each function and activity (row and column totals)
are computed as the last step



Top-down and bottom-up estimation (4/4)
 Process-based estimation (Cont’d)
 Example – CAD software
Estimated effort (person-months) per function, by task (Anal. / Design / Code / Test); CC, Planning, Risk analysis, and CE are not broken down per function:
 UICF: 0.75 / 2.50 / 0.40 / 5.00 (total 8.65)
 3DGA: 0.50 / 4.00 / 0.60 / 2.00 (total 7.10)
 CGDF: 0.50 / 4.00 / 1.00 / 3.00 (total 8.50)
 DBM: 0.50 / 3.00 / 1.00 / 1.50 (total 6.00)
 PCF: 0.25 / 2.00 / 0.75 / 1.50 (total 4.50)
 DAM: 0.50 / 2.00 / 0.50 / 2.00 (total 5.00)
 Totals by activity: CC 0.25, Planning 0.25, Risk analysis 0.25, Analysis 3.00, Design 17.50, Code 4.25, Test 15.00 (grand total 34.80)
 % effort: CC 1%, Planning 1%, Risk analysis 1%, Analysis 7%, Design 45%, Code 12%, Test 40%

CC: customer communication  CE: customer evaluation
UICF: user interface and control facilities  3DGA: three-dimensional geometric analysis
CGDF: computer graphics display facilities  DBM: database management
PCF: peripheral control function  DAM: design analysis modules
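A minimal sketch of the process-based calculation described on the previous slide: task efforts are summed per function and an average labor rate is applied to obtain cost. The UICF efforts mirror the example row above; the labor rate is a hypothetical value.

```python
# Minimal sketch of process-based (bottom-up) estimation: sum the effort
# estimated for each task of each function, then apply an average labor rate.
# The labor rate is hypothetical; the UICF efforts mirror the example row above.
efforts = {  # function -> {task: person-months}
    "UICF": {"analysis": 0.75, "design": 2.50, "code": 0.40, "test": 5.00},
    # ... remaining functions would be listed the same way
}

LABOR_RATE = 8_000  # hypothetical cost per person-month

total_effort = sum(pm for tasks in efforts.values() for pm in tasks.values())
total_cost = total_effort * LABOR_RATE

print(f"Effort: {total_effort:.2f} person-months, cost: {total_cost:,.0f}")
```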
Best practices for estimation (1/2)
 Estimate early and often!
 Cost estimation should be done throughout the software life cycle
• Allow refinement according to changes
 Select experts with experience on similar projects
 Ask for justification of estimates
 Helps reduce mistakes and oversights in the estimation
 Gather and analyze historical data
 Must perform calibration when using algorithmic models
 Do a final assessment of cost estimation at the end of the
project
 Identify problems in estimation



Best practices for estimation (2/2)
 Avoid informal estimates made under pressure from
management
 Allow time for estimation and plan for it
 Use developer-based estimates
 Manager-based estimates tend to be optimistically biased and often fail
to reflect the realistic development environment
 Estimate at a low level of detail, if possible
 Estimates can be more accurate when details are considered



Q & A?
