PMSE Module IV
Software Project
A software project is the complete process of software development, from requirement gathering to testing and maintenance, carried out according to the chosen execution methodologies within a specified period of time to achieve the intended software product.
Need of software project management
Software is said to be an intangible product. Software development is a relatively new stream in world business, and there is very little experience in building software products. Most software products are tailor-made to fit the client's requirements. Most importantly, the underlying technology changes and advances so frequently and rapidly that the experience of one product may not be applicable to another. All such business and environmental constraints bring risk to software development, hence it is essential to manage software projects efficiently.
The triple constraint for software projects consists of time, cost and quality. It is an essential part of a software organization's job to deliver a quality product, keep the cost within the client's budget constraint, and deliver the project as per schedule.
There are several factors, both internal and external, which may impact this triple-constraint triangle. Any one of the three factors can severely impact the other two.
Therefore, software project management is essential to incorporate user
requirements along with budget and time constraints.
Managing People
Act as project leader
Liaison with stakeholders
Managing human resources
Setting up reporting hierarchy etc.
Managing Project
Defining and setting up project scope
Managing project management activities
Monitoring progress and performance
Risk analysis at every phase
Take necessary steps to avoid or recover from problems
Act as project spokesperson
Project Scheduling
Project scheduling refers to the roadmap of all activities to be done in a specified order and within the time slot allotted to each activity. Project managers define the various tasks and project milestones and arrange them keeping various factors in mind. They look for tasks that lie on the critical path in the schedule, which must be completed in a specific manner (because of task interdependency) and strictly within the time allocated. Tasks that lie outside the critical path are less likely to impact the overall schedule of the project.
For scheduling a project, it is necessary to carry out the following estimations -
1. Size estimation
The size of the software is estimated first, for example in lines of code (KLOC) or function points derived from the requirement specifications; the remaining estimates depend on it.
2. Effort estimation
The managers estimate effort in terms of personnel requirements and the man-hours required to produce the software. For effort estimation, the software size should be known. This can either be derived from the managers' experience or the organization's historical data, or the software size can be converted into effort by using standard formulae.
3. Time estimation
Once size and efforts are estimated, the time required to produce the software can be
estimated. The effort required is segregated into subcategories as per the requirement specifications and the interdependency of the various components of the software. Software tasks are divided into smaller tasks, activities or events by a Work Breakdown Structure (WBS). The tasks are scheduled on a day-to-day basis or in calendar months.
The sum of time required to complete all tasks in hours or days is the total time invested to
complete the project.
4. Cost estimation
This might be considered as the most difficult of all because it depends on
more elements than any of the previous ones. For estimating project cost,
it is required to consider -
Size of software
Software quality
Hardware
Additional software or tools, licenses etc.
Skilled personnel with task-specific skills
Travel involved
Communication
Training and support
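As a simple illustration of how the effort, time and cost estimates above fit together, the sketch below sums hypothetical WBS task efforts, converts the total into calendar time and adds the non-personnel cost elements. All task names, rates and figures are assumed placeholders, not values from any standard model.

```python
# Hypothetical WBS: each task carries an estimated effort in person-hours.
wbs_effort_hours = {
    "requirement analysis": 80,
    "design": 200,
    "coding": 400,
    "testing": 240,
    "documentation": 60,
}

HOURS_PER_DAY = 8             # working hours per day
DAYS_PER_MONTH = 22           # approximate working days per calendar month
TEAM_SIZE = 3                 # assumed number of developers working in parallel
RATE_PER_PERSON_MONTH = 6000  # assumed loaded cost per person-month

# Effort estimation: total person-hours and person-months.
total_hours = sum(wbs_effort_hours.values())
person_months = total_hours / (HOURS_PER_DAY * DAYS_PER_MONTH)

# Time estimation: rough calendar duration if the team works in parallel
# (ignores task interdependencies, which a real schedule must respect).
calendar_months = person_months / TEAM_SIZE

# Cost estimation: personnel cost plus the other elements listed above.
other_costs = {
    "hardware": 4000,
    "additional software, tools and licenses": 2500,
    "travel": 1500,
    "communication": 600,
    "training and support": 1200,
}
total_cost = person_months * RATE_PER_PERSON_MONTH + sum(other_costs.values())

print(f"Effort   : {total_hours} person-hours ({person_months:.1f} person-months)")
print(f"Duration : about {calendar_months:.1f} calendar months with {TEAM_SIZE} people")
print(f"Cost     : {total_cost:.0f} (personnel + other elements)")
```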
Project Estimation Techniques
We discussed various parameters involving project estimation such as size, effort, time and
cost. The project manager can estimate the listed factors using two broadly recognized techniques:
Decomposition Technique
This technique assumes the software to be a product of various compositions, which can be sized and estimated separately (for example, in lines of code or function points).

Empirical Estimation Technique
This technique uses empirically derived formulae, based on past project data, to make estimates. The Putnam model and COCOMO are two widely used models of this kind.

Putnam Model
The Putnam model relates size, effort and schedule through the software equation:

Size = Productivity × (Effort/B)^(1/3) × (Time)^(4/3)

where:
• Size is the product size (whatever size estimate is used by your organization is appropriate).
• B is a scaling factor and is a function of the project size.
• Productivity is the process productivity, the ability of a particular software organization to produce software of a given size at a particular defect rate.
• Effort is the total effort applied to the project in person-years.
• Time is the total schedule of the project in years.
An estimated software size at project completion and the organizational process productivity are used. Plotting effort as a
function of time yields the Time-Effort Curve. The points along the curve represent the estimated total effort to
complete the project at some time. One of the distinguishing features of the Putnam model is that total effort decreases
as the time to complete the project is extended.
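To illustrate the time-effort trade-off, the sketch below rearranges the software equation to compute effort for a range of schedules. The size, productivity and B values are arbitrary assumptions used only to show the shape of the curve, not calibrated parameters.

```python
# Putnam software equation: Size = Productivity * (Effort/B)**(1/3) * Time**(4/3)
# Rearranged for effort:    Effort = B * (Size / (Productivity * Time**(4/3)))**3

SIZE = 100_000           # estimated size at completion (e.g. SLOC) - assumed
PRODUCTIVITY = 10_000    # process productivity parameter - assumed
B = 0.39                 # scaling factor, a function of project size - assumed

def putnam_effort(time_years: float) -> float:
    """Total effort in person-years for a given schedule in years."""
    return B * (SIZE / (PRODUCTIVITY * time_years ** (4 / 3))) ** 3

for t in (1.0, 1.5, 2.0, 2.5, 3.0):
    print(f"schedule {t:.1f} years -> effort {putnam_effort(t):8.1f} person-years")
# Effort falls sharply as the schedule is extended, as the model predicts.
```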
COCOMO
COCOMO stands for COnstructive COst MOdel, developed by Barry W. Boehm. It is a model
for estimating effort, cost, and schedule for software projects. It classifies software projects into three categories: organic, semi-detached and embedded.
1. Basic COCOMO
Basic COCOMO computes software development effort (and cost) as a function of program size.
Program size is expressed in estimated thousands of source lines of code (SLOC). COCOMO applies to three classes of software projects:
1. Organic projects - "small" teams with "good" experience working with "less than rigid"
requirements
2. Semi-detached projects - "medium" teams with mixed experience working with a mix of
rigid and less than rigid requirements
3. Embedded projects - developed within a set of "tight" constraints (hardware, software, operational, ...). It is also a combination of organic and semi-detached project characteristics.
The basic COCOMO equations take the form:

Effort Applied (E) = a_b (KLOC)^(b_b) [person-months]
Development Time (D) = c_b (Effort Applied)^(d_b) [months]
People Required (P) = Effort Applied / Development Time [count]

where KLOC is the estimated number of delivered lines of code for the project (expressed in thousands). The coefficients a_b, b_b, c_b and d_b are given in the following table:

Software project    a_b    b_b    c_b    d_b
Organic             2.4    1.05   2.5    0.38
Semi-detached       3.0    1.12   2.5    0.35
Embedded            3.6    1.20   2.5    0.32
Basic COCOMO is good for a quick estimate of software costs. However, it does not account for differences in hardware constraints, personnel quality and experience, use of modern tools and techniques, and so on.
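A minimal sketch of the Basic COCOMO calculation using the standard coefficient table above; the 32 KLOC figure is just an illustrative input, not taken from any real project.

```python
# Basic COCOMO: E = a*(KLOC)**b (person-months), D = c*(E)**d (months)
COEFFICIENTS = {
    # project class : (a,   b,    c,   d)
    "organic":        (2.4, 1.05, 2.5, 0.38),
    "semi-detached":  (3.0, 1.12, 2.5, 0.35),
    "embedded":       (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(kloc: float, project_class: str):
    """Return (effort, development time, average staffing) for a project."""
    a, b, c, d = COEFFICIENTS[project_class]
    effort = a * kloc ** b        # person-months
    time = c * effort ** d        # months
    people = effort / time        # average staffing level
    return effort, time, people

effort, time, people = basic_cocomo(32, "organic")  # hypothetical 32 KLOC project
print(f"Effort: {effort:.1f} person-months, "
      f"Schedule: {time:.1f} months, Staff: {people:.1f} people")
```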
2. Intermediate COCOMO
Intermediate COCOMO computes software development effort as a function of program size and a set of "cost drivers" that include subjective assessments of product, hardware, personnel and project attributes. This extension considers a set of four "cost drivers", each with a number of subsidiary attributes:
• Product attributes
   • Required software reliability
   • Size of application database
   • Complexity of the product
• Hardware attributes
   • Run-time performance constraints
   • Memory constraints
   • Volatility of the virtual machine environment
   • Required turnaround time
• Personnel attributes
   • Analyst capability
   • Software engineering capability
   • Applications experience
   • Virtual machine experience
   • Programming language experience
• Project attributes
   • Use of software tools
   • Application of software engineering methods
   • Required development schedule
The Intermediate COCOMO formula takes the form:

E = a_i (KLOC)^(b_i) × EAF

where E is the effort applied in person-months, KLOC is the estimated number of thousands of delivered lines of code for the project, and EAF is the Effort Adjustment Factor calculated from the cost drivers above. The coefficient a_i and the exponent b_i are given in the following table:

Software project    a_i    b_i
Organic             3.2    1.05
Semi-detached       3.0    1.12
Embedded            2.8    1.20
The Development time D calculation uses E in the same way as in the Basic COCOMO.
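A short sketch of the Intermediate COCOMO form, in which the effort adjustment factor (EAF) is the product of the selected cost-driver multipliers. The multiplier values below are assumed ratings chosen for illustration; Boehm's published tables give the actual values for each rating level.

```python
# Intermediate COCOMO: E = a_i * (KLOC)**b_i * EAF
COEFFICIENTS = {
    "organic":       (3.2, 1.05),
    "semi-detached": (3.0, 1.12),
    "embedded":      (2.8, 1.20),
}

# Illustrative cost-driver multipliers (assumed ratings, not Boehm's table values).
cost_drivers = {
    "required software reliability": 1.15,  # rated high
    "product complexity":            1.15,  # rated high
    "analyst capability":            0.86,  # high capability lowers effort
    "use of software tools":         0.91,  # good tool support lowers effort
}

def intermediate_cocomo(kloc: float, project_class: str) -> float:
    """Effort in person-months, adjusted by the product of cost-driver multipliers."""
    a_i, b_i = COEFFICIENTS[project_class]
    eaf = 1.0
    for multiplier in cost_drivers.values():
        eaf *= multiplier
    return a_i * kloc ** b_i * eaf

print(f"Effort: {intermediate_cocomo(32, 'semi-detached'):.1f} person-months")
```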
3. Detailed COCOMO
Detailed COCOMO incorporates all characteristics of the intermediate version with an assessment of the cost drivers' impact on each step (analysis, design, etc.) of the software engineering process. In detailed COCOMO, the effort is calculated as a function of program size and a set of cost drivers given according to each phase of the software life cycle.
Risk Management
Risk management involves identifying, analyzing and planning for the predictable and unpredictable risks in the project. It typically proceeds through the following steps:
1. Identification - Make note of all possible risks which may occur in the project.
2. Categorize - Categorize known risks into high, medium and low risk intensity as per their possible
impact on the project.
3. Manage - Analyze the probability of occurrence of risks at various phases. Make plans to avoid or face the risks. Attempt to minimize their side effects.
4. Monitor - Closely monitor the potential risks and their early symptoms. Also monitor the effects of
steps taken to mitigate or avoid them.
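As a simple illustration of the identify-and-categorize steps above, the sketch below sorts hypothetical risks into high, medium and low intensity from assumed probability and impact scores; the risks and thresholds are invented for the example.

```python
# Hypothetical risk register: (risk, probability 0-1, impact 1-10).
risks = [
    ("key developer leaves mid-project", 0.2, 8),
    ("requirements change late",         0.5, 6),
    ("third-party API is deprecated",    0.1, 4),
    ("hardware delivery slips",          0.3, 7),
]

def categorize(probability: float, impact: int) -> str:
    """Classify risk intensity from a simple probability * impact score."""
    score = probability * impact
    if score >= 3.0:
        return "high"
    if score >= 1.5:
        return "medium"
    return "low"

for name, p, i in risks:
    print(f"{categorize(p, i):6s}  {name}  (score {p * i:.1f})")
```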
Quality management
This includes the procedures, tools and techniques that are used to ensure that the outputs and benefits meet customer requirements. Quality management has four components: quality planning, quality assurance, quality control and continual improvement.

The first component, quality planning, involves the preparation of a quality management plan that describes the processes and metrics that will be used. The quality management plan needs to be agreed with the relevant stakeholders to ensure that their expectations for quality are correctly identified. The processes described in the quality management plan should conform to the processes, culture and values of the host organization.

Quality assurance provides confidence to the host organization that its projects, programmes and portfolios are being well managed. It validates the consistent use of procedures and standards, and ensures that staff have the correct knowledge, skills and attitudes to fulfil their project roles and responsibilities in a competent manner. Quality assurance must be independent of the project, programme or portfolio to which it applies.

Quality control consists of inspection, testing and measurement to verify that the deliverables conform to specification, are fit for purpose and meet stakeholder expectations; it determines whether acceptance criteria have or have not been met. For this to be effective, specifications must be under strict configuration control. It is possible that, once agreed, the specification may need to be modified; commonly this is to accommodate change requests or issues while maintaining acceptable time and cost constraints. Any consequent changes to acceptance criteria should be approved and communicated.

The last component, continual improvement, is the generic term used by organisations to describe how information provided by quality assurance and quality control processes is used to drive improvements in efficiency and effectiveness. A maturity model provides a framework against which continual improvement can be initiated and embedded in the organisation.
Configuration Management
Configuration management is a process of tracking and controlling the changes
in software in terms of the requirements, design, functions and development of
the product.
IEEE defines it as “the process of identifying and defining the items in the
system, controlling the change of these items throughout their life cycle,
recording and reporting the status of items and change requests, and
verifying the completeness and correctness of items”.
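The IEEE definition above can be pictured as a small status-accounting record kept per configuration item. The sketch below is a toy model with invented item names, states and methods, not a real configuration management tool.

```python
from dataclasses import dataclass, field

@dataclass
class ConfigurationItem:
    """Minimal record for identifying an item and tracking its change history."""
    name: str
    version: str = "1.0"
    status: str = "baselined"          # e.g. baselined, under change
    history: list = field(default_factory=list)

    def request_change(self, description: str) -> None:
        # Record and report a change request against the item.
        self.status = "under change"
        self.history.append(("change requested", description))

    def approve_change(self, new_version: str) -> None:
        # Verify and accept the change, establishing a new baseline version.
        self.version = new_version
        self.status = "baselined"
        self.history.append(("change approved", new_version))

srs = ConfigurationItem("software requirements specification")
srs.request_change("add audit-logging requirement")
srs.approve_change("1.1")
print(srs.name, srs.version, srs.status, srs.history)
```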
PERT Chart
A PERT (Program Evaluation and Review Technique) chart depicts the project as a network diagram. Events are shown as numbered nodes. They are connected by labeled arrows depicting the sequence of tasks in the project.
Resource Histogram
This is a graphical tool that contains bars representing the number of resources (usually skilled staff) required over time for a project event (or phase). The resource histogram is an effective tool for staff planning and coordination.
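A resource histogram can be produced directly from the planned staffing per period. The sketch below prints a simple text histogram from assumed monthly staffing numbers; the months and headcounts are placeholders.

```python
# Assumed number of skilled staff required in each month of the project.
staff_per_month = {
    "Jan": 2, "Feb": 4, "Mar": 6, "Apr": 6, "May": 4, "Jun": 2,
}

# Print a simple text-based resource histogram: one '#' per person.
for month, staff in staff_per_month.items():
    print(f"{month}  {'#' * staff}  ({staff})")
```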
Critical Path Analysis (CPA)
This tool is useful in recognizing interdependent tasks in the project. It also helps to find the critical path, the longest chain of dependent tasks, which determines the minimum time needed to complete the project successfully. Like a PERT diagram, each event is allotted a specific time frame. This tool shows dependencies between events, assuming an event can proceed to the next only if the previous one is completed.
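A minimal critical-path sketch over a hypothetical task network: each task has a duration and a list of predecessors, and the critical path is the longest chain of dependent tasks, which fixes the minimum project duration. Task names and durations are invented for the example.

```python
from functools import lru_cache

# Hypothetical task network: task -> (duration in days, predecessors).
tasks = {
    "requirements": (5,  []),
    "design":       (10, ["requirements"]),
    "coding":       (15, ["design"]),
    "test plan":    (4,  ["requirements"]),
    "testing":      (8,  ["coding", "test plan"]),
    "deployment":   (2,  ["testing"]),
}

@lru_cache(maxsize=None)
def longest_finish(task: str) -> int:
    """Earliest finish time of `task` along its longest (critical) chain."""
    duration, preds = tasks[task]
    return duration + max((longest_finish(p) for p in preds), default=0)

project_duration = max(longest_finish(t) for t in tasks)
print(f"Minimum project duration (critical path length): {project_duration} days")

# Recover the critical path by walking back from the task that finishes last.
path = []
current = max(tasks, key=longest_finish)
while current:
    path.append(current)
    preds = tasks[current][1]
    current = max(preds, key=longest_finish) if preds else None
print("Critical path:", " -> ".join(reversed(path)))
```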
Life cycle
The life cycle of the framework explains the stages involved in the project and what needs to happen at each stage. It allows the management team to make adjustments and customize the stages based on the size and scope of the project.

For example, a company that is completing a project designing children's learning materials may have a life cycle as follows:

First, the company initiates the project. The company may go through numerous upfront procedures to decide whether or not to pursue the business opportunity, make it available for bids and finalize contracts.

Second, the company plans the project. The company defines in detail the project deliverables, schedule, detailed budget, etc.

Third, the company executes the project by proceeding with the project tasks, such as creating the learning materials. It manages the project according to plan by using the control cycle to monitor progress and adjust the plan if required.

Finally, when the materials are completed, the company terminates the project and follows a closure procedure.
Capability Maturity Model Integration (CMMI)
CMMI provides:
• Guidelines for process improvement
• An integrated approach to process improvement
• Embedding process improvements into a state of business as usual
• A phased approach to introducing improvements
CMMI Maturity Levels
A maturity level is a well-defined evolutionary plateau toward
achieving a mature software process. Each maturity level
provides a layer in the foundation for continuous process
improvement.
In CMMI models with a staged representation, there are five
maturity levels designated by the numbers 1 through 5
1. Initial
2. Managed
3. Defined
4. Quantitatively Managed
5. Optimizing
CMMI Staged Representation - Maturity Levels
Maturity levels consist of a predefined set of process areas. The maturity
levels are measured by the achievement of the specific and generic goals that
apply to each predefined set of process areas. The following sections
describe the characteristics of each maturity level in detail.
Maturity Level 1 - Initial
At maturity level 1, processes are usually ad hoc and chaotic. The organization usually
does not provide a stable environment. Success in these organizations depends on the
competence and heroics of the people in the organization and not on the use of proven
processes.
Maturity level 1 organizations often produce products and services that work; however,
they frequently exceed the budget and schedule of their projects.
Maturity Level 2 - Managed
At maturity level 2, an organization has achieved all the specific and generic goals of the
maturity level 2 process areas. In other words, the projects of the organization have
ensured that requirements are managed and that processes are planned, performed,
measured, and controlled.
At maturity level 2, requirements, processes, work products, and services are managed.
The status of the work products and the delivery of services are visible to management
at defined points.
Maturity Level 3 - Defined
At maturity level 3, an organization has achieved all the specific and
generic goals of the process areas assigned to maturity levels 2 and 3.
At maturity level 3, processes are well characterized and understood,
and are described in standards, procedures, tools, and methods.
A critical distinction between maturity level 2 and maturity level 3 is
the scope of standards, process descriptions, and procedures. At
maturity level 2, the standards, process descriptions, and procedures
may be quite different in each specific instance of the process (for
example, on a particular project). At maturity level 3, the standards,
process descriptions, and procedures for a project are tailored from the
organization's set of standard processes to suit a particular project or
organizational unit.
Maturity Level 4 - Quantitatively Managed
At maturity level 4, an organization has achieved all the specific goals of the
process areas assigned to maturity levels 2, 3, and 4 and the generic goals
assigned to maturity levels 2 and 3.
At maturity level 4, subprocesses are selected that contribute significantly to overall process performance. These selected subprocesses are controlled using statistical and other quantitative techniques.
Maturity Level 5 - Optimizing
At maturity level 5, an organization has achieved all the specific goals of the
process areas assigned to maturity levels 2, 3, 4, and 5 and the generic goals
assigned to maturity levels 2 and 3.
Processes are continually improved based on a quantitative understanding of
the common causes of variation inherent in processes.
Maturity level 5 focuses on continually improving process performance
through both incremental and innovative technological improvements.
CMMI Models
CMMI consists of three overlapping disciplines (constellations), providing specific focus on the Development, Services and Acquisition domains respectively:
• CMMI for Development (CMMI-DEV) – Product and service
development
• CMMI for Services (CMMI-SVC) – Service establishment,
management, and delivery
• CMMI for Acquisition (CMMI-ACQ) – Product and service
acquisition
CMMI Benefits
CMMI-based process improvement benefits include
• improved schedule and budget predictability
• improved cycle time
• increased productivity
• improved quality (as measured by defects)
• increased customer satisfaction
• improved employee morale
• increased return on investment
• decreased cost of quality
Computer Aided Software Engineering (CASE)
Computer-Aided Software Engineering (CASE) technologies are tools that
provide automated assistance for software development. The goal of
introducing CASE tools is the reduction of the time and cost of software
development and the enhancement of the quality of the systems developed.
CASE can also help as a warehouse for documents related to projects, like business
plans, requirements and design specifications. One of the major advantages of using
CASE is the delivery of the final product, which is more likely to meet real-world
requirements as it ensures that customers remain part of the process.
CASE refers to a wide set of labor-saving tools that are used in software development. It provides a framework for organizing projects and is intended to help enhance productivity. There was more interest in the concept of CASE tools years ago than there is today, as the tools have morphed into different functions, often in reaction to software developers' needs. The concept of CASE also received a heavy dose of criticism after its release.
CASE Tools:
The essential idea of CASE tools is that in-built programs can
help to analyze developing systems in order to enhance quality
and provide better outcomes. Throughout the 1990s, CASE tools became part of the software lexicon, and big companies like IBM were using these kinds of tools to help create software.
Central Repository - CASE tools require a central repository, which can serve as a
source of common, integrated and consistent information. Central repository is a central
place of storage where product specifications, requirement documents, related reports
and diagrams, and other useful information regarding management is stored. The central repository also serves as a data dictionary.
Upper Case Tools - Upper CASE tools are used in planning, analysis and
design stages of SDLC.
Lower Case Tools - Lower CASE tools are used in implementation, testing
and maintenance.
Integrated Case Tools - Integrated CASE tools are helpful in all the stages of
SDLC, from Requirement gathering to Testing and documentation.
1. Diagramming Tools:
It helps in diagrammatic and graphical representations of the data and
system processes. It represents system elements, control flow and data flow
among different software components and system structure in a pictorial
form.
For example, Flow Chart Maker tool for making state-of-the-art flowcharts.
4. Central Repository:
It provides the single point of storage for data diagrams, reports and documents related
to project management.
5. Documentation Generators:
It helps in generating user and technical documentation as per standards. It creates
documents for technical users and end users.
For example, Doxygen, DrExplain, Adobe RoboHelp for documentation.
6. Code Generators:
It aids in the auto generation of code, including definitions, with the help of
the designs, documents and diagrams.
Advantages of the CASE approach:
Several CASE tools are available. Some of these CASE tools assist in phase-related tasks such as specification, structured analysis, design, coding and testing, while others are related to non-phase activities such as project management and configuration management. The main purpose of introducing CASE tools is to increase productivity and to help produce better-quality software at lower cost.
CASE environment:
Although individual CASE tools are helpful, the true power of a toolset can be realized only when the set of tools is integrated into a common framework or environment. CASE tools are characterized by the stage or stages of the software development life cycle on which they focus. Since different tools covering different stages share common information, they need to integrate through some central repository to have a consistent view of the information associated with the software development artifacts. This central repository is usually a data dictionary containing the definitions of all composite and elementary data items.

Through the central repository, all the CASE tools in a CASE environment share common information among themselves. A CASE environment thus facilitates the automation of the step-wise methodologies for software development. A schematic representation of a CASE environment is shown in the diagram below.

In contrast to a CASE environment, a programming environment is an integrated collection of tools that supports only the coding phase of software development.
Benefits of CASE
Several benefits accrue from the use of a CASE environment or even isolated CASE tools. Some of these benefits are:
- A key benefit arising out of the use of a CASE environment is cost saving through all development phases. Different studies performed to measure the impact of CASE place the effort reduction between 30% and 40%.
- The use of CASE tools leads to considerable improvements in quality. This is primarily due to the fact that one can effortlessly iterate through the various phases of software development, and the chances of human error are considerably reduced.
- CASE tools help produce high-quality and consistent documents. Since the important information relating to a software product is maintained in a central repository, redundancy in the stored information is reduced, and therefore the chances of inconsistent documentation are reduced to a good extent.
- CASE tools take away much of the drudgery in a software engineer's work. For example, engineers need not check the balancing of the DFDs meticulously; they can do it effortlessly through the press of a button.
- CASE tools have led to revolutionary cost savings in software maintenance efforts. This arises not only due to the tremendous value of a CASE environment in traceability and consistency checks, but also due to the systematic information capture throughout the various phases of software development as a result of adhering to a CASE environment.
The introduction of a CASE environment has an impact on the style of working of an organization and makes it oriented towards a structured and orderly approach.