The Systems Life Cycle by Mehy
Stages of the Systems Life Cycle:
1. Analysis
2. Design
3. Development & Testing
4. Implementation
5. Documentation
6. Evaluation
Fact-finding techniques:
● Observation:
○ involves watching users interact with the system to figure out its workings
○ it is used for understanding how users interact with the current system
○ Advantages: provides first-hand, unbiased information
○ Disadvantages: can be time-consuming, may not reveal all issues
● Interviews:
○ direct one-to-one conversations with users on their experience with the
current system
○ used to gather comprehensive information about individual users
○ Advantages: allows for in-depth exploration of issues
○ Disadvantages: relatively expensive, time-consuming, no user-anonymity
which may affect the response
● Questionnaires:
○ set of predetermined questions is given to the users to complete and give
their opinion on the current system
○ it is mainly used to collect data from a larger group of people
○ Advantages: allows for quantitative analysis, efficient data collection,
questions can be answered quickly
○ Disadvantages: limited by predetermined questions, may suffer from low
response rates, users may exaggerate answers due to anonymity
● Examination of existing documents:
○ reviewing system documentation, user guides, or reports
○ understanding the current system's design and any known issues
○ Advantages: provides insights into the system's history, can reveal
previously unknown issues
○ Disadvantages: documents may be outdated or incomplete; reviewing them
can be time-consuming and relatively expensive
● Data that is input into, processed by, and output from the system is identified.
● Problems with the current system are identified. What could be improved?
● The requirements of the user and the potential new system are identified. What is
the new system meant to do?
● Problems: issues that users face with the current system
● User requirements: what needs to be added to the new system
● Information requirements: data or information the new system must process
New System Requirements Specification:
● Once the systems analysts have completed the analysis stage of the systems life
cycle, they should be fully aware of the limitations of the current system.
● The next step will be to design a new system (normally computer-based) to resolve
the problems identified during analysis.
● A Requirements Specification will be created which will outline the required
features of the new system.
System specification
Hardware and Software Selection
● It is vital to identify the suitable hardware needed for the new system
○ considering system requirements, compatibility, and costs
○ justifying choices based on user needs and system performance
● Hardware that needs to be considered:
○ barcode readers,
○ scanners,
○ touch screens,
○ 3D printers,
○ monitors,
○ speakers.
● Identifying suitable software needed for the new system
○ considering functionality, compatibility, and ease of use
○ justifying choices based on user requirements and system efficiency
● Software that needs to be considered:
○ operating system,
○ applications software,
○ size of storage,
○ type of storage.
Design
Once the analysis has taken place and the systems analyst has some idea of the scale
of the problem and what needs to be done, the next stage is to design the key parts of
the new system.
File/Data Structures
● Data capture forms: designed to collect data from users in a structured format,
they come in two types: paper-based and electronic-based. Paper-based
data-capturing forms need to be carefully designed with headings, concise
instructions, character and information fields, checkboxes, and enough writing
space. Text boxes, on-screen help, drop-down menus, radio buttons, automatic
validation, and control buttons for data entry are all features of computer-based
forms.
○ Consider a user-friendly layout, clear instructions, and appropriate data fields
Output Formats
Validation Routines
● Validation checks that data entered meets specific requirements. It is a routine
check that the computer does as part of its programming.
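As an illustration, a validation routine can be sketched in Python. The `validate_age` function and its field are hypothetical examples, not part of any particular system; they show three common automatic checks (presence, type, and range):

```python
def validate_age(value: str) -> bool:
    """Apply the automatic checks a system might run on an 'age' field."""
    if value.strip() == "":          # presence check: field must not be empty
        return False
    if not value.isdigit():          # type check: must be a whole number
        return False
    return 0 <= int(value) <= 120    # range check: must be a sensible age

print(validate_age("25"))   # True  - passes all checks
print(validate_age(""))     # False - fails the presence check
print(validate_age("abc"))  # False - fails the type check
print(validate_age("200"))  # False - fails the range check
```

The same pattern extends to other checks mentioned in test designs, such as format checks or check digits.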
Testing
● Checks that the system works as intended before it is put into use.
● Identifies and removes errors, thus improving system reliability and
performance.
Test designs
● Test data structures, file structures, input formats, output formats, and validation
routines
● Ensure all components function correctly and interact seamlessly
Test strategies
● Test each module: verify individual components function as intended
● Test each function: ensure all features work correctly
● Test the whole system: confirm overall system performance and integration
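The three strategies above can be sketched with Python's `unittest` module. The booking-fee functions below are invented purely for illustration; the point is that each function is tested in isolation first, then the functions are tested working together:

```python
import unittest

# Hypothetical module under test: a simple booking-fee calculator.
def base_fee(nights):
    if nights < 1:
        raise ValueError("nights must be at least 1")
    return nights * 50

def discount(nights):
    return 0.10 if nights >= 7 else 0.0

def total_fee(nights):
    # Combines the individual functions, like the whole system working together.
    return base_fee(nights) * (1 - discount(nights))

class TestEachFunction(unittest.TestCase):
    # "Test each module/function": verify components in isolation
    def test_base_fee(self):
        self.assertEqual(base_fee(2), 100)

    def test_discount(self):
        self.assertEqual(discount(7), 0.10)

class TestWholeSystem(unittest.TestCase):
    # "Test the whole system": confirm the parts integrate correctly
    def test_total_fee_with_discount(self):
        self.assertEqual(total_fee(10), 450.0)
```

Saved as a file, the tests would be run with `python -m unittest <filename>`.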
Test plan
The following data types will be explained using the example of months in a year.
● Normal data: valid and expected data values within the range of acceptability,
with an expected outcome. E.g. any whole number from 1 to 12.
● Abnormal data: invalid or unexpected data values. This can either be:
○ Data outside the range of acceptability or
○ Data that is the wrong data type
○ In this case, examples could be…
■ any value less than 1 (e.g. 0, -6)
■ any value greater than 12 (e.g. 13, 15)
■ letters or non-numeric data (e.g. "July")
■ non-integral values (e.g. 3.5, 4.2)
● Extreme data: values at the limits of acceptability (E.g. 1 or 12)
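The months example above can be sketched as a small test plan in Python, assuming a hypothetical `is_valid_month` routine:

```python
def is_valid_month(value):
    # Accept only whole numbers from 1 to 12.
    return isinstance(value, int) and 1 <= value <= 12

# Normal data: valid values within the range of acceptability
assert is_valid_month(6)

# Extreme data: values at the limits of acceptability
assert is_valid_month(1) and is_valid_month(12)

# Abnormal data: values the system must reject
assert not is_valid_month(0)        # below the range
assert not is_valid_month(13)       # above the range
assert not is_valid_month("July")   # wrong data type (text)
assert not is_valid_month(3.5)      # non-integral value
```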
What is live data?
● Live data: real data from the existing system, with known outcomes, used to test
the new system under realistic conditions once testing with normal, abnormal, and
extreme data is complete.
System Implementation
The system must then be fully implemented after it has been thoroughly tested.
We will now think more carefully about switching to the new system. Four popular
techniques are utilized to transition from the old system to the new one.
Before selecting the approach best suited for a given application, the pros and cons of
each technique must be considered.
1. Parallel running:
● Both current and new systems run simultaneously for a period before the old
system is phased out
● Used when a smooth transition with minimal risk is required
● Advantages
○ Lower risk, as the old system acts as a backup if the new one fails
○ easy system comparison
● Disadvantages
○ Time-consuming
○ resource-intensive
2. Direct changeover:
● The old system is stopped and the new system takes over immediately
● Used when the system needs to be changed quickly
● Advantages
○ Fast and cheap, since only one system runs at a time
● Disadvantages
○ High risk, as there is no fallback if the new system fails
3. Pilot running:
● The new system is trialled in one branch or department before being rolled out
across the whole organisation
● Advantages
○ Any failure is contained to the pilot site
● Disadvantages
○ Slower to roll out fully
4. Phased implementation:
● The new system is introduced one part at a time, gradually replacing the old
system
● Advantages
○ Any failure affects only the part being introduced
● Disadvantages
○ Slower and more complex to manage than direct changeover
Documentation
In the life cycle of a system, documentation enables the correct recording of design,
development, and testing information.
User documentation
Instructions and guidance for end-users on how to operate the system, used to help
them learn the system and resolve problems on their own. It typically includes:
● Purpose of the system: Explanation of the system's intended function and goals
● Limitations: Known constraints or issues with the system
● Hardware & software requirements: Necessary equipment and software to run
the system
● Loading/running/installing software: Instructions for setting up the system on
user devices
● Saving files: Procedures for storing data within the system
● Printing data: Steps to produce hard copies of system data
● Adding records: Instructions for creating new entries in the system
● Deleting/editing records: Guidelines for modifying or removing existing entries
in the system
● Input format: Structure and format for entering data into the system
● Output format: Structure and format for presenting data generated by the
system
● Sample runs: Examples of system operation, including input and expected
output
● Error messages: Explanations of system warnings and error notifications
● Error handling: Steps to resolve issues and errors within the system
● Troubleshooting guide/helpline: Assistance for diagnosing and addressing
common problems
● Frequently asked questions: Answers to common user inquiries
● Glossary of terms: Definitions of key terms and concepts related to the system
Evaluate a solution
It measures the productivity, efficiency, and compliance of a system with its goals in
order to identify its strengths, shortcomings, and potential development areas. This
assessment informs decision-making and improves overall performance over the course
of the system's life.
● Look at the solution's usability and accessibility for the target users. Check
that the system is simple to understand and use, and that users have no trouble
completing their tasks.
○ Describe the user interface and how it facilitates interaction with the
system
○ Mention any feedback from users regarding their experience with the
system, and address any issues they encountered
● Questions to ask:
○ Are all the users able to use the system and make bookings easily?
○ Are all the users able to change and cancel bookings easily?
○ Can all staff understand how to use the system with minimal training?
Determine the suitability of the solution:
● Examine how well the implemented solution satisfies the desired outcome by
contrasting it with the original task criteria.
○ Outline the initial objectives of the system and discuss how the solution
addresses each one
○ Highlight any requirements that may not have been fully met and discuss
possible reasons for this
● Questions to ask:
○ Is the system suitable for each of the departments?
○ Does it meet the needs of the customers?
○ Does it meet the needs of the staff?
○ Does the solution match the original requirements?
Collect and examine user feedback:
● Collect users' responses to the results of testing the system. Their feedback can
provide insights into potential issues and improvements, and help determine
overall user satisfaction
○ Summarise the testing process, including test data, expected outcomes,
and actual outcomes
○ Discuss users' reactions to the system, addressing any concerns or
suggestions they may have
Identify limitations and suggest necessary improvements: