
MODULE 3: INFORMATION SYSTEMS ACQUISITION,

DEVELOPMENT AND IMPLEMENTATION


PART B: INFORMATION SYSTEMS IMPLEMENTATION
INTRODUCTION
 Implementation is the stage at which the system is installed and moved into the production environment after appropriate system and user acceptance testing. At this stage:
 End users are notified
 Data entry or conversions occur
 Training takes place
 Post-implementation reviews occur
TESTING METHODOLOGIES
 Proper selection of testing methodologies is integral to IS implementation.
 Testing provides confidence to stakeholders that a system or system component operates as intended and delivers the benefits realization required at the start of the project.
 An IS auditor should understand how the various forms of testing are applied, and how QA monitoring and evaluation contribute to the quality of an organization's internal processes.
 An IS auditor plays a preventive or detective role in the testing process.
TESTING METHODOLOGIES
Testing Classifications
Unit testing - testing of an individual program or module (illustrated in the sketch after this list).
Interface or integration testing - evaluation of the connection between two or more components that pass information from one area to another.
System testing - a series of tests designed to ensure that the modified programs, objects, etc. that constitute a system function properly.
Final acceptance testing - performed after staff are satisfied with the system tests; it occurs during the implementation phase and comprises:
i. Quality Assurance Testing (QAT) - focuses on technical aspects
ii. User Acceptance Testing (UAT) - focuses on functional aspects
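As a simple illustration of unit testing (referenced in the first item above), the Python sketch below exercises a single module in isolation, with no other system components involved. The module, its behaviour and the test values are hypothetical, not part of the CISA material.

```python
# Hypothetical module under test: a routine that computes an invoice total.
def invoice_total(line_items, tax_rate):
    """Return the total for a list of (quantity, unit_price) tuples, tax included."""
    subtotal = sum(qty * price for qty, price in line_items)
    return round(subtotal * (1 + tax_rate), 2)


# Unit test: exercises only this module, in isolation from the rest of the system.
import unittest

class InvoiceTotalTest(unittest.TestCase):
    def test_total_with_tax(self):
        self.assertEqual(invoice_total([(2, 10.00), (1, 5.50)], 0.10), 28.05)

    def test_empty_invoice(self):
        self.assertEqual(invoice_total([], 0.10), 0.0)

if __name__ == "__main__":
    unittest.main()
```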
TESTING METHODOLOGIES
Other types of Testing:
Alpha and beta testing - alpha testing is conducted by users within the organization, while beta testing is performed by a limited number of external users.
Pilot testing - a preliminary test that focuses on specific and predetermined aspects of a system.
White box testing - assesses the effectiveness of software program logic.
Black box testing - an integrity-based form of testing associated with testing components of an information system's functional operating effectiveness.
Function/validation testing - similar to system testing but often used to test the functionality of the system against the detailed requirements.
Regression testing - rerunning a portion of a test scenario or test plan to ensure that changes or corrections have not introduced new errors.
Parallel testing - the process of feeding test data into two systems and comparing the results (see the sketch after this list).
Sociability testing - tests to confirm that the new system can operate in the target environment without adversely impacting existing systems.
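As a minimal sketch of parallel testing (referenced above), the same test data is fed into a legacy routine and its replacement, and the outputs are compared. Both routines and the test values are hypothetical placeholders.

```python
# Hypothetical legacy and replacement implementations of the same calculation.
def legacy_payroll(hours, rate):
    return round(hours * rate, 2)

def new_payroll(hours, rate):
    return round(hours * rate, 2)

def parallel_test(test_cases):
    """Feed identical test data into both systems and report any mismatches."""
    mismatches = []
    for hours, rate in test_cases:
        old_result = legacy_payroll(hours, rate)
        new_result = new_payroll(hours, rate)
        if old_result != new_result:
            mismatches.append((hours, rate, old_result, new_result))
    return mismatches

test_data = [(40, 12.50), (37.5, 20.00), (0, 15.00)]
print(parallel_test(test_data) or "all results match")
```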
SOFTWARE TESTING
 Test plans identify specific portions of the system to be tested and may include a
categorization of types of deficiencies that can be found during the test.
 Test plans also identify test approaches, such as:
1. Bottom up - testing begins with atomic units, such as programs or modules, and works upward until complete system testing has taken place.
 Advantages:
 there is no need for stubs or drivers
 testing can be started before all programs are complete
 errors in critical modules are found early
2. Top down - testing follows the opposite path, in either depth-first or breadth-first search order (a stub-based sketch follows this list).
 Advantages:
 tests of major functions and processing are conducted early
 interface errors can be detected sooner
 confidence in the system is increased because programmers and users actually see a working system
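The stubs mentioned above can be illustrated with a short, hypothetical Python sketch: in top-down testing, a lower-level module that is not yet written is replaced by a stub returning predictable values, so the higher-level module can be exercised early. All names and thresholds here are assumptions for illustration only.

```python
# Stub for a lower-level module that has not been implemented yet.
def credit_score_stub(customer_id):
    return 700  # fixed, predictable value so the caller can be tested

# High-level module under test; it depends on the lower-level scoring module.
def approve_loan(customer_id, amount, score_fn=credit_score_stub):
    score = score_fn(customer_id)
    return score >= 650 and amount <= 50_000

# Driver-style checks of the high-level logic, using the stub in place of the real module.
assert approve_loan("C001", 10_000) is True
assert approve_loan("C001", 90_000) is False
print("top-down tests passed using the stub")
```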
DATA INTEGRITY TESTING
 This is a set of substantive tests that examines the accuracy, completeness, consistency and authorization of data in the system.
 Two common types of data integrity tests are relational and referential integrity tests:
 Relational integrity tests
 Referential integrity tests
 In multi-user transaction systems, it is necessary to manage parallel user access to stored data (typically controlled by a DBMS) and to deliver fault tolerance. The ACID principle is of importance (referential integrity and atomicity are illustrated in the sketch after this list):
 Atomicity
 Consistency
 Isolation
 Durability
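Referential integrity and atomicity can be demonstrated with the hedged sketch below, which uses SQLite purely for illustration; the tables and values are hypothetical.

```python
# Referential integrity and atomicity (ACID) sketch using SQLite.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")   # enforce referential integrity
con.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")
con.execute("""CREATE TABLE invoice (
                   id INTEGER PRIMARY KEY,
                   customer_id INTEGER NOT NULL REFERENCES customer(id),
                   amount REAL NOT NULL)""")
con.execute("INSERT INTO customer VALUES (1, 'Acme')")
con.commit()

# Referential integrity test: an invoice pointing at a non-existent customer is rejected.
try:
    con.execute("INSERT INTO invoice VALUES (1, 99, 100.0)")
except sqlite3.IntegrityError as err:
    print("referential integrity violation caught:", err)

# Atomicity test: either both inserts are committed or neither is.
try:
    with con:  # the transaction commits on success or rolls back on error
        con.execute("INSERT INTO invoice VALUES (2, 1, 50.0)")
        con.execute("INSERT INTO invoice VALUES (2, 1, 75.0)")  # duplicate key forces a rollback
except sqlite3.IntegrityError:
    pass

print(con.execute("SELECT COUNT(*) FROM invoice").fetchone()[0])  # 0 - no partial update remains
```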
APPLICATION SYSTEMS TESTING
 Involves analyzing computer application programs, testing computer application program controls, or selecting and monitoring data processing transactions.
 Testing controls by applying appropriate audit procedures is important to ensure their functionality and effectiveness.
 Techniques include: snapshot, mapping, tracing and tagging, test data/deck, base-case system evaluation, parallel operation, integrated test facility, parallel simulation, transaction selection programs, embedded audit data collection (illustrated in the sketch below) and extended records.
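As a rough illustration of one of these techniques, embedded audit data collection, the hypothetical sketch below embeds an audit hook in an application routine so that transactions meeting an auditor-defined criterion are captured as they are processed. The routine, threshold and field names are assumptions, not part of any standard.

```python
# Embedded audit data collection sketch (hypothetical application code).
audit_log = []             # records captured for later review by the IS auditor
AUDIT_THRESHOLD = 10_000   # auditor-defined selection criterion (assumed)

def process_payment(account, amount):
    """Application routine with an embedded audit hook."""
    if amount >= AUDIT_THRESHOLD:   # selection criterion embedded in the application
        audit_log.append({"account": account, "amount": amount})
    # ... normal payment processing would continue here ...
    return f"paid {amount} from {account}"

process_payment("A-100", 2_500)
process_payment("A-200", 25_000)   # captured for audit review
print(audit_log)                   # [{'account': 'A-200', 'amount': 25000}]
```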
I.S AUDITOR’S ROLE IN I.S TESTING
 An IS auditor should be involved in reviewing this phase and should perform the following:
 Review the test plan for completeness and evidence of user participation
 Reconcile control totals (see the sketch below)
 Review error reports
 Verify cyclical processing for correctness
 Verify the accuracy of critical reports
 Review system and end-user documentation to determine completeness and accuracy
 Review unit and system test plans to determine whether tests are planned and performed
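Reconciling control totals, referenced in the list above, could look like the hypothetical sketch below: a record count and an amount total computed over the input batch are compared with the totals reported after processing. The batch and the reported figures are assumed values.

```python
# Control-total reconciliation sketch (hypothetical batch of transactions).
input_batch = [
    {"id": 1, "amount": 100.00},
    {"id": 2, "amount": 250.50},
    {"id": 3, "amount": 75.25},
]

# Control totals computed over the input before processing.
input_count = len(input_batch)
input_total = round(sum(t["amount"] for t in input_batch), 2)

# Totals reported by the system after processing (assumed values for illustration).
processed_count, processed_total = 3, 425.75

if (input_count, input_total) == (processed_count, processed_total):
    print("control totals reconcile")
else:
    print("discrepancy - investigate missing or altered transactions")
```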
CONFIGURATION AND RELEASE MANAGEMENT
 Effective and efficient development and maintenance of IT systems requires that rigorous configuration, change and release management processes be implemented and adhered to within an organization.
 These processes provide systematic, consistent and unambiguous control over the attributes of IT components (hardware, software, firmware and network connectivity).
 Changes to IT systems must be carefully assessed, planned, tested, approved, documented and communicated to minimize any undesirable consequences for the business processes.
 An IS auditor should be aware of the tools available for managing configuration, change and release management, and of the controls in place to ensure segregation of duties (SoD) between development staff and the production environment.
 In a configuration management system, maintenance requests must be formally documented and approved by a change control group.
 In addition, careful control is exercised over each stage of the maintenance process via checkpoints, reviews and sign-off procedures.
 For configuration management to work, management support is critical.
 The process is implemented by developing and following a configuration management plan and operating procedures. The plan should not be limited to just the software developed but should also include all system documentation, test plans and procedures.
CONFIGURATION AND RELEASE MANAGEMENT
 Configuration management tools support change management and release management through:
 identification of items affected by a proposed change, to assist with impact assessment
 recording of configuration items affected by authorized changes
 implementation of changes in accordance with authorization records
 registering of configuration item changes when authorized changes and releases are implemented
 recording of baselines related to releases, to which an organization would revert if an implemented change fails (a simple baseline-recording sketch follows)
 preparation of a release to avoid human errors and resource costs
 From an IS audit perspective, effective use of configuration management software provides important evidence of management's commitment to careful control over the maintenance process.
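A minimal sketch of recording and verifying a release baseline follows, assuming the configuration items are files on disk; the file names are hypothetical, and this is not a substitute for a configuration management tool.

```python
# Configuration baseline sketch: fingerprint each configuration item so a release
# baseline can be verified later (and used as the point to revert to).
import hashlib, json, pathlib

def record_baseline(config_items, baseline_file):
    """Store a SHA-256 digest of each configuration item in a baseline manifest."""
    baseline = {
        str(path): hashlib.sha256(pathlib.Path(path).read_bytes()).hexdigest()
        for path in config_items
    }
    pathlib.Path(baseline_file).write_text(json.dumps(baseline, indent=2))
    return baseline

def verify_baseline(baseline_file):
    """Return the configuration items that no longer match the recorded baseline."""
    baseline = json.loads(pathlib.Path(baseline_file).read_text())
    return [
        path for path, digest in baseline.items()
        if hashlib.sha256(pathlib.Path(path).read_bytes()).hexdigest() != digest
    ]

# Usage (hypothetical file names):
# record_baseline(["app.cfg", "schema.sql"], "release_1_0_baseline.json")
# print(verify_baseline("release_1_0_baseline.json"))  # [] when nothing has drifted
```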
SYSTEM MIGRATION, INFRASTRUCTURE DEPLOYMENT AND
DATA CONVERSION
 Data format, coding, structure and integrity are to be preserved or properly translated.
 A migration scenario must be set up, and a rollback plan needs to be in place (a rollback sketch follows this list).
 The importance of correct results is critical, and success depends on the use of good practices by the development team.
 Source data must be correctly characterized, and the destination database must accommodate all existing data values.
 The resulting data should be carefully tested.
 Steps for the conversion that are developed in the test environment must be recorded so they can be repeated on the production system.
 An IS auditor should ensure that any tools and techniques selected for the process are adequate and appropriate, that data conversion achieves the necessary objectives without data loss or corruption, and that any loss of data is both minimal and formally accepted by user management.
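A minimal sketch of the rollback plan mentioned above, assuming the migration target is a single file (for example a database file) that can be backed up and restored; the paths and function names are hypothetical.

```python
# Migration-with-rollback sketch: back up the target before migrating so the
# rollback plan can restore it if the migration fails.
import pathlib, shutil

def migrate_with_rollback(target_db, backup_dir, migrate_fn):
    """Run migrate_fn against target_db; restore the backup if anything fails."""
    backup = pathlib.Path(backup_dir) / (pathlib.Path(target_db).name + ".bak")
    shutil.copy2(target_db, backup)      # establish the rollback point
    try:
        migrate_fn(target_db)            # perform the recorded migration steps
    except Exception:
        shutil.copy2(backup, target_db)  # rollback plan: restore the original state
        raise
```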
DATA MIGRATION
 A data conversion is required if the source and target systems use different field formats or sizes, file/database structures, or coding schemes.
 The objective of data conversion is to convert existing data into the new required format, coding and structure while preserving the meaning and integrity of the data.
 The data conversion process must provide some means, such as audit trails and logs, to allow for verification of the accuracy and completeness of the converted data (a conversion sketch with an audit trail and exception report follows the full list of steps).
 Steps for a successful data conversion are:
i. determine the data to be converted, whether by programs or manually
ii. perform any necessary data cleansing ahead of conversion
iii. establish the parameters for a successful conversion
iv. schedule the sequence of conversion tasks
v. design audit trail reports to document the conversion
vi. design exception reports to record any items that cannot be converted automatically
DATA MIGRATION
vii. establish responsibility for verifying and signing off on individual conversion steps and accepting the overall conversion
viii. develop and test conversion programs
ix. perform one or more conversion rehearsals
x. control the outsourcing of the conversion process with a proper agreement covering non-disclosure, data privacy, data destruction and other warranties
xi. run the actual conversion with all the necessary personnel onsite or able to be contacted
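A hedged sketch of a data conversion with an audit trail and an exception report, as referenced earlier in this section; the legacy field names, formats and values are hypothetical.

```python
# Data conversion sketch: convert records from a legacy format to the new structure,
# keeping an audit trail of counts and an exception report for records that cannot
# be converted automatically.
legacy_records = [
    {"cust_no": "0001", "dob": "31/12/1980"},
    {"cust_no": "0002", "dob": "1975-06-15"},   # unexpected format -> exception report
]

def convert_record(rec):
    """Map legacy fields to the new structure (DD/MM/YYYY date -> ISO date)."""
    day, month, year = rec["dob"].split("/")
    return {"customer_id": int(rec["cust_no"]), "date_of_birth": f"{year}-{month}-{day}"}

converted, exceptions = [], []
for rec in legacy_records:
    try:
        converted.append(convert_record(rec))
    except ValueError:
        exceptions.append(rec)   # to be converted manually and signed off

# Audit trail: totals must reconcile (input = converted + exceptions).
print(f"input={len(legacy_records)} converted={len(converted)} exceptions={len(exceptions)}")
```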
CHANGEOVER(GO-LIVE OR CUTOVER) TECHNIQUES
1. Parallel changeover - includes running the old system, then running both the old and new systems in parallel, and finally fully changing over to the new system.
Risks/advantages/disadvantages:
2. Phased changeover - the older system is broken into deliverable modules. The first module of the older system is phased out using the first module of the new system. Then the second module of the old system is phased out using the second module of the newer system, and so forth until the last module is phased out.
Risks/advantages/disadvantages:
3. Abrupt changeover - the newer system is changed over from the older system on a cutoff date and time, and the older system is discontinued once the changeover takes place.
SYSTEM IMPLEMENTATION
 Implementation is initiated only after a successful testing phase.
 The system should be installed as per the organization's change control procedures.
 An IS auditor should verify that appropriate sign-offs have been obtained prior to implementation.
 The auditor should also perform the following tasks:
 Review the programmed procedures used for scheduling and running the system, along with the system parameters used in executing the production schedule
 Review all system documentation to ensure its completeness and confirm that recent updates have been incorporated
 Verify all data conversions to ensure that they are correct and complete before implementing the system
IMPLEMENTATION PLANNING
 The newly implemented system will need an efficient support structure.
 Support personnel will need to acquire new skills; the workload must be distributed so that the right people support the right issues; and new processes may need to be developed while respecting the specific requirements of the IT department.
 The objectives of implementation planning are to:
 Provide appropriate support structures for first-, second- and third-line support teams
 Provide a single point of contact
 Provide role and skill definitions with applicable training plans
 To achieve significant success in updating staff on changes to the business process and introducing new software, some important questions should be addressed:
 How can the existing support staff be involved in the setup of the new project without neglecting the currently running system?
 What gap in knowledge/skills must be addressed in the training plan?
 How large is the difference between operation of the current legacy environment and operation of the new platform?
SYSTEM CHANGE PROCEDURES AND PROGRAM MIGRATION
PROCESS
 Following implementation and stabilization, a system enters the ongoing development or maintenance stage.
 This phase continues until the system is retired. It involves the activities required either to correct errors in the system or to enhance its capabilities.
 An IS auditor should consider the following:
 Existence and use of a methodology for authorizing, prioritizing and tracking system change requests from the user
 Whether emergency change procedures are addressed in the operations manuals
 Whether change control is a formal procedure for the user and development groups
 Whether the change control log ensures that all changes shown were resolved (see the sketch below)
 The user's satisfaction with the turnaround (timeliness and cost) of change requests
 The adequacy of the security access restrictions over production source and executable modules, etc.
 Additionally, the auditor should review the overall change management process for possible improvements in acknowledgement, response time, response effectiveness and user satisfaction with the process.
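A hypothetical sketch of reviewing a change control log, as referenced above: entries lacking approval or not yet resolved are flagged for follow-up. The log structure and field names are assumptions.

```python
# Change control log review sketch (hypothetical log entries).
change_log = [
    {"id": "CHG-101", "approved_by": "CAB", "status": "closed"},
    {"id": "CHG-102", "approved_by": None,  "status": "closed"},   # unapproved change
    {"id": "CHG-103", "approved_by": "CAB", "status": "open"},     # unresolved change
]

unapproved = [c["id"] for c in change_log if not c["approved_by"]]
unresolved = [c["id"] for c in change_log if c["status"] != "closed"]

print("changes lacking approval:", unapproved)   # ['CHG-102']
print("changes not yet resolved:", unresolved)   # ['CHG-103']
```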
SYSTEM CHANGE PROCEDURES AND PROGRAM MIGRATION
PROCESS
Critical success factors
CSFs of planning the implementation include:
 To avoid delays, appropriately skilled staff must attend workshops and participate for the entire project duration.
 The documentation needed for carrying out the work needs to be ready at project initiation.
 Decision-makers must be involved at all steps to ensure that all necessary decisions can be made.
SYSTEM CHANGE PROCEDURES AND PROGRAM MIGRATION
PROCESS
End user training
 The goal of a training plan is to ensure that the end user can become self-sufficient in the operation of the system.
 Planning for end-user training ensures that training is considered and a training project plan is created early in the development process.
 The training should be piloted using a cross-section of users to determine how best to customize it for different user groups.
 The timing of training is important. If training is delivered too early, users will forget much of it before the system goes live; if it is delivered too late, there will not be enough time to obtain feedback from the pilot group.
 Training format and delivery mechanisms may include:
 Case studies, role-based training, lecture and breakout sessions, modules at different experience levels, practical sessions, etc.
CERTIFICATION/ACCREDITATION
 Certification is a process by which an assessor organization performs a comprehensive assessment of the management, operational and technical controls in an information system against a standard.
 The goal is to determine the extent to which the controls are implemented correctly, operating as intended and producing the desired outcome with respect to meeting the system's security requirements.
 Accreditation is the official management decision to authorize operation of an information system and to explicitly accept the risk to the organization's operations, assets or individuals based on the implementation of an agreed-upon set of requirements and security controls.
 By accrediting an information system, a senior official accepts responsibility for the security of the system and is fully accountable for any adverse impact on the organization if a breach of security occurs.
POST-IMPLEMENTATION REVIEW
 Projects should be formally closed to provide accurate information on project results,
improve future projects and allow an orderly release of resources.
 The closure process should determine whether the project objectives were met or excused, and identify lessons learned.
 A post-implementation review is carried out weeks or months after completion, when the major benefits and shortcomings of the implemented solution will have been realized.
 The review is part of the benefits realization process and includes an estimate of the project's overall success and impact on the business.
 A post-implementation review should consider both technical details and the process that was followed in the course of the project, including:
 Adequacy of the system
 Projected cost versus benefits or ROI measurements (a simple ROI sketch follows this list)
 Recommendations that address any system inadequacies and deficiencies
 Plan for implementing any recommendations
 Assessment of the development project process
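A simple sketch of the projected-versus-actual cost/benefit (ROI) comparison referenced in the list above; all figures are hypothetical.

```python
# Cost-benefit / ROI sketch for a post-implementation review (hypothetical figures).
projected_cost, projected_benefit = 200_000, 320_000
actual_cost, actual_benefit = 240_000, 300_000

def roi(benefit, cost):
    """Return on investment expressed as a percentage of cost."""
    return round((benefit - cost) / cost * 100, 1)

print(f"projected ROI: {roi(projected_benefit, projected_cost)}%")  # 60.0%
print(f"actual ROI:    {roi(actual_benefit, actual_cost)}%")        # 25.0%
```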
POST-IMPLEMENTATION REVIEW
 Closing a project is a formal process that focuses on capturing lessons learned for
future use.
 Project closeout steps:
 Assign responsibility for outstanding issues
 Assign custody of contracts, and either archive documentation or pass it on to those
who need it
 Conduct a post-implementation review with the project team, development team, users and other stakeholders
 Document any risk that was identified in the course of the project
 Complete a second post-implementation review after the project deliverables have
been completed
 An I.S auditor should perform the following functions:
 Determine if the system’s objectives and requirements were achieved.
 Determine if the cost benefits identified in the feasibility study are being measured,
analysed and accurately reported to management.
POST-IMPLEMENTATION REVIEW
 An I.S auditor should perform the following functions:
 Review program change requests performed to assess the type of changes required of
the system.
 Review controls built into the system to ensure that they are operating according to
design.
 Review operator’s error logs to determine if there are any resource or operating
problems inherent within the system.
 Review input and output control balances and reports to verify that the system is
processing data accurately.
