CIE ICT Chapter 7 Notes
CHAPTER 7
Systems Life Cycle
Analysis
Stages of the Systems Life Cycle:
1. Analysis
2. Design
3. Development and Testing
4. Implementation
5. Documentation
6. Evaluation
The current system is studied using four different techniques:
• Observation:
o involves watching users interact with the current system to understand how it works
in practice
• Interviews:
o direct one-to-one conversations with users on their experience with the current system
• Questionnaires:
o set of predetermined questions is given to the users to complete and give their opinion
o Advantages: allows for quantitative analysis, efficient data collection, questions can be
answered quickly
• Examination of existing documents:
o reviewing paperwork, files, and records used with the current system
o Advantages: provides insights into the system's history, can reveal previously unknown
issues
• Data inputted, processed, and outputted into the system are identified.
• Problems with the current system are identified. What could be improved?
• The requirements of the user and the potential new system are identified. What is the
new system expected to do?
• Once the systems analysts have completed the analysis stage, they produce a
requirements specification for the new system.
• The next step will be to design a new system (normally computer-based) to resolve the
problems identified.
System Specification
• It is vital to identify the hardware needed for the new system, e.g.:
o barcode readers,
o scanners,
o touch screens,
o 3D printers,
o monitors,
o speakers.
• It is also vital to identify the software needed for the new system, e.g.:
o operating system,
o applications software,
o size of storage,
o type of storage.
Design
Once the analysis has taken place and the systems analyst has some idea of the scale
of the problem and what needs to be done, the next stage is to design the key parts of
the new system.
File/Data Structures
• Data type: specifies the kind of data that can be stored in a field, e.g., text, numbers,
dates
• Coding of data: using codes to represent data, e.g. M for male, F for female
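The "coding of data" idea above can be sketched in a few lines of Python (the notes do not specify a language; the record and function names here are illustrative, only the M/F codes come from the notes):

```python
# Coding of data: store short codes instead of full values.
# Codes save storage space and are quicker to validate and type.

GENDER_CODES = {"M": "Male", "F": "Female"}

def decode_gender(code: str) -> str:
    """Translate a stored code back to its full value for output."""
    if code not in GENDER_CODES:
        raise ValueError(f"Unknown gender code: {code!r}")
    return GENDER_CODES[code]

record = {"name": "A. Student", "gender": "M"}  # coded data as stored
print(decode_gender(record["gender"]))          # full value as displayed
```

A validation routine can reject any entry that is not one of the defined codes, which is one reason coded fields are easy to check.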
Input Formats
• Data capture forms: These are designed to collect data from users in a structured
format. They come in two types: paper-based and electronic-based. Paper-based data
capture forms need clearly defined character and information fields, checkboxes, and
enough writing space. Text boxes, on-screen help, drop-down menus, radio buttons,
automatic validation, and control buttons are features of electronic data capture forms.
Output Formats
• Screen layouts and printed report layouts are designed to present the system's results
clearly
Validation Routines
• Validation checks that the data entered into the system meets specific requirements.
It is a routine check that the computer does as part of its programming.
• Identification and removal of errors, thus improving system reliability and performance.
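A minimal sketch of such a validation routine in Python, using the "months in a year" example that the notes use later (the function name is hypothetical):

```python
# Validation routine: a type check plus a range check on a month number.
# The computer runs this automatically whenever the field is entered.

def is_valid_month(value: str) -> bool:
    """Return True only if value is a whole number from 1 to 12."""
    if not value.isdigit():       # type check: rejects "March", "3.5", ""
        return False
    return 1 <= int(value) <= 12  # range check: rejects "0", "13"

print(is_valid_month("7"))     # True  - passes both checks
print(is_valid_month("13"))    # False - fails the range check
print(is_valid_month("July"))  # False - fails the type check
```

In a real system the routine would also report a helpful error message so the user can correct the entry and try again.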
Test Designs
• Test data structures, file structures, input formats, output formats, and validation
routines
Test Strategies
• Test the whole system: confirm overall system performance and integration
Test Plan
The following data types will be explained using the example of months in a year.
• Normal data: valid and expected data values within the range of acceptability, e.g.
any whole number from 1 to 12
• Extreme data: data at the limits of the acceptable range, e.g. 1 and 12
• Abnormal data: invalid or unexpected data values. This can either be data outside the
acceptable range (e.g. 0 or 13) or data of the wrong type (e.g. "March" where a
number is expected)
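A test plan of this kind can be sketched as a small table of test cases in Python (illustrative only; `accept_month` stands in for the system being tested):

```python
# A simple test plan for the "months in a year" example: each test case
# records the category of test data, the input, and the expected result.

def accept_month(value):
    """System under test: accepts whole numbers from 1 to 12."""
    return isinstance(value, int) and 1 <= value <= 12

test_plan = [
    ("normal",   6,       True),   # well inside the valid range
    ("extreme",  1,       True),   # boundary of acceptability
    ("extreme",  12,      True),   # other boundary
    ("abnormal", 13,      False),  # outside the range
    ("abnormal", "March", False),  # wrong data type
]

for category, data, expected in test_plan:
    actual = accept_month(data)
    status = "PASS" if actual == expected else "FAIL"
    print(f"{status}: {category} data {data!r} -> {actual}")
```

Recording the expected outcome before running each test is what makes this a plan rather than ad hoc testing: any mismatch between expected and actual output flags a fault to investigate.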
Implementation
Once the system has been thoroughly tested, it can be fully implemented. Four popular
techniques are used to transition from the old system to the new one.
Before selecting the approach best suited for a given application, the pros and cons of
each method must be considered.
4 Methods of Implementation
1. Direct changeover:
• The old system is stopped completely and replaced by the new system immediately
• Advantages
o fast implementation
• Disadvantages
o no fallback
2. Parallel Running
• Both current and new systems run simultaneously for a period before the old system is
phased out
• Advantages
o Lower risk
• Disadvantages
o Time-consuming
o resource-intensive
3. Pilot Running
• The new system is trialled in one branch or department before full
implementation
• Advantages
o any problems are confined to the pilot area
• Disadvantages
o Slower implementation
o potential inconsistencies
4. Phased Implementation
• The new system is implemented in stages, with each stage replacing a part of the old
system
• Advantages
o reduced risk
o easier to manage
• Disadvantages
o Takes longer
Documentation
• In the life cycle of a system, documentation enables the correct recording of its
design, goals, and components
Technical Documentation
Instruction and guidance for systems analysts and programmers on how the system
works, used when maintaining or updating it.
• Purpose of the system: Explanation of the system's intended function and goals
• Hardware & software requirements: Necessary equipment and software to run the
system
• File structures: Organization and layout of the system's files and data
• List of variables: Collection of variables used within the system, including their names
and purposes
• Input format: Structure and format for entering data into the system
• Output format: Structure and format for presenting data generated by the system
• Sample runs/test runs: Examples of system operation, including input and expected
output
• Validation routines: Techniques used to check and confirm the accuracy of data
User Documentation
Instruction and guidance for end-users on how to operate the system. Used to help
users learn the system and resolve common problems.
• Purpose of the system: Explanation of the system's intended function and goals
• Hardware & software requirements: Necessary equipment and software to run the
system
• How to use peripheral devices
• How to log in and out of the system
• Input format: Structure and format for entering data into the system
• Output format: Structure and format for presenting data generated by the system
• Sample runs: Examples of system operation, including input and expected output
• Error handling: Steps to resolve issues and errors within the system
• Troubleshooting guide: Help with solving common
problems
• Glossary of Terms: Definitions of key terms and concepts related to the system
Evaluation
Evaluation measures a system's productivity, efficiency, and compliance with its goals to identify
its strengths, shortcomings, and potential development areas. This assessment informs
decision-making and improves overall performance over the course of a system's life
cycle.
• Analyse the system's efficiency in time, money, and resource use. Examine whether the
new system completes tasks more quickly and at lower cost than the old one.
o Identify areas that may be consuming excessive resources or time and suggest ways to
optimize them
• Questions to ask:
o Does the new system save time, money, or resources compared with the old one?
• Look at the solution's usability and accessibility for its intended users. Check whether
the system is simple to understand and use and whether users can complete their tasks
without difficulty.
o Describe the user interface and how it facilitates interaction with the system
o Mention any feedback from users regarding their experience with the system and
any suggestions they have made
• Questions to ask:
o Are all the users able to use the system and make bookings easily?
o Are all the users able to change and cancel bookings easily?
o Can all staff understand how to use the system with minimal training?
• Examine how well the implemented solution satisfies the desired outcome by
comparing it against the original objectives and requirements.
o Outline the initial objectives of the system and discuss how the solution addresses each
one
o Highlight any requirements that may not have been fully met and discuss possible
improvements
• Questions to ask:
o Does the solution meet every requirement identified during analysis?
• Collect users' responses to the results of testing the system. Their feedback can provide
insights into potential issues and improvements and help determine overall user
satisfaction.
o Summarise the testing process, including the test data used, expected outcomes, and
actual outcomes.
o Discuss users' reactions to the system, addressing any concerns or suggestions they
may have
• Based on the analysis of efficiency, ease of use, appropriateness, and user feedback,
conclusions are drawn and any necessary improvements to the system are recommended.