Chapter 7 Notes

Systems Life Cycle

Stages of the Systems Life Cycle:

1. Analysis
2. Design
3. Development & Testing
4. Implementation
5. Documentation
6. Evaluation

Analysis
Analysis of the Current System
The current system is studied using four different techniques:

• Observation:
o involves watching users as they interact with the current system to understand how it works
o Advantages: provides first-hand, unbiased information
o Disadvantages: can be time-consuming, may not reveal all issues

• Interviews:
o direct one-to-one conversations with users about their experience with the current system
o used to gather detailed information from individual users
o Advantages: allows for in-depth exploration of issues
o Disadvantages: relatively expensive, time-consuming, and offers no anonymity, which may affect responses

• Questionnaires:
o a set of predetermined questions given to users to complete, gathering their opinions on the current system
o mainly used to collect data from a larger group of people
o Advantages: allows for quantitative analysis, efficient data collection, questions can be answered quickly
o Disadvantages: limited by the predetermined questions, may suffer from low response rates, users may exaggerate answers because responses are anonymous

• Examination of existing documents:
o reviewing system documentation, user guides, or reports
o used to understand the current system's design and any known issues
o Advantages: provides insights into the system's history, can reveal previously unknown issues
o Disadvantages: documents may be outdated or incomplete, and the review can be time-consuming and rather expensive
Record and Analyse Information about the Current System

Identifying key aspects of the current system

• The data input into, processed by, and output from the system is identified.
• Problems with the current system are identified. What could be improved?
• The requirements of the user and the potential new system are identified. What is the new system meant to do?
• Problems: issues that users face with the current system
• User requirements: what the new system needs to provide
• Information requirements: data or information the new system must process
New System Requirements Specification:

• Once the systems analyst has completed the analysis stage of the systems life cycle, they should be fully aware of the current system's limitations.
• The next step is to design a new system (normally computer-based) to resolve the problems identified by the users and the systems analyst.
• A Requirements Specification is then created, outlining the required improvements and expectations for the new system.
System Specification

Hardware and Software Selection

• It is vital to identify the suitable hardware needed for the new system
o considering system requirements, compatibility, and costs
o justifying choices based on user needs and system performance
• Hardware that needs to be considered includes:
o barcode readers
o scanners
o touch screens
o 3D printers
o monitors
o speakers
• It is equally important to identify the suitable software needed for the new system
o considering functionality, compatibility, and ease of use
o justifying choices based on user requirements and system efficiency
• Software and storage factors that need to be considered include:
o operating system
o applications software
o size of storage
o type of storage
Design
Once the analysis has taken place and the systems analyst has some idea of the scale of
the problem and what needs to be done, the next stage is to design the critical parts of the
recommended system.
File/Data Structures

• Field length: the number of characters allowed in a field
• Field name: an identifier for the field in the data structure
• Data type: specifies the kind of data that can be stored in a field, e.g. text, numbers, dates
• Coding of data: using codes to represent data, e.g. M for male, F for female
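A hypothetical example, not part of the original notes, of how these design decisions might be recorded for a simple membership file:

Field name | Data type | Field length | Coding of data
MemberID   | Number    | 6            | -
FirstName  | Text      | 20           | -
Gender     | Text      | 1            | M = male, F = female
JoinDate   | Date      | 10           | DD/MM/YYYY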
Input Formats

Data capture forms: these are designed to collect data from users in a structured format. They come in two types: paper-based and electronic. Paper-based forms must be carefully designed with headings, concise instructions, character and information fields, checkboxes, and enough writing space. Electronic forms typically provide text boxes, on-screen help, drop-down menus, radio buttons, automatic validation, and control buttons for data entry.
For either type, consider a user-friendly layout, clear instructions, and appropriate data fields.
Output Formats
• Screen layouts: how information is presented to users on a screen
• Report layouts: how information is organized in a printed or digital report
• Consider readability, visual appeal, and efficient use of space
Validation Routines

Validation is a method of examining data submitted to a computer to determine whether it meets specific requirements. It is a routine check that the computer carries out as part of its programming. A short sketch after this list illustrates how a few of these checks might be written.

1. Range check: ensures data is within a specified range of values
2. Character check: ensures data contains only allowed characters
3. Length check: ensures data is of a specified length
4. Type check: ensures data is of the correct data type
5. Format check: ensures data conforms to a specific format
6. Presence check: ensures data is present and not left blank
7. Check digit: a digit added to a number to verify its accuracy
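
The following is a minimal Python sketch, not part of the original notes, showing how some of these routines might be implemented. The month example, the 1-12 range, and the simple modulo-10 check-digit scheme are assumptions chosen for illustration; real systems often use more involved schemes such as the ISBN-13 algorithm.

def range_check(value, low, high):
    # Range check: data must fall within a specified range of values
    return low <= value <= high

def length_check(text, max_length):
    # Length check: data must not exceed a specified length
    return len(text) <= max_length

def type_check(text):
    # Type check: data must be a whole number
    return text.lstrip("-").isdigit()

def presence_check(text):
    # Presence check: the field must not be left blank
    return text.strip() != ""

def check_digit_ok(code):
    # Check digit: the last digit must match a value derived from the
    # other digits (a simple modulo-10 sum, illustrative only)
    digits = [int(d) for d in code]
    return sum(digits[:-1]) % 10 == digits[-1]

# Example: validating a month entered as text
month = "7"
valid = (presence_check(month)
         and type_check(month)
         and range_check(int(month), 1, 12))
print("Accepted" if valid else "Rejected")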

Development and Testing


• Guarantees the system's functionality before it is put into use
• Identifies and removes errors, improving system reliability and performance
Test Designs

• Test data structures, file structures, input formats, output formats, and validation routines
• Ensure all components function correctly and interact seamlessly
Test Strategies
• Test each module: verify individual components function as intended
• Test each function: ensure all features work correctly
• Test the whole system: confirm overall system performance and integration
Test Plan

• Test data: specific data used for testing purposes
• Expected outcomes: predicted results based on the test data
• Actual outcomes: results obtained from testing
• Remedial action: steps taken to fix identified issues

Test Data Types

The following data types are explained using the example of months in a year; a worked test plan follows the list.
• Normal data: valid, expected data values within the range of acceptability, with an expected outcome. E.g. any whole number between 1 and 12.
• Abnormal data: invalid or unexpected data values. This can be either:
o data outside the range of acceptability, or
o data of the wrong data type
o In this case, examples could be:
- any value less than 1 (e.g. 0, -6)
- any value greater than 12 (e.g. 13, 15)
- letters or other non-numeric data (e.g. July)
- non-integral values (e.g. 3.5, 4.2)
• Extreme data: values at the limits of acceptability (e.g. 1 or 12)
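
As a worked example, not part of the original notes, a test plan for this month field might look like the table below. The expected outcomes follow from the 1-12 rule above; the actual outcomes and remedial actions would be filled in during testing.

Test data | Type of data | Expected outcome
7         | Normal       | Accepted
1         | Extreme      | Accepted
12        | Extreme      | Accepted
0         | Abnormal     | Rejected
13        | Abnormal     | Rejected
July      | Abnormal     | Rejected
3.5       | Abnormal     | Rejected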
What is live data?

• Data that has been used with the current system
• Hence, the results are already known
Implementation
The system must be fully implemented once it has been thoroughly tested. We will now look more carefully at switching over to the new system. Four common methods are used to transition from the old system to the new one. Before selecting the approach best suited to a given application, the pros and cons of each must be carefully considered.

4 Methods of Implementation

1. Direct changeover:

• The old system is replaced by the new system immediately
• Used when quick implementation is necessary
• Advantages
o fast implementation
o cost-effective, as only one system is in operation
• Disadvantages
o high risk of failure
o no fallback if the new system fails
o users cannot be trained on the new system before it goes live
2. Parallel Running

• Both the current and new systems run simultaneously for a period before the old system is phased out
• Used when a smooth transition with minimal risk is required
• Advantages
o lower risk, as the old system remains available as a fallback
o easy comparison between the two systems
• Disadvantages
o time-consuming
o resource-intensive, as two systems run at once
3. Pilot Running

• The new system is implemented in a small, controlled environment before full-scale implementation
• Used to test the new system in a real-world setting
• Advantages
o low risk, as it is only trialled in one department/centre/branch
o allows for fine-tuning
o staff have time to train on the new system
o few errors, as it has been fully tested
• Disadvantages
o slower implementation
o potential inconsistencies
o confusion, as two systems are in use
o no backup for the department/centre/branch using the new system
4. Phased Implementation

• The new system is implemented in stages, with each stage replacing a part of the old system
• Used when a gradual transition is preferred to minimize disruption
• Advantages
o reduced risk
o easier to manage
• Disadvantages
o takes longer
o potential compatibility issues between old and new parts
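
The advantages and disadvantages above can be summarised side by side; this table is a rough comparison drawn from the points listed, not part of the original notes.

Method                | Risk   | Speed of changeover | Relative cost
Direct changeover     | High   | Fastest             | Lowest
Parallel running      | Lowest | Slow                | Highest (two systems run at once)
Pilot running         | Low    | Moderate            | Moderate
Phased implementation | Low    | Slowest             | Moderate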

Documentation
• In the life cycle of a system, documentation enables the correct recording of design, implementation, testing, and maintenance information, facilitating effective communication, troubleshooting, and potential future improvements.

Technical Documentation

Detailed information on the system's inner workings and programming for developers and IT staff. Used to maintain, repair, and update the system with improvements.
• Purpose of the system/program: Explanation of the system's intended function and goals
• Limitations: Known constraints or issues with the system
• Program listing: The code or scripts used in the system
• Program language: The programming language used to develop the system
• Program flowcharts/algorithms: Visual representations or descriptions of the system's logic and processes
• System flowcharts: Visual representations of the interactions between system components
• Hardware & software requirements: Necessary equipment and software to run the system
• File structures: Organization and layout of the system's files and data
• List of variables: The variables used within the system, including their names and purposes
• Input format: Structure and format for entering data into the system
• Output format: Structure and format for presenting data generated by the system
• Sample runs/test runs: Examples of system operation, including input and expected output
• Validation routines: Techniques used to check and confirm the accuracy of data entered into the system
User Documentation

Instructions and guidance for end users on how to operate the system. Used to help users use the system effectively and overcome problems.
• Purpose of the system: Explanation of the system's intended function and goals
• Limitations: Known constraints or issues with the system
• Hardware & software requirements: Necessary equipment and software to run the system
• Loading/running/installing software: Instructions for setting up the system on user devices
• Saving files: Procedures for storing data within the system
• Printing data: Steps to produce hard copies of system data
• Adding records: Instructions for creating new entries in the system
• Deleting/editing records: Guidelines for modifying or removing existing entries in the system
• Input format: Structure and format for entering data into the system
• Output format: Structure and format for presenting data generated by the system
• Sample runs: Examples of system operation, including input and expected output
• Error messages: Explanations of system warnings and error notifications
• Error handling: Steps to resolve issues and errors within the system
• Troubleshooting guide/helpline: Assistance for diagnosing and addressing common problems
• Frequently Asked Questions: Answers to common user inquiries
• Glossary of Terms: Definitions of key terms and concepts related to the system
Evaluation
Evaluation measures a system's productivity, efficiency, and compliance with its goals to identify its strengths, shortcomings, and potential areas for development. This assessment informs decision-making and improves overall performance over the course of a system's life cycle.
Assess the Efficiency of the Solution

• Analyse the system's efficiency in terms of time, money, and resource use. Examine whether the system is performing at its best or whether its efficiency could be increased.
o Provide examples of specific aspects that contribute to the system's efficiency
o Identify areas that may be consuming excessive resources or time and suggest ways to optimize them
• Questions to ask:
o Does it operate more quickly than the previous system?
o Does it reduce the staff time spent making bookings?
o Does it reduce staff costs?
Evaluate the Ease of Use

• Look at the solution's usability and accessibility for the target users. Check whether the system is simple to understand and use, and whether users have any trouble completing their tasks.
o Describe the user interface and how it facilitates interaction with the system
o Mention any feedback from users regarding their experience with the system and address any issues they encountered
• Questions to ask:
o Are all users able to use the system and make bookings easily?
o Are all users able to change and cancel bookings easily?
o Can all staff understand how to use the system with minimal training?

Determine the Suitability of the Solution

• Examine how well the implemented solution satisfies the desired outcome by comparing it against the original task requirements.
o Outline the initial objectives of the system and discuss how the solution addresses each one
o Highlight any requirements that may not have been fully met and discuss possible reasons for this
• Questions to ask:
o Is the system suitable for each of the departments?
o Does it meet the needs of the customers?
o Does it meet the needs of the staff?
o Does the solution match the original requirements?
Collect and Examine User’s Feedback

• Collect users' responses to the results of testing the system. Their feedback can reveal potential issues and improvements and help determine overall user satisfaction.
o Summarise the testing process, including the test data and the expected and actual outcomes
o Discuss users' reactions to the system, addressing any concerns or suggestions they may have
Identify Limitations and Suggest Necessary Improvements

• Based on the analysis of efficiency, ease of use, suitability, and user feedback, identify any limitations of the system and suggest necessary improvements
o List the limitations and provide an explanation for each one
o Recommend specific changes or enhancements to address these issues