Software Engineering
SOFTWARE
1.0 Introduction: The computer system has two major components, namely hardware and
software. The hardware component is physical (it can be touched or held). The non-physical
part of the computer system is the software. Just as the human voice is non-physical and yet
essential to a person's performance, so the software is essential to the computer system. In
this unit, the categories of software are examined.
2.0 Objectives: By the end of this unit, you should be able to: • Define what software is •
Differentiate between system, application and programming software • Explain the role
of system software.
3.0 Definition of software: Computer software is a general name for all forms of
programs. A program itself is a sequence of instructions which the computer follows to
perform a given task.
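As an illustration, here is a minimal sketch of such a program, written in Java (the class
name and the outputs are ours, chosen purely for illustration):

public class Greeting {
    // Each statement is one instruction the computer follows in sequence.
    public static void main(String[] args) {
        System.out.println("Hello!");   // instruction 1: display a message
        System.out.println(7 * 6);      // instruction 2: compute and display a product
    }
}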
3.1 Types of software: Software can be categorized into three major types namely: system
software, programming software and application software.
3.1.2 System software: System software helps to run the computer hardware and the entire
computer system. It includes the following:
• device drivers
• operating systems
• servers
• utilities
• windowing systems
The function of system software is to insulate the applications programmer from the details
of the particular computer complex being used, including such peripheral devices as
communications equipment, printers, readers, displays and keyboards, and also to partition
the computer's resources, such as memory and processor time, in a safe and stable manner.
3.1.3 Programming software: Programming software provides tools that help a programmer
write programs and other software in different programming languages in a more
convenient way. The tools include:
• compilers
• debuggers
• interpreters
• linkers
• text editors
3.1.4 Application software: Application software is the class of software which the user of
a computer needs in order to accomplish one or more definite tasks. Common applications
include the following:
• industrial automation
• business software
• computer games
• quantum chemistry and solid state physics software
• telecommunications (i.e., the internet and everything that flows on it)
• databases
• educational software
• medical software
• military software
• molecular modeling software
• photo-editing
• spreadsheet
• word processing
• decision-making software
3.2 Waterfall Model
Fig 2: Waterfall Life Cycle (phases: Requirements → Design → Implementation & Unit
Testing → Integration & System Testing → Operation)
Source: http://codebetter.com/blogs/raymond.lewallen/archive/2005/07/13/129114.aspx.
3.2.1 Advantages
• Simple and easy to use. • Easy to manage due to the rigidity of the model – each phase
has specific deliverables and a review process. • Phases are processed and completed one
at a time. • Works well for smaller projects where requirements are very well understood.
3.2.2 Disadvantages
• Adjusting scope during the life cycle can kill a project • No working software is produced
until late during the life cycle. • High amounts of risk and uncertainty. • Poor model for
complex and object-oriented projects. • Poor model for long and ongoing projects. • Poor
model where requirements are at a moderate to high risk of changing.
3.3 V-Shaped Model
Just like the waterfall model, the V-shaped life cycle is a sequential path of execution of
processes: each phase must be completed before the next phase begins. Testing is
emphasized in this model more than in the waterfall model. The testing procedures are
developed early in the life cycle, before any coding is done, during each of the phases
preceding implementation.
Requirements begin the life cycle model, just like in the waterfall model. Before development
is started, a system test plan is created. The test plan focuses on meeting the functionality
specified during requirements gathering.
The high-level design phase focuses on system architecture and design. An integration test
plan is also created in this phase, in order to test the ability of the pieces of the software
system to work together.
The low-level design phase is where the actual software components are designed, and unit
tests are created in this phase as well.
The implementation phase is, again, where all coding takes place. Once coding is
complete, the path of execution continues up the right side of the V where the test plans
developed earlier are now put to use.
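As a sketch of what "test plans developed early" can look like in practice, the unit test below
is written against a module's interface during low-level design, before the module body is
finalized. It assumes the JUnit 5 testing framework; TemperatureConverter is a hypothetical
class invented for this illustration:

import static org.junit.jupiter.api.Assertions.assertEquals;
import org.junit.jupiter.api.Test;

// Hypothetical module under test: during low-level design only its
// interface is fixed; the body is written later to make the tests pass.
class TemperatureConverter {
    static double toFahrenheit(double celsius) {
        return celsius * 9.0 / 5.0 + 32.0;
    }
}

class TemperatureConverterTest {
    @Test
    void freezingPointConverts() {
        assertEquals(32.0, TemperatureConverter.toFahrenheit(0.0), 1e-9);
    }

    @Test
    void boilingPointConverts() {
        assertEquals(212.0, TemperatureConverter.toFahrenheit(100.0), 1e-9);
    }
}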
3.3.1 Advantages
• Simple and easy to use. • Each phase has specific deliverables. • Higher chance of
success over the waterfall model due to the development of test plans early on during the
life cycle. • Works well for small projects where requirements are easily understood.
3.3.2 Disadvantages
• Very rigid, like the waterfall model. • Little flexibility and adjusting scope is difficult and
expensive. • Software is developed during the implementation phase, so no early
prototypes of the software are produced. • Model doesn’t provide a clear path for problems
discovered during testing phases.
3.4 Incremental Model
The incremental model is an intuitive refinement of the waterfall model: a kind of
"multi-waterfall" cycle in which multiple development cycles take place. Cycles are
broken into smaller, more easily managed iterations. Each iteration goes through
the requirements, design, implementation and testing phases.
The first iteration produces a working version of the software, which makes it possible to
have working software early in the software life cycle. Subsequent iterations build on
the initial software produced during the first iteration.
Fig: Incremental Life Cycle Model
Unit 3 Modularity
1.0 Introduction
In Unit 2 we discussed software life cycle models in general and also examined in detail the
requirements and design phases of software development. In this unit we will look at
modularity in programming.
2.0 Objectives
By the end of this unit, you should be able to: • Define modularity • Differentiate between
logical and physical modularity • Explain the benefits of modular design • Explain approaches
to writing modular programs • Explain the criteria for using modular design • Outline the
attributes of a good module • Outline the steps to creating effective modules • Differentiate
between the top-down and bottom-up programming approaches
3.0 What is Modularity?
Modularity is a general systems concept: the degree to which a system's
components may be separated and recombined. It refers both to the tightness of coupling
between components and to the degree to which the "rules" of the system architecture enable
(or prohibit) the mixing and matching of components.
The concept of modularity in computer software has been promoted for about five decades.
In essence, the software is divided into separately named and addressable components
called modules that are integrated to satisfy the problem requirements. It is important to note
that a reader cannot easily understand a large program written as a single module: the number
of variables, control paths and sheer complexity make understanding almost impossible. As a
result, a modular approach allows the software to remain intellectually manageable.
However, it is important to note that software cannot be subdivided indefinitely so as to
make the effort required to understand or develop it negligible. Although the effort to develop
each individual module falls as the number of modules grows, the effort required to integrate
and interface the modules rises, so beyond some point further subdivision increases the
overall cost.
3.14 Logical Modularity
Generally, in software, modularity can be categorized as logical or physical. Logical
modularity is concerned with the internal organization of code into logically-related units.
In modern high-level languages, logical modularity usually starts with the class, the
smallest code group that can be defined. In languages such as Java and C#, classes can be
further combined into packages, which allow developers to organize code into groups of
related classes. Depending on the environment, a module can be implemented as a single
class, several classes in a package, or an entire API (a collection of packages). Regardless
of the implementation scale of your module, you should be able to describe its functionality
in a single sentence (e.g. "this module calculates tax per zip code"). Your module should
expose its functionality as simple interfaces that shield callers from all implementation
details: the functionality of a module should be accessible through a published interface
that allows the module to expose its functionality to the outside world while hiding its
implementation details.
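A minimal Java sketch of this idea, using a hypothetical tax module invented for
illustration (each public type would live in its own file within the same package): callers
see only the published interface, never the implementation class:

// File TaxCalculator.java: the published interface callers depend on.
public interface TaxCalculator {
    double taxFor(String zipCode, double amount);
}

// File FlatRateTaxCalculator.java: package-private implementation,
// hidden from code outside the module.
final class FlatRateTaxCalculator implements TaxCalculator {
    private static final double RATE = 0.07; // assumed rate, illustration only

    public double taxFor(String zipCode, double amount) {
        return amount * RATE;
    }
}

// File TaxModule.java: the factory through which the module
// exposes its functionality.
public final class TaxModule {
    private TaxModule() {}

    public static TaxCalculator newCalculator() {
        return new FlatRateTaxCalculator();
    }
}

Swapping in a per-zip-code implementation later would not affect any caller, since callers
depend only on the TaxCalculator interface.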
3.15 Physical Modularity
Physical modularity is probably the earliest form of modularity introduced in software
creation. It consists of two main components, namely: (1) a file that contains compiled code
and other resources, and (2) an executing environment that understands how to execute the
file. Developers build and assemble their modules into compiled assets that can be
distributed as single or multiple files. In Java, for example, the jar file is the unit of physical
modularity for code distribution (.NET has the assembly). The file and its associated
meta-data are designed to be loaded and executed by a runtime environment that
understands how to run the compiled code. Physical modularity can also be affected by the
context and scale of abstraction. Within Java, for instance, the developer community has
created and accepted several physical modularity strategies to address different aspects of
enterprise development: (1) WAR for web components, (2) EJB for distributed enterprise
components, (3) EAR for enterprise application components, and (4) vendor-specific
modules such as the JBoss Service Archive (SAR). These are usually variations of the JAR
file format with special meta-data to target the intended runtime environment. The current
trend of adoption seems to be pointing to OSGi as a generic physical module format. OSGi
provides the Java environment with additional functionality that should allow developers to
model their modules to scale from small embeddable components to complex enterprise
components (a lofty goal indeed).
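As a sketch of how a physical module carries such meta-data, the OSGi bundle manifest
below (hypothetical package names, invented for illustration) gives the jar an identity and
version, exports only the published package, and leaves the implementation package
unlisted so the runtime keeps it private to the bundle:

Bundle-ManifestVersion: 2
Bundle-SymbolicName: com.example.tax
Bundle-Version: 1.0.0
Export-Package: com.example.tax
Import-Package: org.osgi.framework

Because com.example.tax.internal is not named in Export-Package, other bundles cannot
load classes from it, enforcing the published-interface discipline at the physical level.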
3.16 Benefits of Modular Design
• Scalable Development: a modular design allows a project to be naturally subdivided
along the lines of its modules. A developer (or groups of developers) can be assigned a
module to implement independently which can produce an asynchronous project flow.
• Testable Code Units: when your code is partitioned into functionally-related chunks, it
  facilitates the testing of each module independently. With a proper testing framework,
  developers can exercise each module (and its constituents) without having to bring up
  the entire project.
• Build Robust Systems: in a monolithic software design, as your system grows in
  complexity, so does its propensity to be brittle (changes in one section cause failures in
  another). Modularity lets you build a complex system composed of smaller parts that can
  be independently managed and maintained. A fix in one portion of the code does not
  necessarily affect the entire system.
• Easier Modification & Maintenance: post-production system maintenance is another
  crucial benefit of modular design. Developers have the ability to fix and make non-
  infrastructural changes to a module without affecting other modules. The updated module
  can independently go through the build and release cycle without the need to re-build
  and redeploy the entire system.
• Functionally Scalable: depending on the level of sophistication of your modular design,
it's possible to introduce new functionalities with little or no change to existing
modules. This allows your software system to scale in functionality without becoming
brittle and a burden on developers.
3.17 Approaches to Writing Modular Programs
The three basic approaches to designing a modular program are:
• Process-oriented design
This approach places the emphasis on the process with the objective being to design
modules that have high cohesion and low coupling. (Data flow analysis and data flow
diagrams are often used.)
• Data-oriented design
In this approach the data comes first: the structure of the data is determined first, and the
procedures are then designed to fit that structure.
• Object-oriented design
In this approach, the objective is to first identify the objects and then build the product
around them. In essence, this technique is both data- and process-oriented.
3.18 Criteria for using Modular Design
• Modular decomposability – If the design method provides a systematic means for
breaking a problem into sub-problems, it will reduce the complexity of the overall
problem, thereby achieving a modular solution.
• Modular composability - If the design method enables existing (reusable) design
components to be assembled into a new system, it will yield a modular solution that
does not reinvent the wheel.
• Modular understandability – If a module can be understood as a standalone unit
(without reference to other modules) it will be easier to build and easier to change.
• Modular continuity – If small changes to the system requirements result in changes to
individual modules, rather than system-wide changes, the impact of change-induced
side-effects will be minimized.
• Modular protection – If an abnormal condition occurs within a module and its effects
are constrained within that module, then the impact of error-induced side-effects is
minimized.
3.19 Attributes of a Good Module
• Functional independence - modules have high cohesion and low coupling (both are
illustrated in the sketch after this list)
• Cohesion - a qualitative indication of the degree to which a module focuses on just one
thing
• Coupling - a qualitative indication of the degree to which a module is connected to other
modules and to the outside world
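The short Java sketch below (hypothetical classes, invented for illustration) shows both
attributes at once: ReportFormatter is cohesive because it does exactly one thing, and
loosely coupled because it depends only on a one-method interface rather than on any
concrete data source:

import java.util.List;

// The narrow interface is the module's only point of coupling.
interface DataSource {
    List<String> rows();
}

// High cohesion: this class does one thing, turning rows into a report.
final class ReportFormatter {
    private final DataSource source;

    ReportFormatter(DataSource source) {
        this.source = source;
    }

    String format() {
        StringBuilder report = new StringBuilder("REPORT\n");
        for (String row : source.rows()) {
            report.append(" - ").append(row).append('\n');
        }
        return report.toString();
    }
}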
3.20 Steps to Creating Effective Modules
• Evaluate the first iteration of the program structure to reduce coupling and improve
  cohesion. Once the program structure has been developed, modules may be exploded or
  imploded with the aim of improving module independence (a sketch follows this list).
  o An exploded module becomes two or more modules in the final program structure.
  o An imploded module is the result of combining the processing implied by two or
  more modules.
  An exploded module normally results when common processing exists in two or more
  modules and can be redefined as a separate cohesive module. When high coupling is
  expected, modules can sometimes be imploded to reduce the passage of control,
  references to global data and interface complexity.
• Attempt to minimise structures with high fan-out; strive for fan-in as structure depth
increases. The structure shown inside the cloud in Fig. 3 does not make effective use of
factoring.
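A minimal sketch of "exploding" a module, with hypothetical classes invented for
illustration: the discount computation duplicated in two modules is extracted into a
separate cohesive module that both of them call:

// Before exploding, OnlineOrder and StoreOrder each contained their own
// copy of the discount computation. The common processing is extracted
// into its own cohesive module:
final class Discounts {
    static double apply(double price, double rate) {
        return price - price * rate;
    }
}

final class OnlineOrder {
    double total(double price) { return Discounts.apply(price, 0.10); }
}

final class StoreOrder {
    double total(double price) { return Discounts.apply(price, 0.05); }
}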
Example 1 – Computing Sales Tax: Pseudo-code the task of computing the final price of an
item after figuring in sales tax.
1. get price of item
2. get sales tax rate
3. sales tax = price of item times sales tax rate
4. final price = price of item plus sales tax
5. display final price
6. halt
Variables: price of item, sales tax rate, sales tax, final price
Note that the operations are numbered and each operation is unambiguous and effectively
computable. We also extract and list all variables used in our pseudo-code. This will be
useful when translating the pseudo-code into a programming language.
Example 2 – Computing Weekly Wages: Gross pay depends on the pay rate and the
number of hours worked per week. However, if you work more than 50 hours, you get paid
time-and-a-half for all hours worked over 50. Pseudo-code the task of computing gross pay
given pay rate and hours worked.
1. get hours worked
2. get pay rate
3. if hours worked ≤ 50 then
3.1 gross pay = pay rate times hours worked
4. else
4.1 gross pay = pay rate times 50 plus 1.5 times pay rate times (hours worked
minus 50)
5. display gross pay
6. halt
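A direct translation of this pseudo-code into Java might look as follows (a sketch; the
class name and prompts are ours, for illustration):

import java.util.Scanner;

public class GrossPay {
    public static void main(String[] args) {
        Scanner in = new Scanner(System.in);
        System.out.print("Hours worked: ");
        double hoursWorked = in.nextDouble();         // step 1
        System.out.print("Pay rate: ");
        double payRate = in.nextDouble();             // step 2
        double grossPay;
        if (hoursWorked <= 50) {                      // step 3
            grossPay = payRate * hoursWorked;         // step 3.1
        } else {                                      // step 4
            grossPay = payRate * 50
                     + 1.5 * payRate * (hoursWorked - 50); // step 4.1
        }
        System.out.println("Gross pay: " + grossPay); // step 5
    }                                                 // step 6: halt
}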
Detail diagram — A low-level IPO chart that shows how specific input and output data
elements or data structures are linked to specific processes.
Hierarchy chart — A diagram that graphically represents a program’s control structure.
HIPO (Hierarchy plus Input-Process-Output) — A tool for planning and/or documenting a
computer program that utilizes a hierarchy chart to graphically represent the program’s
control structure and a set of IPO (Input-Process-Output) charts to describe the inputs to,
the outputs from, and the functions performed by each module on the hierarchy chart.
IPO (Input-Process-Output) chart — A chart that describes or documents the inputs to, the
outputs from, and the functions (or processes) performed by a program module.
Overview diagram — A high-level IPO chart that summarizes the inputs to, processes or
tasks performed by, and outputs from a module.
Visual Table of Contents (VTOC) — A more formal name for a hierarchy chart.
3.15 Software
In the 1970s and early 1980s, HIPO documentation was typically prepared by hand using a
template. Some CASE products and charting programs include HIPO support, and some
forms-generation programs can be used to generate HIPO forms. The examples in this unit
were prepared using Visio.
Activity J: Discuss the historical development of CASE tools.
4.0 Conclusion
Programming tools are very important for effective program design.
5.0 Summary
In this unit, you have learnt that:
• Programming environments give the basic tools and Application Programming
Interfaces (APIs) necessary to construct programs. • Using the HIPO technique,
designers can evaluate and refine a program’s design, and correct flaws prior to
implementation. • CASE tools are a class of software that automates many of the activities
involved in various life cycle phases.
Unit 4 Compatibility
1.0 Introduction
In the last unit, we considered Software Quality Assurance (SQA). We saw that the essence
of Software Quality Assurance is to ensure that the software development and control
processes described in the project's Management Plan are correctly carried out and that the
project's procedures and standards are followed at the testing phase of software
development. In this unit, we shall look at compatibility testing. After studying the unit,
you are expected to have achieved the objectives listed below.
2.0 Objectives By the end of this unit, you should be able to: • Define Compatibility
Testing • Explain Usefulness of Compatibility Testing.
3.0 What is Compatibility Testing? Software testing comes in different types.
Compatibility testing is one of the several types of software testing which can be carried
out on a system that is developed based on certain criteria and which has to perform
definite functionality in an already existing setup/environment. Many things depend on the
compatibility of the system/application being developed with, for example, other
systems/applications, operating systems and networks; these include the use of the
system/application in that environment, the demand for the system/application, and so on.
On many occasions, the reason why users prefer not to adopt an application/system is
connected with the non-compatibility of that application/system with some other
system/application, network, hardware or OS they are already using. This explains why the
efforts of developers may appear to be in vain. Compatibility testing can also be used to
certify the compatibility of the system/application/website built with various other objects
such as web browsers, hardware platforms, users, operating systems etc. It helps to find out
how well a system performs in a particular environment (hardware, network, operating
system etc.). Compatibility testing can be performed manually or with automation tools.
3.1 Compatibility Testing Computing Environment
A computing environment that will require compatibility testing may include some or all
of the elements mentioned below:
• Computing capacity of the hardware platform (IBM 360, HP 9000, etc.) • Bandwidth
handling capacity of networking hardware • Compatibility of peripherals (printer, DVD
drive, etc.) • Operating systems (MVS, UNIX, Windows, etc.) • Database (Oracle, Sybase,
DB2, etc.) • Other system software (web server, networking/messaging tool, etc.) •
Browser compatibility (Firefox, Netscape, Internet Explorer, Safari, etc.)
Browser compatibility testing, which can also be referred to as user experience testing,
requires that web applications be tested on different web browsers to ensure the following
(a test-automation sketch follows this list):
• Users have the same visual experience irrespective of the browsers through which they
view the web application. • In terms of functionality, the application must behave and
respond the same way across different browsers.
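As a sketch of how such cross-browser checks can be automated, the Java snippet below
assumes the Selenium WebDriver library and locally installed browser drivers; the URL is
a placeholder:

import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

public class BrowserCompatCheck {
    public static void main(String[] args) {
        // Run the same check against two browsers and compare what they report.
        WebDriver[] browsers = { new ChromeDriver(), new FirefoxDriver() };
        for (WebDriver browser : browsers) {
            try {
                browser.get("https://example.com/"); // placeholder URL
                System.out.println(browser.getClass().getSimpleName()
                        + " reports page title: " + browser.getTitle());
            } finally {
                browser.quit(); // always release the browser session
            }
        }
    }
}

A fuller suite would assert on rendered layout and behaviour rather than just the title, but
the structure, namely one identical check repeated per browser, stays the same.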
Compatibility between versions: This has to do with testing the performance of a
system/application in relation to its own predecessor/successor versions. This is
sometimes referred to as backward and forward compatibility. For example, Windows 98
was developed with backward compatibility for Windows 95.
Software compatibility testing: This is the evaluation of the performance of a
system/application in connection with other software, for example compatibility with
networking tools, web servers, messaging tools etc.
Operating System compatibility testing: This is the evaluation of the performance of
system/application in connection with the underlying operating system on which it will be
used.
Database compatibility testing: Many applications/systems operate on databases. Database
compatibility testing is used to evaluate an application/system’s performance in connection
with the database it will interact with.
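A minimal sketch of a database compatibility probe in Java, assuming standard JDBC and
a driver on the classpath; the connection URL and credentials are placeholders:

import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;

public class DbCompatCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder URL/credentials; any JDBC-compliant database works here.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost/appdb", "app_user", "secret")) {
            DatabaseMetaData meta = conn.getMetaData();
            System.out.println("Product: " + meta.getDatabaseProductName());
            System.out.println("Version: " + meta.getDatabaseProductVersion());
            // A full compatibility suite would now run the application's
            // queries here and compare behaviour across database vendors.
        }
    }
}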
3.3 Usefulness of Compatibility Testing
Compatibility testing can help developers understand the yardsticks that their
system/application needs to reach and fulfil so as to gain acceptance by intended users who
are already using some OS, network, software and hardware. It also helps the users to find
out which system will fit better into the existing setup they are using.
3.4 Certification Testing
Certification testing falls within the range of compatibility testing. Product vendors run the
complete suite of tests on the newer computing environment to get their application
certified for a specific operating system or database.
Activity K: What is browser compatibility testing?
4.0 Conclusion
Compatibility testing is highly beneficial to software development. It can help developers
understand the criteria that their system/application needs to attain and fulfil in order to be
accepted by intended users who are already using some OS, network, software and
hardware. It also helps the users to find out which system will fit better into the existing
setup they are using.
5.0 Summary
In this unit, we have learnt that:
• Compatibility testing is one of the several types of software testing performed on a
system that is built based on certain criteria and which has to perform specific functionality
in an already existing setup/environment. • Compatibility testing can be automated using
automation tools or can be performed manually, and is a part of non-functional software
testing. • The computing environment may contain some or all of the below mentioned
elements: o Computing capacity of the hardware platform (IBM 360, HP 9000, etc.) o
Bandwidth handling capacity of networking hardware o Compatibility of peripherals
(printer, DVD drive, etc.) o Operating systems (MVS, UNIX, Windows, etc.) o Database
(Oracle, Sybase, DB2, etc.) o Other system software (web server, networking/messaging
tool, etc.) o Browser compatibility (Firefox, Netscape, Internet Explorer, Safari, etc.) • The
most important use of compatibility testing is to ensure a system's performance in the
computing environment in which it is supposed to operate. This helps in figuring out the
necessary changes/modifications/additions required to make the system/application
compatible with that computing environment.