Soft 3
SOFTWARE ENGINEERING
CONCEPTS & TOOLS
In computer science, coupling (or dependency) is the degree to which each program module
relies on each of the other modules.
Coupling is usually contrasted with cohesion. Low coupling often correlates with high
cohesion, and vice versa. The software quality metrics of coupling and cohesion were
invented by Larry Constantine, an original developer of Structured Design who was also an
early proponent of these concepts (see also SSADM). Low coupling is often a sign of a well-
structured computer system and a good design, and when combined with high cohesion,
supports the general goals of high readability and maintainability.
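The contrast between tight and loose coupling can be sketched in code. The classes and names below are invented for illustration: the tightly coupled report depends on another module's internal representation, while the loosely coupled one depends only on a public interface.

```python
# Hypothetical illustration of tight vs. loose coupling (names invented).

class Order:
    def __init__(self):
        self._items = []  # internal representation, free to change

    def add_item(self, name, price):
        self._items.append((name, price))

    def total(self):
        # public interface: callers need not know how items are stored
        return sum(price for _, price in self._items)

class TightReport:
    def render(self, order):
        # tight coupling: depends on the private attribute and tuple layout,
        # so any change to Order's internals breaks this class
        return "Total: %.2f" % sum(p for _, p in order._items)

class LooseReport:
    def render(self, order):
        # loose coupling: depends only on Order's public interface
        return "Total: %.2f" % order.total()

order = Order()
order.add_item("book", 12.50)
order.add_item("pen", 1.50)
print(LooseReport().render(order))  # Total: 14.00
```

If `Order` later stores items as dictionaries, `LooseReport` keeps working while `TightReport` breaks, which is the maintainability benefit of low coupling described above.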
Types of coupling
Coupling can be "low" (also "loose" and "weak") or "high" (also "tight" and "strong"). Some
types of coupling, in order of highest to lowest coupling, are as follows:
Module coupling
Software engineering offers quantitative metrics for this concept; one widely cited
measure is Dhama's module coupling indicator, m_c = k / M, where k is a
proportionality constant and M is the sum of the module's input and output data
parameters, its input and output control parameters (each weighted by 2), the
global variables it uses as data and as control (the latter weighted by 2), and its
fan-out and fan-in. For example, if a module has only a single input and a single
output data parameter, M is small and m_c is relatively large, indicating low coupling.
If a module has 5 input and output data parameters, an equal number of control parameters,
and accesses 10 items of global data, with a fan-in of 3 and a fan-out of 4,
M is much larger and m_c is correspondingly small, indicating high coupling.
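The metric alluded to above appears to be Dhama's module coupling indicator; assuming that reading, a minimal computation follows (parameter names use the conventional d_i, c_i, d_o, c_o, g_d, g_c, w, r notation):

```python
def module_coupling(di, ci, do, co, gd, gc, w, r, k=1.0):
    """Dhama's module coupling indicator m_c = k / M.

    di/do: input/output data parameters; ci/co: input/output control
    parameters; gd/gc: global variables used as data/as control;
    w: fan-out; r: fan-in.  A larger m_c indicates lower coupling.
    """
    M = di + 2 * ci + do + 2 * co + gd + 2 * gc + w + r
    return k / M

# A module with one input and one output data parameter and a fan-out of 1:
print(round(module_coupling(1, 0, 1, 0, 0, 0, 1, 0), 2))   # 0.33

# The example from the text, read as 5 data parameters and 5 control
# parameters in each direction, 10 globals, fan-out 4, fan-in 3:
print(round(module_coupling(5, 5, 5, 5, 10, 0, 4, 3), 3))  # 0.021
```

The drop from 0.33 to 0.021 quantifies how the second module is far more tightly coupled than the first.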
(2)
ADVANTAGES
The network model can handle both one-to-many and many-to-many relationships.
Data Integrity
In a network model, no member record can exist without an owner. A user must therefore
first define the owner record and then the member record, which ensures data integrity.
Data Independence
The network model draws a clear line of demarcation between programs and the
complex physical storage details. The application programs work independently of
the data, so changes made to the data characteristics do not affect the application
programs.
DISADVANTAGES
System complexity
In a network model, data are accessed one record at a time. This makes it essential
for the database designers, administrators, and programmers to be familiar with the
internal data structures in order to gain access to the data. Therefore, a user-friendly
database management system cannot be created using the network model.
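The owner/member rule described under Data Integrity can be sketched in a few lines. The record types below are invented for illustration; the point is simply that a member cannot be created without an existing owner.

```python
# Minimal sketch of the network model's owner/member integrity rule
# (record types invented for illustration).

class OwnerRecord:
    def __init__(self, name):
        self.name = name
        self.members = []  # the "set" linking this owner to its members

class MemberRecord:
    def __init__(self, name, owner):
        # integrity rule: no member may exist without an owner
        if owner is None:
            raise ValueError("a member record must have an owner")
        self.name = name
        self.owner = owner
        owner.members.append(self)

dept = OwnerRecord("Sales")          # owner defined first
emp = MemberRecord("Alice", dept)    # then the member
print(emp.owner.name, len(dept.members))  # Sales 1
```

Attempting `MemberRecord("Bob", None)` raises an error, mirroring how the network model enforces integrity by construction.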
(3)
The traditional object model is insufficient in the context of
real-time systems. Here a completely new aspect has to be added to the object
concept, namely time. It has to be investigated how to annotate the functional
specification of types with timing constraints and how to guarantee and implement
these timing specifications. Other concepts that were already included
in the traditional object model also have to be reviewed in the context of a real-time
system, owing to the difficulty of obtaining deterministic timing behavior. These
concepts include inheritance, dynamic binding, dynamic memory allocation,
concurrency, and synchronization. A lot of research has been undertaken in order
to resolve the inherent contradiction between object-orientation and real-time. In
the following subsections several different approaches to real-time objects will
be reviewed.
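One way to picture "annotating the functional specification of types with timing constraints" is a per-operation deadline annotation. The decorator name, deadline value, and sensor class below are all invented; note that this sketch only *detects* a missed soft deadline after the fact, whereas a true real-time system must *guarantee* the deadline, which plain Python cannot do.

```python
# Hypothetical sketch: annotating an operation with a (soft) timing
# constraint.  Names and the deadline value are invented for illustration.

import time
from functools import wraps

def deadline(ms):
    """Mark a method with a soft deadline of `ms` milliseconds."""
    def decorate(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            result = fn(*args, **kwargs)
            elapsed_ms = (time.perf_counter() - start) * 1000
            if elapsed_ms > ms:
                # a real-time kernel would have to prevent this, not just log it
                print(f"deadline miss: {fn.__name__} took {elapsed_ms:.1f} ms")
            return result
        return wrapper
    return decorate

class Sensor:
    @deadline(ms=5)
    def sample(self):
        return 42  # placeholder reading

print(Sensor().sample())  # 42
```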
(4). A consistent user interface may be impossible to produce for complex systems
with a large number of interface options. In such systems, there is a wide imbalance
in how heavily different commands are used, so for frequently used
commands it is desirable to have shortcuts. Unless all commands have shortcuts,
consistency is impossible.
It may also be the case in complex systems that the entities manipulated are of quite
different types and it is inappropriate to have consistent operations on each of these
types.
(5).
Test in the small: a test that checks a single function or class (Unit test)
Test in the large: a test that checks a group of classes, such as
o Module test (a single module)
o Integration test (more than one module)
o System test (the entire system)
Acceptance test: a formal test defined to check the acceptance criteria for a software product
o Functional test
o Non-functional test (performance, stress test)
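The first level above, a "test in the small", can be sketched as a unit test for a single function using Python's standard unittest module. The function and its test cases are invented for illustration.

```python
# A "test in the small": a unit test that checks a single function
# (function and cases invented for illustration).

import unittest

def classify_triangle(a, b, c):
    """Classify a triangle by its three side lengths."""
    if a + b <= c or a + c <= b or b + c <= a:
        return "not a triangle"
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"

class TestClassifyTriangle(unittest.TestCase):
    def test_equilateral(self):
        self.assertEqual(classify_triangle(3, 3, 3), "equilateral")

    def test_invalid(self):
        # degenerate sides must be rejected
        self.assertEqual(classify_triangle(1, 2, 3), "not a triangle")

if __name__ == "__main__":
    unittest.main()
```

A module test would exercise several such functions together, and an integration test would combine the modules, following the levels listed above.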
Software verification is often confused with software validation. The difference between
verification and validation:
Software verification asks the question, "Are we building the product right?"; that is,
does the software conform to its specification?
Software validation asks the question, "Are we building the right product?"; that is, is
the software doing what the user really requires?
The aim of software verification is to find the errors introduced by an activity, i.e. to
check that the product of the activity is as correct as its input was at the start of the activity.
Static verification is the process of checking that software meets its requirements by
inspecting it without executing it, for example through reviews, walkthroughs, and
inspections of documents and code.
It is sometimes said that validation can be expressed by the query "Are you building
the right thing?" and verification by "Are you building it right?" "Building the right
thing" refers back to the user's needs, while "building it right" checks that the
specifications are correctly implemented by the system. In some contexts, it is required
to have written requirements for both as well as formal procedures or protocols for
determining compliance.
Verification:
1. It is a quality improvement process.
2. It involves reviewing and evaluating the process.
3. It is conducted by the QA team.
4. Verification is about correctness.
5. Are we producing the product right?
Validation:
1. It ensures the functionality of the product.
2. It is conducted by the development team with help from the QC team.
3. Validation is about truth.
4. Validation follows verification.
5. Are we producing the right product?
(6).
In all models of the software development cycle, the software specification phase is
always followed by the design phase. This is true no matter whether a model
encompasses a full range of activities, as described in IEEE Std. 1074, or a limited
subset composed of the four primary phases, specification, design,
coding, and testing, as in our case. Once the requirements have been frozen, which
means that no more significant changes to the requirements document can be
expected, the developers should be ready to start the design process. At this stage,
one has to organize the design activities in some systematic manner by choosing an
appropriate notation for this phase, techniques to derive a specific representation of
software from a more general one, and automatic software tools to assist in the
derivation process. A complete set of these three elements (a method, techniques,
and tools) is called a software design (or development) methodology.
Simplifying things a little, one can say that the design activities usually concentrate
on two aspects:
- architectural design, which for the most part describes the structure of software
- detailed design, which provides the insight into the functioning of the structural
elements of software developed at the architectural level.
There are two primary approaches to developing the software architecture: the structured
approach (also called the functional approach) and the object-oriented design approach.
The difference between the two is significant but, in fact, both approaches have a
common origin. The common roots and drastic differences become clear if we look
at the operation of software from an abstract perspective. Since their inception in the
1940s, computers and their software, no matter which level we consider, microcode,
processor instruction level, operating system, programming language, etc., involve
only two primary entities: data and operations on these data.
The way the structured (functional) approach treats these two entities is completely
opposite to the way they are treated in object-oriented approaches. In structured
approaches, the primary focus is on operations. This means that in the structured
development process, we first determine the operations to be executed and their
order, encapsulate those operations in a form of a module, function, procedure,
subroutine, etc., and then add the data paths in the form of parameters or
arguments, through which data can flow in and out of these design units. This is an
operations-centered approach.
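The operations-centered style just described can be sketched concretely: the operations are designed first, and data enters and leaves each design unit only through explicit parameters. The function names and the moving-average computation are invented for illustration.

```python
# Sketch of the operations-centered (structured) style: operations first,
# data flowing through them as parameters (names invented for illustration).

def read_readings(raw):
    # operation 1: parse the input data
    return [float(x) for x in raw.split(",")]

def smooth(readings):
    # operation 2: pairwise moving average over consecutive readings
    return [(a + b) / 2 for a, b in zip(readings, readings[1:])]

def report(values):
    # operation 3: format the result for output
    return ", ".join(f"{v:.1f}" for v in values)

# The data paths are the parameters and return values of each unit:
print(report(smooth(read_readings("1, 2, 4, 8"))))  # 1.5, 3.0, 6.0
```

An object-oriented design would invert this, grouping the readings and the operations on them into a single type.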
None of these methods, however, gained widespread popularity, because they lacked
well-defined transformation techniques and, in particular, automatic
software tools to help in the development process. In addition, even though
these notations capture quite adequately the various intricacies of real-time software,
they are not suitable for expressing timing requirements.