Unit No-3
Requirement Engineering
The objective of the feasibility study is to establish the reasons for developing software that is acceptable to users, flexible to change, and conformable to established standards.
Types of Feasibility:
1. Technical Feasibility - Technical feasibility evaluates whether the current technologies needed to accomplish the customer requirements are available within the time and budget.
2. Operational Feasibility - Operational feasibility assesses the extent to which the required software will perform a series of steps to solve business problems and satisfy customer requirements.
3. Economic Feasibility - Economic feasibility decides whether the necessary software can generate financial profits for an organization.
Requirement elicitation is also known as the gathering of requirements. Here, requirements are identified with the help of customers and of existing system processes, if available.
Analysis of requirements starts with requirement elicitation. The requirements are analyzed to identify inconsistencies, defects, omissions, etc. We describe requirements in terms of relationships and also resolve conflicts, if any.
The models used at this stage include ER diagrams, data flow diagrams (DFDs), function
decomposition diagrams (FDDs), data dictionaries, etc.
Data Flow Diagrams: Data Flow Diagrams (DFDs) are used widely for modeling the
requirements. A DFD shows the flow of data through a system. The system may be a
company, an organization, a set of procedures, a computer hardware system, a software
system, or any combination of the preceding. The DFD is also known as a data flow
graph or bubble chart.
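As a rough illustration (the entity, process, and flow names below are hypothetical, and Python is used only as a convenient notation), the elements of a small order-handling DFD can be written down as plain data before the bubble chart is drawn:

# Illustrative sketch only: nodes and data flows of a hypothetical order-handling DFD.
external_entities = ["Customer"]                    # sources/sinks of data outside the system
processes = ["Validate Order", "Fulfil Order"]      # the "bubbles" that transform data
data_stores = ["Orders"]                            # places where data is held

# Each data flow is an arrow: (source, data item, destination).
data_flows = [
    ("Customer", "order details", "Validate Order"),
    ("Validate Order", "valid order", "Fulfil Order"),
    ("Fulfil Order", "order record", "Orders"),
    ("Fulfil Order", "delivery note", "Customer"),
]

for source, data, destination in data_flows:
    print(f"{source} --[{data}]--> {destination}")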
Data Dictionaries: Data Dictionaries are simply repositories to store information about all
data items defined in DFDs. At the requirements stage, the data dictionary should at least define the customer's data items, to ensure that the customer and the developers use the same definitions and terminology.
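Continuing the hypothetical order example, a minimal sketch of two data dictionary entries might look as follows (names and compositions are illustrative, not taken from any real project):

# Illustrative data dictionary entries: each data item records its composition and meaning.
data_dictionary = {
    "order details": {
        "composition": "customer_name + delivery_address + 1{order_line}10",
        "description": "What the customer submits when placing an order",
    },
    "order_line": {
        "composition": "item_code + quantity",
        "description": "One requested item and the number of units wanted",
    },
}

for name, entry in data_dictionary.items():
    print(f"{name} = {entry['composition']}  ({entry['description']})")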
Entity-Relationship Diagrams: Another tool for requirement specification is the entity-
relationship diagram, often called an "E-R diagram." It is a detailed logical representation
of the data for the organization and uses three main constructs i.e. data entities,
relationships, and their associated attributes.
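As a sketch of the three constructs (again with hypothetical names), an entity-relationship fragment for customers and their orders could be recorded like this:

from dataclasses import dataclass, field

# Illustrative sketch of the three E-R constructs: entities, attributes, relationships.
@dataclass
class Entity:
    name: str
    attributes: list[str] = field(default_factory=list)

@dataclass
class Relationship:
    name: str
    from_entity: str
    to_entity: str
    cardinality: str  # e.g. "1:N" means one customer places many orders

customer = Entity("Customer", ["customer_id", "name", "address"])
order = Entity("Order", ["order_id", "order_date", "total_amount"])
places = Relationship("places", customer.name, order.name, "1:N")

print(f"{places.from_entity} --{places.name} ({places.cardinality})--> {places.to_entity}")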
After the requirement specifications are developed, the requirements discussed in this document are validated. The user might demand an illegal or impossible solution, or experts may misinterpret the requirements. Requirements can be checked against the following conditions –
If they can be practically implemented
If they are valid and as per the functionality and domain of the software
If there are any ambiguities
If they are complete
If they can be demonstrated
Document Analysis
Reviewing the documentation of an existing system can help when creating an AS-IS process document, as well as when driving gap analysis for scoping migration projects. In an ideal
world, we would even be reviewing the requirements that drove creation of the existing
system – a starting point for documenting current requirements. Nuggets of information are
often buried in existing documents that help us ask questions as part of validating
requirement completeness.
Focus Group
A focus group is a gathering of people who are representative of the users or customers of a
product to get feedback. The feedback can be gathered about needs/opportunities/ problems
to identify requirements, or can be gathered to validate and refine already elicited
requirements. This form of market research is distinct from brainstorming in that it is a
managed process with specific participants.
Interface analysis
Interfaces for a software product can be human or machine. Integration with external
systems and devices is just another interface. User-centric design approaches are very effective at making sure that we create usable software. Interface analysis, reviewing the touch points with external systems, is important to make sure we don't overlook requirements that aren't immediately visible to users.
Interview
Interviews of stakeholders and users are critical to creating great software. Without understanding the goals and expectations of the users and stakeholders, we are very unlikely to satisfy them. We also have to recognize the perspective of each interviewee, so that we can properly weigh and address their inputs. Listening is the skill that helps a great analyst get more value from an interview than an average analyst would.
Observation
By observing users, an analyst can identify a process flow, steps, pain points and
opportunities for improvement. Observations can be passive or active (asking questions
while observing). Passive observation is better for getting feedback on a prototype (to refine requirements), whereas active observation is more effective at getting an understanding of an existing business process. Either approach can be used.
Prototyping
Prototyping is a relatively modern technique for gathering requirements. In this approach,
you gather preliminary requirements that you use to build an initial version of the solution -
a prototype. You show this to the client, who then gives you additional requirements. You
change the application and cycle around with the client again. This repetitive process continues until the product meets a critical mass of business needs or an agreed number of iterations has been reached.
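The cycle described above can be summarised in a short sketch; the functions below are hypothetical stand-ins for real development work and client demonstrations, not an actual API:

# Illustrative sketch of the prototyping cycle: build, show to the client, gather new
# requirements, repeat until nothing new emerges or the iteration budget runs out.
def build_prototype(requirements):
    # Stand-in for building a working version that covers the current requirements.
    return f"prototype covering {len(requirements)} requirements"

def client_review(prototype, iteration):
    # Stand-in for a client demo; returns newly discovered requirements (none after demo 2 here).
    return [] if iteration >= 2 else [f"extra requirement from demo {iteration + 1}"]

def elicit_by_prototyping(preliminary_requirements, max_iterations=5):
    requirements = list(preliminary_requirements)
    for iteration in range(max_iterations):
        prototype = build_prototype(requirements)
        new_requirements = client_review(prototype, iteration)
        if not new_requirements:   # business needs met, or nothing new came up
            break
        requirements.extend(new_requirements)
    return requirements

print(elicit_by_prototyping(["record orders", "print invoices"]))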
Requirement Workshops
Workshops can be very effective for gathering requirements. More structured than a brainstorming session, a workshop brings the involved parties together to collaborate on documenting requirements. One way to capture the collaboration is the creation of domain-model artifacts (like static diagrams and activity diagrams). A workshop will be more effective with two analysts than with one.
Reverse Engineering
When a migration project does not have access to sufficient documentation of the existing
system, reverse engineering will identify what the system does. It will not identify what the
system should do, and will not identify when the system does the wrong thing.
Survey/Questionnaire
When collecting information from many people – too many to interview given budget and time constraints – a survey or questionnaire can be used. The survey can force users to select
from choices, rate something (“Agree Strongly, agree…”), or have open ended questions
allowing free-form responses. Survey design is hard – questions can bias the respondents.
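Purely as an illustration of the three question styles mentioned above (the wording is hypothetical), a small questionnaire could be laid out as plain data:

# Illustrative sketch: forced-choice, rating, and open-ended questions as plain data.
survey = [
    {"type": "choice", "question": "Which report format do you use most?",
     "options": ["Tabular", "Textual", "Chart"]},
    {"type": "rating", "question": "The current system is easy to use.",
     "scale": ["Agree strongly", "Agree", "Neutral", "Disagree", "Disagree strongly"]},
    {"type": "open", "question": "What would you change about the ordering process?"},
]

for q in survey:
    print(f"[{q['type']}] {q['question']}")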
2. Completeness: The SRS is complete if, and only if, it includes the following elements:
(1)All essential requirements, whether relating to functionality, performance, design,
constraints, attributes, or external interfaces.
(2) Definition of the responses of the software to all realizable classes of input data in all
available categories of situations.
(3) Full labels and references to all figures, tables, and diagrams in the SRS and definitions
of all terms and units of measure.
3. Consistency: The SRS is consistent if, and only if, no subset of the individual requirements described in it conflicts. There are three types of possible conflict in the SRS:
(1) The specified characteristics of real-world objects may conflict. For example,
(a) The format of an output report may be described in one requirement as tabular but in
another as textual.
(b) One condition may state that all lights shall be green while another states that all lights
shall be blue.
(2) There may be a logical or temporal conflict between two specified actions.
For example,
(a) One requirement may specify that the program will add two inputs, while another may specify that the program will multiply them.
(b) One condition may state that "A" must always follow "B," while another requires that "A" and "B" occur simultaneously.
(3) Two or more requirements may define the same real-world object but use different terms for that object. For example, a program's request for user input may be called a "prompt" in one requirement and a "cue" in another. The use of standard terminology and
descriptions promotes consistency.
4. Unambiguousness: The SRS is unambiguous when every stated requirement has only one interpretation. This suggests that each element is uniquely interpreted. In case a term is used with multiple meanings, the requirements document should explain its intended meaning in the SRS so that it is clear and simple to understand.
5. Ranking for importance and stability: The SRS is ranked for importance and stability if
each requirement in it has an identifier to indicate either the significance or stability of that
particular requirement.
Typically, all requirements are not equally important. Some requirements may be essential, especially for life-critical applications, while others may merely be desirable. Each requirement should be identified to make these differences clear and explicit. Another way to rank requirements is to distinguish classes of requirements as essential, conditional, and optional.
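A minimal sketch of such a ranking, assuming a simple identifier scheme and the essential/conditional/optional classes mentioned above (the requirement texts and IDs are hypothetical):

# Illustrative sketch: every requirement carries an identifier and an importance class.
requirements = [
    {"id": "REQ-001", "text": "The system shall record every completed order.", "importance": "essential"},
    {"id": "REQ-014", "text": "The system shall export reports as PDF.", "importance": "conditional"},
    {"id": "REQ-027", "text": "The system may offer a dark colour theme.", "importance": "optional"},
]

essential_ids = [r["id"] for r in requirements if r["importance"] == "essential"]
print("Essential requirements:", essential_ids)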
7. Verifiability: The SRS is verifiable if the specified requirements can be checked with a cost-effective process to determine whether the final software meets those requirements. The requirements are verified with the help of reviews.
8. Traceability: The SRS is traceable if the origin of each of the requirements is clear and if
it facilitates the referencing of each requirement in future development or enhancement
documentation.
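As an illustration only, with hypothetical identifiers, a simple traceability table could link each requirement to its origin and to the design and test artifacts that realise and verify it:

# Illustrative traceability sketch: requirement -> origin, design items, test cases.
traceability = {
    "REQ-001": {"origin": "customer interview notes",
                "design": ["DES-007"], "tests": ["TC-101", "TC-102"]},
    "REQ-014": {"origin": "document analysis of the existing system",
                "design": ["DES-012"], "tests": []},
}

# Gaps become easy to spot, e.g. requirements that no test case covers yet.
untested = [req_id for req_id, links in traceability.items() if not links["tests"]]
print("Requirements without test cases:", untested)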
10. Testability: An SRS should be written in such a way that it is simple to generate test cases and test plans from the report.
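For instance, a precisely worded requirement such as "an order total shall equal the sum of its line amounts" (a hypothetical example, not from any real SRS) translates almost directly into a test case:

import unittest

# Hypothetical function standing in for the behaviour the requirement describes.
def order_total(line_amounts):
    return sum(line_amounts)

class TestOrderTotal(unittest.TestCase):
    def test_total_equals_sum_of_line_amounts(self):
        # Derived straight from the stated requirement.
        self.assertEqual(order_total([10.0, 2.5, 7.5]), 20.0)

if __name__ == "__main__":
    unittest.main()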
11. Understandable by the customer: An end user may be an expert in his/her specific domain but might not be trained in computer science. Hence, the use of formal notations and symbols should be avoided as far as possible. The language should be kept simple and clear.
12. The right level of abstraction: If the SRS is written for the requirements stage, the details should be explained explicitly. Whereas, for a feasibility study, less detail can be used. Hence, the level of abstraction varies according to the purpose of the SRS.
Black-box view: It should only define what the system should do and refrain from stating how to do it. This means that the SRS document should define the external behavior of
the system and not discuss the implementation issues. The SRS report should view the
system to be developed as a black box and should define the externally visible behavior of
the system. For this reason, the SRS report is also known as the blackbox specification of a
system.
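A minimal sketch of the distinction, using a hypothetical discount rule: the SRS would state only the externally visible behaviour captured in the comment, while the body is a design/implementation choice left out of the SRS:

# The requirement (black-box, externally visible behaviour):
#   "Given an order value and a customer category ('regular' or 'premium'),
#    the system shall grant a 10% discount to premium customers and none otherwise."
def compute_discount(order_value: float, customer_category: str) -> float:
    # How the discount is computed is an implementation detail, not part of the SRS.
    return order_value * 0.10 if customer_category == "premium" else 0.0

print(compute_discount(200.0, "premium"))   # 20.0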
Conceptual integrity: It should show conceptual integrity so that the reader can easily understand it.
Response to undesired events: It should characterize acceptable responses to undesired events. These are called the system responses to exceptional conditions.
Verifiable: All requirements of the system, as documented in the SRS document, should be
correct. This means that it should be possible to decide whether or not requirements have
been met in an implementation.