MIS & DSS - Solutions To Previous Year Questions
f) Define DFD.
DFD stands for Data Flow Diagram. It is a visual representation of how data moves
within a system, depicting the flow of information and the processes that
transform or manipulate that data. DFDs are commonly used in system analysis
and design to understand, document, and communicate the processes and data
involved in a particular system or business process.
g) What is Extranet?
An extranet is a private computer network that allows controlled access by
external parties to an organization's internal information or resources. It extends
the concept of an intranet beyond an organization's boundaries, providing a
secure and restricted connection between an organization and its external
stakeholders, such as customers, suppliers, partners, or other authorized third
parties.
i) What is Telnet?
Telnet is a network protocol used to provide a command-line interface to
communicate with devices over a Transmission Control Protocol (TCP) network,
such as the Internet. It enables users to log into remote computers or devices,
access their command-line interfaces, and execute commands as if they were
directly connected to the device. Telnet operates on the application layer of the
OSI (Open Systems Interconnection) model.
Group – B
(Each question carries 8 marks)
(Answer all questions)
The decision-making process is interconnected across these levels, with strategic decisions
providing the overarching framework for tactical and operational decisions. Information flows
from operational levels to tactical levels, and then to strategic levels, while decisions and
directives move in the opposite direction.
Effective communication and coordination between these levels are crucial for the successful
implementation of organizational goals. Each level of decision-making plays a vital role in
contributing to the overall success and adaptability of the organization in a dynamic business
environment.
OR
Information Sharing: GDSS enables the sharing of relevant data, documents, and
other information among group members. This ensures that all participants have
access to the same information, promoting a shared understanding of the decision
context.
Decision Modeling and Analysis: GDSS includes tools for modeling and analyzing
decision scenarios. It may offer simulations, modeling techniques, and analytical
tools to help the group evaluate various alternatives and their potential outcomes.
Voting and Consensus Building: GDSS often includes features for voting on
alternatives and building consensus within the group. This helps streamline the
decision-making process and identify preferences among group members.
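The voting feature can be illustrated with a minimal sketch, assuming each participant casts a single vote for one alternative (the participant names and options below are hypothetical):

```python
from collections import Counter

def tally_votes(votes):
    """Count one vote per participant and rank alternatives by support."""
    counts = Counter(votes.values())
    return counts.most_common()   # [(alternative, votes), ...] best first

# Hypothetical group session: each member votes for one alternative.
votes = {"Asha": "Option B", "Ben": "Option A", "Chen": "Option B", "Dev": "Option B"}
ranking = tally_votes(votes)
print(ranking[0])   # ('Option B', 3)
```

A real GDSS would layer anonymity, weighting, and multiple voting rounds on top of such a tally.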
Executive Support Systems play a critical role in helping top management navigate the
complexities of the business environment and make decisions that align with the
organization's overarching goals and objectives. They contribute to the strategic success of
the organization by providing timely, relevant, and actionable information to executives.
3. What are the different stages of a system? Explain the RAD model in detail.
The systems development life cycle (SDLC) consists of several stages that guide the process
of developing information systems. Each stage involves specific activities, and the stages are
typically sequential, with outputs from one stage becoming inputs for the next. One of the
models used in the SDLC is the Rapid Application Development (RAD) model. Let's explore
both concepts:
The RAD model is an iterative and incremental approach to software development that
prioritizes rapid prototyping and quick feedback from end-users. It aims to reduce the
development time and accommodate changes efficiently. The RAD model typically consists of
the following phases:
• Requirements Planning:
• Identifying project scope, objectives, and requirements.
• Defining system functionalities and user expectations.
• User Design:
• Involving end-users in the design process.
• Creating rapid prototypes to visualize the system.
• Collecting feedback and refining prototypes.
• Construction:
• Developing the system based on the approved prototypes.
• Using automated tools and techniques to expedite coding.
• Conducting regular testing and refinement throughout the construction phase.
• Cutover:
• Preparing for the deployment of the system.
• Conducting training sessions for end-users and support staff.
• Migrating from the old system to the new one.
• Post-Implementation and Maintenance:
• Monitoring the system's performance in the live environment.
• Collecting user feedback and making necessary adjustments.
• Addressing issues and implementing updates as needed.
The RAD model emphasizes collaboration between developers and end-users, and it allows
for flexibility and adaptability to changing requirements. It is particularly suited for projects
where requirements are not well understood initially, and rapid delivery is a priority. The
iterative nature of RAD ensures that user feedback is incorporated early and often in the
development process.
OR
Explain the context diagram and level-0 diagram. Draw the level-0 diagram
of the Examination System.
Context Diagram:
A context diagram, also known as a level-0 DFD (Data Flow Diagram), is a high-level visual
representation of a system that shows the interactions between the system and its external
entities. It provides a simplified and holistic view of the system's scope, illustrating how data
flows into and out of the system without delving into the internal details of processes.
Key elements of a context diagram:
1. System Boundary: The boundary of the system, represented by a circle or boundary box,
encloses all the system's components and processes.
2. External Entities: External entities are entities outside the system that interact with it.
These can be users, other systems, or external data sources. External entities are typically
represented by squares or rectangles connected to the system boundary.
3. Data Flows: Arrows represent the flow of data between the system and its external
entities. These data flows illustrate the exchange of information between the system and
its environment.
4. Processes: In a context diagram, processes are represented as a single process symbol
within the system boundary. The focus is on the high-level functions of the system, without
detailing internal processes.
Level-0 Diagram:
A level-0 DFD (Data Flow Diagram) is the next level of refinement after the context diagram.
It provides a more detailed view of the system by breaking down the single process from the
context diagram into sub-processes or functions. The level-0 DFD decomposes the system into
major processes, data stores, and the data flows between them.
1. Processes (Circles): The main processes within the system are represented by circles. Each
circle represents a high-level function or sub-process that contributes to the overall system
functionality.
2. Data Flows (Arrows): Data flows illustrate the movement of data between processes, data
stores, and external entities. They represent the paths along which data travels within the
system.
3. Data Stores (Rectangles): Data stores represent where data is stored within the system.
These can be databases, files, or other storage repositories.
4. External Entities (Squares/Rectangles): External entities, similar to the context diagram,
represent entities outside the system that interact with it.
The level-0 DFD serves as a foundation for more detailed DFDs at subsequent levels. It breaks
down the high-level processes identified in the context diagram, providing a more granular
understanding of the system's structure and interactions. Subsequent levels, such as level-1,
level-2, and so on, continue to decompose processes into more detailed subprocesses until a
comprehensive and detailed view of the system is achieved.
Level-0 diagram of Examination System
[Figure: The STUDENT external entity exchanges Registration, Opt for Examination, Result, and Certificate data flows with the ONLINE EXAMINATION SYSTEM process.]
There are several types of networking in an organization, each serving specific purposes.
Here are some common types:
Local Area Network (LAN):
A LAN is a network that is limited to a small geographic area, such as a single
building or a campus.
It allows computers and devices within the same physical location to connect and
share resources like files, printers, and internet access.
Ethernet and Wi-Fi are common technologies used to implement LANs.
Wide Area Network (WAN):
Unlike LANs, WANs cover a broader geographic area, connecting multiple LANs
across cities, countries, or even continents.
WANs use various technologies, including public and private data networks, to
enable communication between distant locations.
The internet itself is a vast example of a WAN.
Metropolitan Area Network (MAN):
A MAN is an intermediate-sized network that covers a larger geographic area than
a LAN but is smaller than a WAN.
It typically spans a city or a large campus, connecting multiple buildings within a
defined metropolitan area.
Virtual Private Network (VPN):
A VPN is a network that is constructed by using public wires (usually the internet)
to connect to a private network securely.
It enables remote users or branch offices to access the organization's network
over the internet while maintaining data privacy and security.
Intranet:
An intranet is a private network within an organization that uses internet
technologies to securely share information, resources, and collaboration tools.
It is often used for internal communication, document sharing, and collaboration
among employees.
Extranet:
An extranet is an extension of an intranet that allows controlled access to
authorized external users, such as customers, suppliers, or business partners.
It provides a secure way for external entities to access specific resources and
information.
Client-Server Network:
In a client-server network architecture, computers and devices are divided into
clients (end-user devices) and servers (centralized resources).
Clients request services or resources from the servers, which are dedicated to
providing those services efficiently.
Peer-to-Peer Network:
In a peer-to-peer network, all devices have equal status and can communicate
directly with each other without a centralized server.
This type of network is often used in small environments where simplicity and
cost-effectiveness are prioritized.
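The client-server pattern can be sketched with Python's standard socket module on the local loopback interface (the "resource" exchanged here is just an echoed string, purely for illustration):

```python
import socket
import threading

def run_server(host="127.0.0.1"):
    """A minimal server: accept one client, answer one request, then exit."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, 0))            # port 0: let the OS pick a free port
    srv.listen(1)
    chosen_port = srv.getsockname()[1]

    def serve():
        conn, _ = srv.accept()
        request = conn.recv(1024)               # client requests a resource
        conn.sendall(b"RESOURCE:" + request)    # server provides it
        conn.close()
        srv.close()

    threading.Thread(target=serve, daemon=True).start()
    return chosen_port

def client_request(port, name):
    """A client asks the server for a named resource."""
    cli = socket.create_connection(("127.0.0.1", port))
    cli.sendall(name.encode())
    reply = cli.recv(1024).decode()
    cli.close()
    return reply

port = run_server()
print(client_request(port, "printer"))   # RESOURCE:printer
```

In a peer-to-peer network, by contrast, every node would run both halves of this code, acting as client and server at once.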
Organizations may use a combination of these network types to meet their specific
requirements, depending on factors such as size, geographic distribution, security needs, and
the nature of their operations.
OR
Explain the Human Resource Information System.
A human resource information system (HRIS) supports the human resources function of
an organization with information. The complexity of human resource management has
grown immensely over recent years, primarily due to the need to conform to new laws
and regulations.
A HRIS has to ensure the appropriate degree of access to a great variety of internal
stakeholders, including:
The employees of the Human Resources department in performance of their duties.
All the employees of the firm wishing to inspect their own records.
All the employees of the firm seeking information regarding open positions or
available benefit plans.
Employees availing themselves of the computer-assisted training and evaluation
opportunities.
Managers throughout the firm in the process of evaluating their subordinates and
making personnel decisions.
Corporate executives involved in tactical and strategic planning and control.
Coupling:
Data Coupling: Data coupling occurs when modules share data but do not share the
details of each other's internal data structures. A function, for example, may receive
parameters without needing to understand the structure of the data.
Control Coupling: Control coupling happens when one module controls the behavior
of another by passing it information on what to do. This is generally considered a
higher level of coupling.
Stamp Coupling: Stamp coupling occurs when modules share a composite data
structure, like a record or an object. This implies a dependency on the structure of the
shared data.
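These three levels can be illustrated with a small Python sketch (the functions are hypothetical, chosen only to contrast the coupling styles):

```python
# Data coupling: the function receives plain values and needs no knowledge
# of how the caller stores them.
def net_price(price, tax_rate):
    return round(price * (1 + tax_rate), 2)

# Control coupling: the caller passes a flag telling the function what to do,
# binding the two modules more tightly.
def format_amount(amount, as_cents):
    return int(amount * 100) if as_cents else amount

# Stamp coupling: the function receives a whole record and therefore depends
# on the record's structure.
def invoice_total(order):
    return net_price(order["price"], order["tax_rate"])

print(net_price(100.0, 0.18))                             # 118.0
print(invoice_total({"price": 100.0, "tax_rate": 0.18}))  # 118.0
```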
Cohesion:
Cohesion refers to the degree to which the elements within a module or component
are related to one another. A highly cohesive module performs a specific and well-
defined set of tasks, making it easier to understand, maintain, and modify.
Like coupling, cohesion also has different levels:
Functional Cohesion: Functional cohesion occurs when the elements within a module
are grouped because they all contribute to a single, well-defined task.
Temporal Cohesion: Temporal cohesion occurs when elements within a module are
related by the fact that they are executed at the same time.
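The contrast can be sketched as follows (the email validator and startup routine are hypothetical examples):

```python
import re

# Functional cohesion: every element serves one well-defined task -
# validating an email address.
def is_valid_email(addr):
    return re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", addr) is not None

# Temporal cohesion: these actions are grouped only because they all run
# at startup, not because they are logically related.
def initialize():
    settings = {"theme": "light"}   # load configuration
    log = []                        # open the log
    cache = {}                      # warm the cache
    return settings, log, cache

print(is_valid_email("user@example.com"))  # True
print(is_valid_email("not-an-email"))      # False
```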
OR
What is Design Strategy? Explain the different types of design strategies.
Design strategy refers to a plan or approach that guides the process of creating a product,
system, or solution. It involves making decisions about how to achieve specific goals related
to functionality, aesthetics, user experience, and other aspects of design. Design strategy is
crucial in ensuring that the design aligns with the overall objectives and needs of the project
or organization.
There are various types of design strategies, and the choice of strategy depends on factors
such as the project goals, target audience, resources, and constraints. Here are some common
types of design strategies:
• User-Centred Design (UCD): User-Centred Design is a strategy that prioritizes the needs
and preferences of end-users throughout the design process. It involves methods such as
user research, personas, prototyping, and usability testing to ensure that the final product
meets user expectations and is easy to use.
• Agile Design: Agile design is an iterative approach that aligns with the principles of agile
development. It emphasizes flexibility and collaboration, allowing designers to adapt to
changes in requirements quickly. Agile design often involves rapid prototyping,
continuous feedback, and incremental improvements.
• Design Thinking: Design thinking is a human-centric approach to problem-solving that
involves empathy, ideation, and experimentation. It encourages multidisciplinary teams
to collaborate and iterate on solutions by understanding user needs, defining problems,
brainstorming ideas, prototyping, and testing.
• Innovative Design: Innovative design strategies focus on creating products or solutions
that are novel and groundbreaking. This involves pushing the boundaries of traditional
design thinking to come up with creative and unique solutions. It often requires a culture
that encourages experimentation and risk-taking.
• Sustainable Design: Sustainable design aims to create products or solutions with minimal
environmental impact. This strategy considers the life cycle of a product, from raw
material extraction to disposal, and seeks to minimize resource consumption, energy use,
and waste generation.
• Responsive Design: Responsive design is a strategy commonly used in web and digital
design. It involves creating designs that adapt and respond to different devices and screen
sizes, providing a consistent user experience across a range of platforms, from desktops
to mobile devices.
• Universal Design: Universal design aims to create products or environments that are
accessible and usable by people of all abilities, including those with disabilities. This
strategy promotes inclusivity and ensures that a diverse range of users can interact with
and benefit from the design.
• Emotional Design: Emotional design focuses on evoking specific emotional responses
from users. It considers aesthetics, storytelling, and other elements that contribute to the
overall emotional experience of interacting with a product.
• Incremental Design: Incremental design involves making small, incremental
improvements to an existing design over time. This strategy is often employed to respond
to user feedback, address issues, and enhance features without overhauling the entire
design.
6. What is FTP? Explain the various factors of FTP.
FTP stands for File Transfer Protocol. It's a standard network protocol used to transfer
files from one host to another over a TCP-based network, such as the internet. FTP is
often used for managing and updating files on a web server but can also be used for
general file transfers between computers.
Modes of FTP:
FTP operates in two modes: Active mode and Passive mode.
Active mode: In active mode, the client connects to the server on port 21 for
commands and opens a random port (usually above 1023) for data; the server
then initiates the data connection from its own port 20 to that client port.
Passive mode: In passive mode, the server opens a random port for data transfer,
and the client connects to that port. Passive mode is often used when the client
is behind a firewall.
Authentication:
FTP typically requires authentication before allowing access to files. This is
commonly done using a username and password. However, FTP also supports
anonymous logins, where users can log in with the username "anonymous" and
provide their email address as the password.
Commands:
FTP uses a set of commands to perform various operations. Common FTP commands
include:
GET (protocol command RETR): Retrieve a file from the server.
PUT (protocol command STOR): Send a file to the server.
LIST: List the files in the current directory.
CD (protocol command CWD): Change the current directory.
PWD: Print the current working directory.
Security:
Traditional FTP does not encrypt data during transmission, making it susceptible
to eavesdropping. For improved security, FTP can be secured using protocols like
FTPS (FTP Secure) or SFTP (SSH File Transfer Protocol), which add encryption to
the data transfer process.
Anonymous FTP:
Anonymous FTP allows users to log into an FTP server without a username or with
the username "anonymous." This is often used for providing public access to files,
such as software distributions, documentation, or other publicly available
content.
Port Numbers:
FTP uses two well-known port numbers:
Port 21: This is the default control connection port. The client connects to the
server on this port to send commands and receive responses.
Port 20: This is the default data connection port in active mode. In passive mode,
a random port is used.
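In passive mode the server encodes its data port as two bytes in the reply to the PASV command. A small sketch of decoding a (hypothetical) server reply:

```python
import re

def parse_pasv(reply):
    """Decode '227 Entering Passive Mode (h1,h2,h3,h4,p1,p2)' into (ip, port)."""
    nums = re.search(r"\((\d+),(\d+),(\d+),(\d+),(\d+),(\d+)\)", reply)
    h1, h2, h3, h4, p1, p2 = (int(n) for n in nums.groups())
    ip = f"{h1}.{h2}.{h3}.{h4}"
    port = p1 * 256 + p2          # the data port is sent as two bytes
    return ip, port

# Hypothetical server reply:
print(parse_pasv("227 Entering Passive Mode (192,168,1,10,78,52)"))
# ('192.168.1.10', 20020)
```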
Passive FTP:
Passive FTP is a configuration where the server, rather than the client, opens a
data connection for file transfers. This is often used in situations where the client
is behind a firewall, and the firewall may block incoming connections.
Firewall Considerations:
FTP can have challenges working through firewalls due to the need for multiple
ports (control and data ports). Passive mode is often used to address firewall
issues.
FTP Clients and Servers:
FTP involves two key components: the FTP client and the FTP server. The client
initiates the connection, sends commands, and requests file transfers. The server
listens for incoming connections, processes commands, and manages file
transfers.
FTPS and SFTP:
FTPS and SFTP are secure versions of FTP. FTPS (FTP Secure) adds a layer of
security through SSL/TLS encryption, while SFTP (SSH File Transfer Protocol) uses
the secure shell (SSH) protocol for secure file transfers.
OR
b) HTTP
It is a protocol used to access the data on the World Wide Web (www).
The HTTP protocol can be used to transfer the data in the form of plain text, hypertext,
audio, video, and so on.
This protocol is known as HyperText Transfer Protocol because its efficiency
makes it well suited to a hypertext environment, where there are rapid jumps
from one document to another.
HTTP is similar to FTP in that it also transfers files from one host to another.
However, HTTP is simpler than FTP because it uses only one connection; there is
no separate control connection for transferring files.
HTTP carries data in MIME format, which allows it to transfer multimedia content along with plain text.
HTTP is similar to SMTP as the data is transferred between client and server. The HTTP
differs from the SMTP in the way the messages are sent from the client to the server
and from server to the client. SMTP messages are stored and forwarded while HTTP
messages are delivered immediately.
Features of HTTP
Connectionless protocol: HTTP is a connectionless protocol. HTTP client initiates a
request and waits for a response from the server. When the server receives the
request, the server processes the request and sends back the response to the HTTP
client, after which the client closes the connection. The connection between
client and server exists only for the duration of the current request and response.
Stateless: HTTP is a stateless protocol as both the client and server know each other
only during the current request. Due to this nature of the protocol, both the client and
server do not retain the information between various requests of the web pages.
Mechanism of HTTP
The below figure shows the HTTP transaction between client and server. The client
initiates a transaction by sending a request message to the server. The server replies to
the request message by sending a response message.
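The transaction can be sketched at the text level: the client composes a request message and parses the server's response (example.com and the sample response below are purely illustrative):

```python
def build_request(host, path="/"):
    """Compose a minimal HTTP/1.1 GET request (CRLF line endings)."""
    return (f"GET {path} HTTP/1.1\r\n"
            f"Host: {host}\r\n"
            "Connection: close\r\n\r\n")

def parse_status(response):
    """Pull the numeric status code out of the server's status line."""
    status_line = response.split("\r\n", 1)[0]   # e.g. 'HTTP/1.1 200 OK'
    return int(status_line.split()[1])

req = build_request("example.com", "/index.html")
print(req.splitlines()[0])                       # GET /index.html HTTP/1.1
print(parse_status("HTTP/1.1 200 OK\r\n\r\n"))   # 200
```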
PREVIOUS YEAR QUESTIONS WITH ANSWERS
2021
Group – A
(Each question carries 1 mark)
b) Define DSS.
DSS stands for Decision Support System. It is an information system that assists individuals
and organizations in making informed decisions by providing relevant and timely information.
DSS combines data from various sources, analytical tools, and models to support decision-
making processes at different levels within an organization.
f) Define Cohesion.
Cohesion refers to the degree to which the elements within a module or component are
related to one another. A highly cohesive module performs a specific and well-defined set of
tasks, making it easier to understand, maintain, and modify.
Group – B
(Each question carries 8 marks)
OR
Write Short Notes on:
a) GDSS
b) TPS
a) GDSS
A Group Decision Support System (GDSS) is an interactive computer-based information
system that facilitates collaborative decision-making among a group of individuals. It is
designed to enhance communication, coordination, and collaboration within a group
during the decision-making process. GDSS incorporates both hardware and software
components to support face-to-face and remote group meetings.
Information Sharing: GDSS enables the sharing of relevant data, documents, and
other information among group members. This ensures that all participants have
access to the same information, promoting a shared understanding of the decision
context.
Decision Modeling and Analysis: GDSS includes tools for modeling and analyzing
decision scenarios. It may offer simulations, modeling techniques, and analytical
tools to help the group evaluate various alternatives and their potential outcomes.
Voting and Consensus Building: GDSS often includes features for voting on
alternatives and building consensus within the group. This helps streamline the
decision-making process and identify preferences among group members.
b) TPS
Purpose:
The primary purpose of a TPS is to ensure the timely and accurate processing of
transactions to support the operational aspects of a business. It is fundamental for
maintaining the records of basic business operations.
Characteristics:
A TPS handles high volumes of routine transactions, demands high reliability and
accuracy, and processes data in a standardized, predefined way, either in batches
or in real time.
Components:
Input: Involves capturing and entering transaction data into the system. This can
include data from point-of-sale terminals, online orders, or other sources.
Processing: The system processes the transactions according to predefined rules and
procedures, updating the database with the new information.
Output: TPS produces outputs such as invoices, receipts, or updated inventory
reports, providing a record of the completed transactions.
Database: TPS relies on a well-organized database to store and manage transaction
data efficiently.
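The input-process-output cycle above can be sketched as follows (a hypothetical sales transaction against an in-memory "database"):

```python
def process_sale(inventory, item, qty, price):
    """Input: a sale transaction. Process: update stock. Output: a receipt."""
    if inventory.get(item, 0) < qty:
        raise ValueError("insufficient stock")
    inventory[item] -= qty                                   # update the database
    return {"item": item, "qty": qty, "total": qty * price}  # receipt (output)

stock = {"pen": 100}                       # simple in-memory 'database'
receipt = process_sale(stock, "pen", 3, 12.5)
print(receipt["total"], stock["pen"])      # 37.5 97
```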
Examples:
Sales Processing: Recording sales transactions, updating inventory levels, and
generating invoices.
Financial Transactions: Managing activities like payroll processing, accounts payable,
and accounts receivable.
Order Processing: Handling customer orders from initiation to fulfilment.
Challenges:
Challenges in TPS include managing large volumes of transactions efficiently, ensuring
data accuracy, and maintaining system responsiveness under high loads.
A Data Flow Diagram (DFD) is a visual representation that depicts the flow of data within a
system. In the context of an Online Examination System, a DFD can illustrate how data moves
between different components of the system. Below is a simplified DFD for an Online
Examination System:
[Figure: simplified DFD of the ONLINE EXAMINATION SYSTEM]
LAN (Local Area Network):
A Local Area Network covers a small geographic area, such as a single building
or a campus.
Advantages:
• High Data Transfer Rates: LANs typically offer high data transfer rates, making them
suitable for tasks that require quick data exchange.
• Low Latency: The short distance between devices results in low latency, contributing
to fast communication.
• Ease of Maintenance: LANs are usually easier to maintain since they cover a smaller
area, and troubleshooting is more localized.
• High Security: It's generally easier to implement and manage security measures in a
LAN due to the limited physical access.
Disadvantages:
• Limited Geographic Scope: LANs are limited to a small geographic area, which can be
a drawback for organizations with multiple locations.
• Cost: Setting up a LAN can involve significant costs, especially if high-quality hardware
and cabling are required.
WAN (Wide Area Network):
A Wide Area Network covers a broad area, such as a city, country, or even global
connections.
Advantages:
• Wide Geographic Coverage: WANs allow for connectivity over long distances, making
them suitable for connecting remote offices or branches.
• Resource Sharing: WANs enable resource sharing among different locations,
improving efficiency.
• Centralized Data Management: Centralized data management is feasible in a WAN,
allowing for consistent information across multiple sites.
Disadvantages:
• Higher Latency: The longer distances involved can lead to higher latency, impacting
the speed of communication.
• Cost: WANs can be more expensive to set up and maintain compared to LANs due to
the need for additional infrastructure and services.
MAN (Metropolitan Area Network):
A Metropolitan Area Network covers a larger geographic area than a LAN but is smaller
than a WAN, typically spanning a city.
Advantages:
• Balanced Coverage: MANs offer a balance between the wide coverage of WANs and
the local focus of LANs.
• Medium Data Transfer Rates: MANs generally provide moderate data transfer rates,
suitable for city-wide connectivity.
Disadvantages:
• Limited Coverage: MANs are limited to the size of a city, which may not be sufficient
for organizations with broader geographical needs.
• Cost: While not as costly as WANs, MANs can still involve significant costs for setup
and maintenance.
PAN (Personal Area Network):
Advantages:
• Personal Connectivity: PANs are designed for personal devices, allowing for seamless
connectivity between devices like smartphones, laptops, and wearable gadgets.
• Low Power Consumption: PANs often involve low-power communication
technologies like Bluetooth, contributing to longer device battery life.
Disadvantages:
• Limited Range: PANs have a very limited range, usually within a few meters, making
them unsuitable for broader connectivity needs.
• Data Transfer Rates: The data transfer rates in PANs are generally lower compared to
LANs or WANs due to the shorter range and focus on personal connectivity.
OR
Explain the Internet, Intranet and Extranet in an Information System.
Internet: The Internet is a global network of interconnected computers and computer
networks. It is a vast and public network that allows communication and the sharing of
information across the globe.
Key Characteristics:
Public Access: The Internet is open to the public, and anyone with an internet connection
can access its resources.
Global Connectivity: It spans the entire globe, connecting millions of devices and
networks worldwide.
Diverse Services: Provides a wide range of services, including the World Wide Web
(WWW), email, file sharing, online gaming, and more.
Use of Standard Protocols: Internet communication relies on standard protocols such as
TCP/IP (Transmission Control Protocol/Internet Protocol).
Intranet: An Intranet is a private network within an organization that uses internet
technologies to securely share information, resources, and collaboration tools among its
employees.
Key Characteristics:
Private Network: Limited to the organization and accessible only by its members
(employees, stakeholders).
Security: Intranets often have security measures in place, including firewalls and
authentication systems, to ensure the privacy and integrity of internal information.
Collaboration Tools: Provides tools like internal websites, file sharing, and communication
platforms for enhancing collaboration within the organization.
Information Sharing: Facilitates the sharing of documents, data, and other resources
among employees.
Extranet: An Extranet is an extension of an Intranet that allows controlled access to specific
external users, such as business partners, suppliers, or customers.
Key Characteristics:
Limited External Access: While extending beyond the organization, access is limited to
specific external entities with a legitimate relationship (e.g., business partners).
Collaboration Beyond the Organization: Enables collaboration with external partners,
allowing them to access shared resources and information.
Enhanced Communication: Extranets enhance communication and coordination between
an organization and its external stakeholders.
Security Measures: Similar to Intranets, Extranets implement security measures to
protect sensitive information.
Differences:
Accessibility: The Internet is open to the public; an Intranet is restricted to an
organization's members; an Extranet extends limited access to authorized external
parties.
Users: Internet - anyone with a connection; Intranet - employees and internal
stakeholders; Extranet - employees plus selected partners, suppliers, or customers.
Security: The Internet is the least restricted; Intranets and Extranets sit behind
firewalls and authentication, with Extranets adding controlled gateways for
external access.
5. Define Coupling and Cohesion. Explain the steps of coupling and cohesion
with suitable diagrams.
Coupling: Coupling is a measure of the degree of interdependence between software modules
or components. It reflects how closely one module is connected to, or relies on, another. Low
coupling is desirable because it enhances maintainability, reusability, and flexibility in a
software system.
Types of Coupling:
Low Coupling (Good): Modules are independent and can be modified without affecting
each other.
High Coupling (Undesirable): Modules are strongly dependent on each other, making
changes in one module likely to impact others.
Cohesion: Cohesion is a measure of how closely the elements within a module are related. It
reflects the degree to which the responsibilities of a module form a meaningful and logical
unit. High cohesion is desirable because it leads to more maintainable and understandable
code.
Types of Cohesion:
High Cohesion (Good): The module performs a single, well-defined task, and its internal
elements are closely related.
Low Cohesion (Undesirable): The module performs multiple tasks or has elements with
weak relationships.
Steps of Coupling and Cohesion:
Coupling:
Step 1: Separate Concerns: Divide the system into modules, each handling a distinct
concern or responsibility.
Step 2: Define Interfaces: Clearly define interfaces between modules, specifying how they
communicate.
Step 3: Minimize Dependencies: Minimize dependencies between modules. Use well-
defined interfaces and avoid direct access to internal details of other modules.
Cohesion:
Step 1: Identify Responsibilities: Ensure each module has a well-defined and single
responsibility.
Step 2: Group Related Functions: Group related functions and data within a module.
Step 3: Avoid Mixing Concerns: Avoid mixing unrelated concerns within a module. Keep
the module focused on its specific task.
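The steps can be sketched as a small module exposing a narrow interface instead of its internal details (OrderStore and report are hypothetical names):

```python
class OrderStore:
    """Step 2: a well-defined interface; callers never touch _orders directly."""
    def __init__(self):
        self._orders = {}            # internal detail, hidden from other modules

    def add(self, order_id, amount):
        self._orders[order_id] = amount

    def total(self):
        return sum(self._orders.values())

# Another module depends only on the interface (Step 3: minimal dependency),
# while OrderStore itself has a single responsibility (cohesion, Step 1).
def report(store):
    return f"total sales: {store.total()}"

store = OrderStore()
store.add(1, 250.0)
store.add(2, 100.0)
print(report(store))   # total sales: 350.0
```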
OR
Discuss how the responsibility of system analysts differs in the file and
database aspects of information system design.
In the context of information system design, the responsibility of system analysis varies when
dealing with the file-based approach compared to the database approach. Let's discuss how
the responsibilities differ in these two aspects:
Data Organization: In a file-based system, the system analyst is responsible for designing
how data is organized and stored in files. This includes defining file structures, record
layouts, and relationships between files.
Data Redundancy: System analysts need to address the issue of data redundancy in file-
based systems. Redundant data can lead to inconsistencies and increased storage
requirements, so analysts must devise methods to minimize redundancy.
Data Integrity: Ensuring data integrity (accuracy and consistency) is a key responsibility.
System analysts must implement validation rules and checks to maintain the quality of
data within files.
File Retrieval and Update: Designing methods for retrieving and updating data in files is
crucial. This includes specifying how records are accessed, modified, and deleted.
Data Security: System analysts need to address data security concerns within the file-
based system. This involves implementing access controls, authentication mechanisms,
and encryption methods to protect sensitive information.
Database Design: In a database system, the system analyst collaborates with database
designers to define the overall structure of the database. This includes creating tables,
defining relationships, and establishing data constraints.
Data Independence: System analysts focus on achieving data independence, separating
the logical view of data from its physical implementation. This allows changes in one
aspect without affecting the other, providing flexibility and adaptability.
Query Optimization: Analyzing and optimizing queries is important in a database system.
System analysts work to ensure that queries are efficient and leverage indexing and other
optimization techniques for performance.
Data Integrity and Constraints: System analysts play a role in defining and enforcing data
integrity constraints, such as primary keys, foreign keys, and unique constraints, to
maintain the consistency of data.
Transaction Management: System analysts are involved in designing transaction
management processes. This includes ensuring the atomicity, consistency, isolation, and
durability (ACID properties) of database transactions.
Scalability and Performance: Addressing scalability and performance concerns is crucial
in a database system. System analysts work on strategies to handle increasing data
volumes and ensure the system performs efficiently.
Comparison:
Scope of Responsibility: In file-based systems, the focus is on individual files and their
organization, while in database systems, the emphasis is on the overall database
structure and its relationships.
Flexibility and Adaptability: Database systems offer greater flexibility and adaptability
due to data independence, allowing changes to be made more easily without affecting
the entire system.
Efficiency: Database systems often provide more efficient data retrieval and
manipulation, especially for complex queries, compared to file-based systems.
a) Telnet
Telnet, short for "Telecommunication Network," is a protocol used on the internet or local
area networks to provide a command-line interface (CLI) to communicate with remote
systems. It allows users to log into another computer on the network, provided they have
the necessary permissions, and perform tasks as if they were physically present at that
computer.
b) Internet Explorer:
Internet Explorer (IE) was a web browser developed by Microsoft. It was one of the most
widely used web browsers for many years, especially during the late 1990s and early 2000s.
However, Microsoft officially retired Internet Explorer in June 2022, encouraging users to
transition to Microsoft Edge, its modern and more feature-rich browser.
Key Points:
Introduction and Dominance: Internet Explorer was first introduced as part of the
add-on package Plus! for Windows 95 in 1995. Subsequent versions were bundled
with various Windows operating systems, contributing to its widespread usage.
Versions and Updates: Internet Explorer went through numerous versions, each
introducing new features and improvements. Some notable versions include IE 6, IE 7,
IE 8, IE 9, IE 10, and IE 11.
Web Standards and Compatibility Issues: Internet Explorer gained a reputation for
not always adhering to web standards, leading to compatibility issues with certain
websites and web applications. This prompted users to seek alternative browsers.
Competition and Decline: The rise of competitors, especially Mozilla Firefox and
Google Chrome, significantly impacted Internet Explorer's market share. These
browsers were often perceived as more modern, secure, and feature-rich.
Security Concerns: Internet Explorer faced several security vulnerabilities over its
lifespan, leading to security concerns and incidents. As a result, users were often
advised to keep the browser updated or switch to alternatives.
End of Support: Microsoft ended support for Internet Explorer 11 on June 15, 2022
(Microsoft 365 apps had already dropped IE support in August 2021). Microsoft no
longer provides security updates, technical support, or bug fixes for the browser.
Transition to Microsoft Edge: Microsoft Edge, introduced in 2015, is the successor to
Internet Explorer. It is designed to be more modern, secure, and aligned with current
web standards.
Despite its decline, Internet Explorer had a significant impact on the early days of web
browsing and played a crucial role in shaping the evolution of browsers.
c) Electronic Data Security:
Electronic data security, also known as cybersecurity, involves safeguarding digital
information from unauthorized access, disclosure, alteration, or destruction.
Here are some key aspects and practices related to electronic data security:
• Confidentiality: Confidentiality ensures that only authorized individuals or systems
can access sensitive information. Encryption is a common method to protect data
confidentiality, making it unreadable to unauthorized parties without the appropriate
decryption key.
• Integrity: Integrity ensures the accuracy and reliability of data. Measures such as
checksums, digital signatures, and version control help detect and prevent
unauthorized modifications to data.
• Availability: Availability ensures that data and systems are accessible when needed.
Redundancy, backups, and disaster recovery plans are critical components to ensure
data availability, even in the face of hardware failures, cyber attacks, or natural
disasters.
• Authentication: Authentication verifies the identity of users or systems attempting to
access data or resources. This can involve passwords, biometrics, multi-factor
authentication (MFA), and other methods to ensure that only authorized entities gain
access.
• Authorization: Authorization defines the level of access granted to authenticated
users or systems. Role-based access control (RBAC) is a common practice where
permissions are based on job roles, limiting access to the minimum necessary for each
role.
• Firewalls and Intrusion Detection/Prevention Systems: Firewalls monitor and control
network traffic, allowing or blocking data packets based on predefined security rules.
Intrusion detection and prevention systems (IDPS) identify and respond to suspicious
activities or security policy violations.
• Endpoint Security: Endpoint security focuses on securing individual devices
(endpoints) such as computers, smartphones, and servers. This includes antivirus
software, anti-malware tools, and device management solutions to prevent and
detect threats.
• Security Awareness Training: Educating users about security best practices is
essential. Training programs help employees recognize phishing attempts, understand
password hygiene, and be aware of other security threats, reducing the risk of human-
related security breaches.
• Data Backups: Regularly backing up data is crucial for recovery in case of data loss due
to accidental deletion, hardware failure, or cyber-attacks. Backups should be stored
securely and tested for reliability.
• Encryption: Encryption is the process of converting data into a coded format that
requires a specific key or password for decryption. This safeguards sensitive
information during transmission and storage.
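As a small illustration of the integrity point above, the sketch below uses a SHA-256 checksum to detect tampering with stored data. The invoice strings are made-up examples; real systems would combine this with encryption and access controls.

```python
import hashlib

def checksum(data: bytes) -> str:
    """SHA-256 digest used to verify data integrity."""
    return hashlib.sha256(data).hexdigest()

original = b"invoice #1042: amount=500.00"
stored = checksum(original)  # digest saved alongside the data

# Later, recompute and compare: any modification changes the digest.
tampered = b"invoice #1042: amount=900.00"
print(checksum(original) == stored)   # True  - data unchanged
print(checksum(tampered) == stored)   # False - integrity violation detected
```

Because the digest is practically impossible to reproduce for altered data, comparing checksums is a cheap way to detect unauthorized modification.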
PREVIOUS YEAR QUESTIONS WITH ANSWERS
2020
Group – A
(Each question carries 1 mark)
ii. A type of decision in which there may be several "right" answers and no precise way to
get a right answer is a ________________________ type of decision.
Ans: Unstructured
v. In any real-time system ___________ factors are termed as boundary of the system.
Ans: Limit
vii. Information systems that monitor the element activities and transaction of the
organization are ______________________.
Ans: Operational Level Systems
viii. Summary of transaction data, high volume data, and model data are information input
characteristics of ___________________.
Ans: Management Information System (MIS)
ix. What is the difference between ‘0’ level diagram and context diagram?
A context diagram has only one process, while a level-0 DFD can have more.
The context diagram establishes the context of the system to be developed; that is, it
represents the interaction of the system with various external entities. A level-0 data flow
diagram, on the other hand, is a simple graphical notation that represents the system in
terms of the input data to the system, the various processes carried out on that data, and
the output generated by the system. It is simple to understand and use.
x. Deciding whether to locate a new production facility is an example of a manufacturing
and production operation information system operating at ____________ level.
Ans: Strategic
Group-B
Planning:
Data Analysis for Strategic Planning: MIS provides managers with access to relevant
data and analytics, enabling them to analyze past performance and current trends.
This information aids in strategic planning by offering insights into market conditions,
customer behavior, and internal operations.
Forecasting and Decision Support: MIS supports the planning process through
forecasting models and decision support systems. Managers can use historical data
and predictive analytics to anticipate future trends, make informed decisions, and
develop effective strategies.
Scenario Analysis: MIS allows for scenario analysis, where managers can simulate
different business scenarios to understand potential outcomes and risks. This aids in
developing contingency plans and making proactive decisions.
Organizing:
Resource Allocation: MIS assists in organizing by providing information on resource
availability, utilization, and allocation. Managers can efficiently allocate human,
financial, and other resources based on real-time data, optimizing organizational
efficiency.
Workflow Management: MIS helps streamline workflows and business processes by
providing visibility into various departments and their activities. This supports
effective coordination and collaboration among different units within the
organization.
Communication and Collaboration: MIS facilitates communication and collaboration
through tools such as email, messaging systems, and collaborative platforms. This
ensures that relevant information is shared across the organization, enhancing
organizational alignment.
Controlling:
Performance Monitoring: MIS enables real-time monitoring of key performance
indicators (KPIs) and critical metrics. Managers can track performance against set
benchmarks and take corrective actions if necessary.
Variance Analysis: MIS supports variance analysis by comparing actual performance
with planned or expected performance. Deviations from the plan are highlighted,
allowing managers to investigate and address issues promptly.
Feedback Mechanism: MIS establishes a feedback loop by providing regular reports
and updates on organizational performance. This feedback is essential for controlling
operations, adjusting strategies, and ensuring that organizational goals are met.
Compliance Monitoring: MIS aids in monitoring compliance with internal policies,
industry regulations, and external standards. This helps organizations identify areas of
non-compliance and take corrective actions to maintain integrity and legality.
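The variance analysis described under "Controlling" above can be sketched in a few lines. The KPI names and figures below are illustrative assumptions, not from any real report:

```python
def variance_report(planned: dict, actual: dict) -> dict:
    """Compare actual performance against the plan, per KPI."""
    report = {}
    for kpi, plan in planned.items():
        act = actual.get(kpi, 0)
        report[kpi] = {
            "planned": plan,
            "actual": act,
            "variance": act - plan,  # deviation from plan
            "variance_pct": round((act - plan) / plan * 100, 1) if plan else None,
        }
    return report

r = variance_report({"sales": 100000, "units": 500},
                    {"sales": 92000,  "units": 510})
print(r["sales"]["variance"])      # -8000 (sales fell short of plan)
print(r["units"]["variance_pct"])  # 2.0  (units slightly above plan)
```

An MIS would generate such a comparison automatically from transaction data, flagging deviations for managers to investigate.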
OR
Write Short notes on:
a) DSS
b) Expert System
a) DSS
A Decision Support System (DSS) is a computer-based information system designed to
support decision-making at various levels within an organization. DSS combines data,
analytical tools, and models to assist managers in making informed and timely decisions.
It helps analyze complex scenarios, evaluate alternatives, and provides interactive support
for decision-makers. DSS typically includes components such as databases, modeling
tools, user interfaces, and data analytics capabilities. It is designed to enhance the
decision-making process by offering relevant information and insights, often in the form
of reports, dashboards, and scenario analyses. DSS is particularly valuable in situations
where decisions involve a high degree of uncertainty and complexity.
Components of DSS: DSS consists of three main components: database, model base,
and user interface. The database stores relevant data, the model base contains
analytical models and algorithms, and the user interface allows decision-makers to
interact with the system.
Types of DSS: There are various types of DSS, including model-driven DSS, data-driven
DSS, document-driven DSS, and knowledge-driven DSS. Each type is tailored to specific
decision-making needs and processes.
Interactivity: DSS emphasizes interactivity, allowing users to explore data, manipulate
variables, and test different scenarios. This interactive capability enables decision-
makers to gain a deeper understanding of the potential outcomes of their decisions.
Decision Modeling: Decision Support Systems often incorporate decision models that
help in structuring decision problems. These models can range from simple rule-based
systems to complex mathematical and statistical models.
Collaborative Decision-Making: DSS supports collaborative decision-making by
providing shared access to information and decision models. This promotes
communication and consensus-building among team members involved in the
decision process.
Real-Time Data Integration: Many modern DSS integrate real-time data, allowing
decision-makers to access the most current information. This is particularly valuable
in dynamic business environments where decisions need to be made quickly.
What-If Analysis: DSS enables what-if analysis, allowing users to simulate the impact
of different scenarios on outcomes. Decision-makers can assess the consequences of
various decisions before making a final choice.
Sensitivity Analysis: Sensitivity analysis is a key feature of DSS, helping decision-
makers understand how changes in variables or assumptions affect the overall
outcome. This is crucial for assessing the robustness of decisions.
Strategic Decision Support: DSS is often used for strategic decision support, helping
organizations with long-term planning, goal setting, and strategic analysis. It aids in
aligning decisions with organizational objectives.
Adaptability: DSS is adaptable to different domains and industries. It can be
customized to meet the specific decision-making needs of various organizations, from
healthcare and finance to manufacturing and logistics.
Challenges: Challenges associated with DSS include the need for accurate and reliable
data, potential resistance to technology adoption, and the complexity of integrating
DSS into existing organizational processes.
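The what-if and sensitivity analyses described above can be illustrated with a toy profit model. All parameter names and figures here are hypothetical, chosen only to show the mechanics:

```python
def projected_profit(units: int, price: float,
                     unit_cost: float, fixed_cost: float) -> float:
    """Simple contribution-margin profit model."""
    return units * (price - unit_cost) - fixed_cost

base = dict(units=1000, price=50.0, unit_cost=30.0, fixed_cost=12000.0)
print(projected_profit(**base))  # 8000.0 (baseline scenario)

# What-if analysis: simulate a 10% drop in demand.
scenario = {**base, "units": 900}
print(projected_profit(**scenario))  # 6000.0

# Sensitivity analysis: profit change per one-unit change in price.
delta = projected_profit(**{**base, "price": 51.0}) - projected_profit(**base)
print(delta)  # 1000.0 -> profit is highly sensitive to price
```

A real DSS wraps models like this in interactive dashboards so decision-makers can vary assumptions and compare scenarios before committing.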
b) Expert System
Expert System is a computer-based information system which employs human
knowledge and intelligence to solve problems that require human expertise.
The Knowledge Base is a repository that stores factual information, rules, heuristics, and
domain-specific expertise. It is created by knowledge engineers who extract and encode
the knowledge of human experts into a format suitable for computer processing.
Inference Engine
The Inference Engine is the core of the Expert System, responsible for processing
information in the Knowledge Base to draw conclusions and make decisions. It employs
reasoning mechanisms, such as forward chaining or backward chaining, to infer new
knowledge from existing information.
User Interface
The User Interface provides a means for users to interact with the Expert System. It can
take various forms, including a graphical user interface (GUI) or a command-line
interface. The interface allows users to input queries, receive advice, and interpret
system outputs.
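As a rough illustration of how an inference engine applies forward chaining over a knowledge base, consider the toy rule set below. The medical facts and rules are invented for the example, not drawn from any real expert system:

```python
# Rules are (premises, conclusion) pairs: if all premises hold, infer the conclusion.
rules = [
    ({"has_fever", "has_rash"}, "suspect_measles"),
    ({"suspect_measles"}, "recommend_isolation"),
]

def forward_chain(facts: set, rules) -> set:
    """Repeatedly fire rules whose premises are satisfied until no new facts emerge."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)  # new fact inferred from the knowledge base
                changed = True
    return facts

result = forward_chain({"has_fever", "has_rash"}, rules)
print(result)  # includes 'suspect_measles' and 'recommend_isolation'
```

Note how the second rule fires only because the first rule's conclusion became a new fact; this chaining of inferences is what distinguishes an inference engine from a simple lookup.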
Throwaway/Rapid Prototyping:
This method involves quickly creating a prototype with the intention of discarding it after
gathering feedback. It is a fast and cost-effective way to explore design alternatives and
validate user requirements.
Process:
Develop a rapid prototype quickly.
Collect feedback from stakeholders.
Discard the prototype.
Use feedback to refine requirements for the final product.
Advantages: Speeds up the design process, identifies issues early, and enhances
collaboration with stakeholders.
Disadvantages: Not suitable for all projects, as some may require more formal
documentation.
Evolutionary Prototyping:
This method involves refining and extending an initial prototype through a series of iterations
until it eventually evolves into the final product. It is a more flexible approach that allows for
continuous improvement.
Process:
Build an initial working prototype.
Gather user feedback on each iteration.
Refine and extend the prototype repeatedly.
Deliver the evolved prototype as the final product.
Incremental Prototyping:
In incremental prototyping, the system is built and enhanced in small sections or increments.
Each increment adds additional features or functionality to the existing prototype, allowing
for step-by-step development.
Process:
Divide the system into small functional increments.
Build and deliver the first increment as a prototype.
Add and integrate further increments based on feedback.
Continue until all planned functionality is in place.
Extreme Prototyping:
Extreme Prototyping, inspired by extreme programming (XP), involves creating a basic,
functional prototype quickly and then refining it based on continuous feedback from users
and stakeholders. It emphasizes user involvement throughout the development process.
Process:
Build a skeletal, functional prototype quickly.
Involve users continuously for feedback.
Refine the prototype as requirements evolve.
Iterate until the system meets user needs.
OR
Draw the DFD for Library Automation with the following functions:
- Issue-book
- Search-books
- Renew-books
A Data Flow Diagram (DFD) depicts the flow of information and the transformations applied
as data moves into and out of a system. The overall system is represented and described
using inputs, processing, and outputs in the DFD. The inputs can be:
Book request, when a student requests a book.
Library card, when the student has to show or submit it as proof of identity.
The overall processing unit will contain the following output that a system will produce or
generate:
The book itself, since the book demanded by the student will be issued to
them.
Information about the demanded book, displayed by the library information
system; the student can use this while selecting the book, which makes
selection easier.
Level 0 DFD –
Level 1 DFD –
At this level, the system is shown in more detail, exposing its internal processing.
The processes that are important to be carried out are:
Book delivery
Search by topic
The list of authors, list of titles, list of topics, and the bookshelves from which books can be
located are some of the information required for these processes. A data store is used to
represent this type of information.
Bus Topology
The bus topology is designed in such a way that all the stations are connected through a
single cable known as a backbone cable. Each node is either connected to the backbone
cable by drop cable or directly connected to the backbone cable.
When a node wants to send a message over the network, it places the message on the
backbone cable. All stations on the network receive the message, whether or not it is
addressed to them (the bus is thus used to broadcast messages).
The bus topology is mainly used in 802.3 (Ethernet) and 802.4 standard networks.
Advantages of Bus topology:
Low-cost cable
Moderate data speeds
Familiar technology
Limited failure
Disadvantages of Bus topology:
Extensive cabling
Difficult troubleshooting
Signal interference
Reconfiguration difficult
Attenuation
Ring Topology
Ring topology is like a bus topology, but with its ends connected. The node that receives
a message from the previous computer retransmits it to the next node. The data flows
in one direction, i.e., it is unidirectional, travelling continuously around a single closed
loop. There are no terminated ends: each node is connected to the next, with no
termination point. Data in a ring topology flows in a clockwise direction. The most
common access method in ring topology is token passing.
Token: a frame that circulates around the network.
Token passing: a network access method in which the token is passed from one node to
the next.
Advantages of Ring topology:
• Network Management
• Product availability
• Cost
• Reliable
Star Topology
Star topology is an arrangement of the network in which every node is connected to the
central hub, switch or a central computer. The central computer is known as a server, and
the peripheral devices attached to the server are known as clients. Coaxial cable or RJ-45
cables are used to connect the computers. Hubs or Switches are mainly used as
connection devices in a physical star topology. Star topology is the most popular topology
in network implementation.
Advantages of Star topology:
Efficient troubleshooting
Network control
Limited failure
Familiar technology
Easily expandable
Cost effective
High data speeds
Disadvantages of Star topology:
Cable (a large amount of cabling is required, since every node connects to the central device)
Tree Topology
Tree topology combines the characteristics of bus topology and star topology. A tree
topology is a type of structure in which all the computers are connected with each
other in hierarchical fashion. The top-most node in tree topology is known as a root
node, and all other nodes are the descendants of the root node. Only one path
exists between any two nodes for data transmission, thus forming a parent-child
hierarchy.
Disadvantages of Tree topology:
Difficult troubleshooting
High cost
Failure (a fault in the root node or backbone affects the whole network)
Reconfiguration difficult
Mesh Topology
Mesh topology is an arrangement of the network in which computers are
interconnected with each other through various redundant connections. There are
multiple paths from one computer to another computer. It does not contain the
switch, hub or any central computer which acts as a central point of communication.
The Internet is an example of the mesh topology. Mesh topology is mainly used for
WAN implementations where communication failures are a critical concern.
Number of cables = (n*(n-1))/2, where n is the number of nodes in the
network.
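A quick sketch of the full-mesh cable count formula above:

```python
def full_mesh_links(n: int) -> int:
    """Number of point-to-point links in a full mesh of n nodes: n*(n-1)/2."""
    return n * (n - 1) // 2

for n in (4, 5, 10):
    print(n, full_mesh_links(n))  # 4 -> 6, 5 -> 10, 10 -> 45
```

The quadratic growth in links is exactly why full mesh is reserved for small, critical networks, and partial mesh is preferred as the node count rises.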
Types of Mesh Topology
Full Mesh Topology: In a full mesh topology, each computer is connected to all the
computers available in the network. Also called a completely connected network.
Partial Mesh Topology: In a partial mesh topology, not all but certain computers are
connected to those computers with which they communicate frequently.
Advantages of Mesh topology:
● Reliable
● Fast Communication
● Easier Reconfiguration
Disadvantages of Mesh topology:
● Cost
● Management
● Efficiency
Hybrid Topology
The combination of two or more different topologies is known as a Hybrid topology. A
Hybrid topology is a connection between different links and nodes to transfer data.
Combining two or more different topologies produces a Hybrid topology, whereas
connecting similar topologies does not. For example, if one branch of ICICI Bank uses a
ring topology and another branch uses a bus topology, connecting these two topologies
results in a Hybrid topology.
Advantages of Hybrid topology:
Reliable
Scalable
Flexible
Effective
Disadvantages of Hybrid topology:
Complex design
Costly Hub
Costly infrastructure
OR
Write short notes on:
a) Accounting Information System
b) Production and Manufacturing Information System
5. Define Coupling and Cohesion. Explain the steps of coupling and cohesion
with suitable diagrams.
Coupling: Coupling is a measure of the degree of interdependence between software modules
or components. It reflects how closely one module is connected to, or relies on, another. Low
coupling is desirable because it enhances maintainability, reusability, and flexibility in a
software system.
Types of Coupling:
Low Coupling (Good): Modules are independent and can be modified without affecting
each other.
High Coupling (Undesirable): Modules are strongly dependent on each other, making
changes in one module likely to impact others.
Cohesion: Cohesion is a measure of how closely the elements within a module are related. It
reflects the degree to which the responsibilities of a module form a meaningful and logical
unit. High cohesion is desirable because it leads to more maintainable and understandable
code.
Types of Cohesion:
High Cohesion (Good): The module performs a single, well-defined task, and its internal
elements are closely related.
Low Cohesion (Undesirable): The module performs multiple tasks or has elements with
weak relationships.
Steps of Coupling and Cohesion:
Coupling:
Step 1: Separate Concerns: Divide the system into modules, each handling a distinct
concern or responsibility.
Step 2: Define Interfaces: Clearly define interfaces between modules, specifying how they
communicate.
Step 3: Minimize Dependencies: Minimize dependencies between modules. Use well-
defined interfaces and avoid direct access to internal details of other modules.
Cohesion:
Step 1: Identify Responsibilities: Ensure each module has a well-defined and single
responsibility.
Step 2: Group Related Functions: Group related functions and data within a module.
Step 3: Avoid Mixing Concerns: Avoid mixing unrelated concerns within a module. Keep
the module focused on its specific task.
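A minimal sketch of the low-coupling, high-cohesion ideas above, using two hypothetical payroll modules that interact only through a narrow function interface (the module names and tax rate are illustrative assumptions):

```python
# Module 1: tax calculation. High cohesion: its single concern is tax rules.
def income_tax(gross: float, rate: float = 0.2) -> float:
    """Compute income tax on a gross amount at a flat rate."""
    return gross * rate

# Module 2: payroll. High cohesion: its single concern is net-pay computation.
# Low coupling: it depends only on income_tax's interface (arguments and
# return value), never on its internal details, so tax rules can change
# without touching this module.
def net_pay(gross: float) -> float:
    """Gross pay minus tax, using the tax module's public interface."""
    return gross - income_tax(gross)

print(net_pay(1000.0))  # 800.0
```

If the tax module later switched to slab-based rules, only `income_tax` would change; `net_pay` would keep working, which is exactly the maintainability benefit low coupling is meant to deliver.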
OR
Discuss how the responsibility of the system analyst differs in the file-based
and database approaches to information system design.
In the context of information system design, the responsibility of system analysis varies when
dealing with the file-based approach compared to the database approach. Let's discuss how
the responsibilities differ in these two aspects:
Data Organization: In a file-based system, the system analyst is responsible for designing
how data is organized and stored in files. This includes defining file structures, record
layouts, and relationships between files.
Data Redundancy: System analysts need to address the issue of data redundancy in file-
based systems. Redundant data can lead to inconsistencies and increased storage
requirements, so analysts must devise methods to minimize redundancy.
Data Integrity: Ensuring data integrity (accuracy and consistency) is a key responsibility.
System analysts must implement validation rules and checks to maintain the quality of
data within files.
File Retrieval and Update: Designing methods for retrieving and updating data in files is
crucial. This includes specifying how records are accessed, modified, and deleted.
Data Security: System analysts need to address data security concerns within the file-
based system. This involves implementing access controls, authentication mechanisms,
and encryption methods to protect sensitive information.
Database Design: In a database system, the system analyst collaborates with database
designers to define the overall structure of the database. This includes creating tables,
defining relationships, and establishing data constraints.
Data Independence: System analysts focus on achieving data independence, separating
the logical view of data from its physical implementation. This allows changes in one
aspect without affecting the other, providing flexibility and adaptability.
Query Optimization: Analyzing and optimizing queries is important in a database system.
System analysts work to ensure that queries are efficient and leverage indexing and other
optimization techniques for performance.
Data Integrity and Constraints: System analysts play a role in defining and enforcing data
integrity constraints, such as primary keys, foreign keys, and unique constraints, to
maintain the consistency of data.
Transaction Management: System analysts are involved in designing transaction
management processes. This includes ensuring the atomicity, consistency, isolation, and
durability (ACID properties) of database transactions.
Scalability and Performance: Addressing scalability and performance concerns is crucial
in a database system. System analysts work on strategies to handle increasing data
volumes and ensure the system performs efficiently.
Comparison:
Scope of Responsibility: In file-based systems, the focus is on individual files and their
organization, while in database systems, the emphasis is on the overall database
structure and its relationships.
Flexibility and Adaptability: Database systems offer greater flexibility and adaptability
due to data independence, allowing changes to be made more easily without affecting
the entire system.
Efficiency: Database systems often provide more efficient data retrieval and
manipulation, especially for complex queries, compared to file-based systems.
FTP is an essential tool for those who build and maintain websites. Many FTP clients are
free to download, and many systems include built-in FTP support.
Objectives of FTP:
o It provides the sharing of files.
o It is used to encourage the use of remote computers.
o It transfers the data more reliably and efficiently.
Although transferring files from one system to another is simple and straightforward, it
can sometimes cause problems.
For example,
o Two systems may have different file conventions.
o Two systems may have different ways to represent text and data.
o Two systems may have different directory structures.
FTP protocol overcomes these problems by establishing two connections between hosts.
o One connection is used for data transfer, and
o Another connection is used for the control connection.
There are two types of connections in FTP:
o Control Connection:
The control connection uses very simple rules for communication. Through
control connection, we can transfer a line of command or line of response
at a time. The control connection is made between the control processes.
The control connection remains connected during the entire interactive
FTP session.
o Data Connection:
The data connection uses very complex rules as data types may vary. The
data connection is made between data transfer processes. The data
connection opens when a command comes for transferring the files and
closes when the file is transferred.
Mechanism of FTP
The FTP client has three components: the user interface, control process, and data
transfer process. The server has two components: the server control process and the
server data transfer process.
An FTP client is a program that implements the File Transfer Protocol, allowing you to
transfer files between two hosts on the internet.
It allows a user to connect to a remote host and upload or download files.
It provides a set of commands that can be used to connect to a host, transfer files
between you and the host, and close the connection.
A GUI-based FTP client makes file transfer very easy and does not require the user to
remember FTP commands.
b) HTTP
It is a protocol used to access the data on the World Wide Web (www).
The HTTP protocol can be used to transfer the data in the form of plain text, hypertext,
audio, video, and so on.
This protocol is known as the HyperText Transfer Protocol because it is designed for use
in a hypertext environment, where there are rapid jumps from one document to
another.
HTTP is similar to FTP in that it also transfers files from one host to another. However,
HTTP is simpler than FTP because it uses only a single connection; no separate control
connection is used to transfer the files.
HTTP carries data in MIME-like format, supporting media content along with text.
HTTP is similar to SMTP as the data is transferred between client and server. The HTTP
differs from the SMTP in the way the messages are sent from the client to the server
and from server to the client. SMTP messages are stored and forwarded while HTTP
messages are delivered immediately.
Features of HTTP
Connectionless protocol: HTTP is a connectionless protocol. The HTTP client
initiates a request and waits for a response from the server. When the server
receives the request, it processes it and sends the response back to the client,
after which the connection is closed. The connection between client and server
exists only for the duration of the current request and response.
Stateless: HTTP is a stateless protocol, as the client and server know each other
only during the current request. Because of this, neither the client nor the
server retains information between successive requests for web pages.
Mechanism of HTTP
The figure below shows an HTTP transaction between client and server. The client
initiates a transaction by sending a request message to the server, and the server
replies by sending a response message.
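The request/response transaction described above can be sketched with the Python standard library. The example below starts a throwaway local server, sends one GET request, and reads the one response; the path and response body are made up for the demo.

```python
# Sketch of one HTTP transaction: the client sends a request message and
# the server answers with a response message. Standard library only.
import threading
from http.client import HTTPConnection
from http.server import BaseHTTPRequestHandler, HTTPServer

class DemoHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"hello"
        self.send_response(200)                       # status line
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)                        # response body

    def log_message(self, *args):                     # silence request logging
        pass

server = HTTPServer(("127.0.0.1", 0), DemoHandler)    # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

conn = HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/index.html")                    # request message
resp = conn.getresponse()                             # response message
data = resp.read().decode()
print(resp.status, data)                              # -> 200 hello
conn.close()
server.shutdown()
```

Note how the client opens the connection, exchanges exactly one request and one response, and then closes it, matching the connectionless, stateless behaviour described above.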
OR
What is an electronic data security system? Explain its relevance in computer
applications.
An Electronic Data Security System refers to a set of measures, protocols, and technologies
designed to protect electronic data from unauthorized access, disclosure, alteration, or
destruction. It encompasses various security practices and mechanisms to ensure the
confidentiality, integrity, and availability of data in computer applications. The relevance of
an electronic data security system in computer applications is paramount due to several
reasons:
Confidentiality Protection:
Importance: Sensitive and confidential information, such as personal data, financial
records, and proprietary business information, must be protected from unauthorized
access.
Role of Security System: Encryption, access controls, and authentication
mechanisms are implemented to safeguard data confidentiality, ensuring that only
authorized individuals or systems can access protected information.
Integrity Assurance:
Importance: Maintaining the accuracy and reliability of data is critical for trustworthy
computing and decision-making processes.
Role of Security System: Data integrity checks, hash functions, and digital signatures
are employed to verify that data has not been tampered with during storage,
transmission, or processing.
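The integrity checks mentioned above can be sketched with a cryptographic hash: a digest is stored alongside the data and recomputed on read, so any change to the bytes produces a different digest. The record contents below are illustrative.

```python
# Sketch: data-integrity verification with SHA-256. Any modification to the
# stored bytes changes the digest, revealing tampering.
import hashlib

def digest(data: bytes) -> str:
    """Return the hex SHA-256 digest of the data."""
    return hashlib.sha256(data).hexdigest()

record = b"balance=1000"
stored_digest = digest(record)        # saved alongside the record

# Later, before trusting the data, recompute and compare.
print(digest(record) == stored_digest)          # -> True  (intact)

tampered = b"balance=9000"
print(digest(tampered) == stored_digest)        # -> False (tampering detected)
```

In practice a keyed variant (HMAC) or a digital signature is used when an attacker could recompute the digest along with the data; the plain hash here only demonstrates the principle.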
Availability Guarantee:
Importance: Data needs to be available to authorized users whenever required,
ensuring uninterrupted business operations and user access.
Role of Security System: Redundancy, backups, and disaster recovery plans are
implemented to ensure data availability, even in the face of hardware failures,
cyberattacks, or other disruptions.