DCA1101 – FUNDAMENTALS OF IT & PROGRAMMING
SET - I
Question 1.
Answer:
(a) Define the term ‘computer’.
A computer is an electronic device that processes raw data to produce information as output.
It accepts data as input and transforms it according to a set of stored instructions, called
programs, to produce the desired output. It consists of components such as memory for
temporary storage, storage devices for long-term data retention, input devices such as
keyboards, and output devices such as monitors. The motherboard is the main board that
connects these components and enables communication between them. Computers operate on
a binary system of 0s and 1s, executing diverse tasks through software.
(b) Explain the organization of a computer.
Ans: The organization of a computer is a hierarchical structure that enables the efficient
execution of tasks. Computer organization consists of the following parts:
1. CPU – central processing unit
2. Memory
3. Input devices
4. Output devices
1. CPU – The central processing unit, alternatively referred to as the processor, central
processor, or microprocessor, is considered the brain of the computer. The CPU is
responsible for handling all instructions it receives from the hardware and software
running on the computer. It performs all types of data processing operations, stores
data, intermediate results, and instructions (programs), and controls the operation of
all parts of the computer. The CPU has the following three components:
a. ALU (Arithmetic Logic Unit) – All arithmetic calculations and logical operations
are performed by the Arithmetic/Logic Unit, or ALU.
b. Memory Unit – Used to store the data and instructions currently being processed
by the CPU.
c. Control Unit – Coordinates the operations of the input unit, output unit, memory
unit, and ALU in sequence.
2. Memory – Computer memory is classified by whether it retains its contents without
power:
a. Volatile memory – Volatile memory loses its contents when the power is turned
off. Example: RAM is volatile memory.
b. Non-volatile memory – Non-volatile memory keeps its contents even if the power
is lost. Example: ROM or EPROM is a good example of non-volatile memory.
3. Input Devices – An input device is hardware that allows users to enter data and
instructions into a computer system; without input devices a computer cannot receive
the data it needs to perform any task. The most fundamental pieces of input are
keystrokes on a keyboard and clicks with a mouse. Examples of input devices include
keyboards, mice, scanners, digital cameras, and joysticks.
4. Output Devices – An output device presents the results of processing to the user.
Examples of output devices include monitors, printers, and speakers.
Based on the purpose of usage, computers are classified into two types:
a. General-purpose computer – A general-purpose computer is built to perform a
variety of common tasks. Computers of this type have the ability to store
multiple programs. They can be applied in the workplace, in science, in
education, and even at home.
b. Special-purpose computer – A special-purpose computer is designed to carry
out one specific task, such as controlling an appliance or a traffic signal.
Based on the capability of data handling, computers are further classified into three
types:
a. Digital computer – A digital computer deals with data that can be stored in
binary format, i.e. in the form of 0s and 1s. This computer stores data or
information as voltage pulses that indicate either 0 or 1.
b. Analog computer – An analog computer works with continuous physical
quantities such as voltage, temperature, or pressure rather than discrete digits.
c. Hybrid computer – A hybrid computer combines the features of digital and
analog computers.
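As noted above, a digital computer stores everything as patterns of 0s and 1s. The following minimal Python sketch (an illustrative addition, not part of the original answer) shows how an integer value maps to such a bit pattern and back:

```python
# Illustrative sketch: how an integer is represented as binary digits.

def to_bits(value, width=8):
    """Return the binary digits of a non-negative integer, zero-padded."""
    return format(value, f"0{width}b")

def from_bits(bits):
    """Reconstruct the integer value from a string of 0s and 1s."""
    return int(bits, 2)

print(to_bits(13))            # 00001101  (8 + 4 + 1 = 13)
print(from_bits("00001101"))  # 13
```

The same idea underlies how a digital computer encodes numbers, characters, and instructions in memory.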
Question 3.
Answer:
Random access memory (RAM):
Random Access Memory (RAM) is a type of volatile computer memory that enables the CPU
to access data quickly during the execution of programs. It serves as a temporary storage area
for actively used data and instructions, allowing for rapid read and write operations. RAM is
crucial for the efficient functioning of applications and the operating system.
There are several types of RAM:
1. DRAM (Dynamic RAM): Requires periodic refreshing to maintain the stored
information. It is widely used in computer systems for its capacity and cost-effectiveness.
2. SRAM (Static RAM): Does not require refreshing, making it faster than DRAM.
SRAM is often used in cache memory due to its high-speed characteristics.
3. DDR (Double Data Rate): A type of synchronous DRAM that transfers data on both
the rising and falling edges of the clock signal, providing higher bandwidth compared to its
predecessor.
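Because DDR memory transfers data on both clock edges, its peak transfer rate is twice the clock frequency times the bus width. The sketch below illustrates this arithmetic with hypothetical numbers (the 200 MHz clock and 64-bit bus are assumptions for the example, not figures from the text):

```python
# Illustrative sketch: peak transfer rate of double-data-rate memory.
# DDR moves data on both the rising and falling clock edges, so the
# number of transfers per second is twice the clock frequency.

def peak_bandwidth_bytes(clock_hz, bus_width_bits):
    transfers_per_sec = 2 * clock_hz              # double data rate
    return transfers_per_sec * bus_width_bits // 8  # bits -> bytes

# Hypothetical example: a 200 MHz clock on a 64-bit bus
# -> 400 million transfers/s * 8 bytes = 3.2 GB/s peak.
print(peak_bandwidth_bytes(200_000_000, 64))  # 3200000000
```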
RAM is volatile, meaning that its contents are erased when power is turned off. This
characteristic makes it suitable for temporary storage but requires persistent storage solutions
like hard drives or SSDs for data retention between power cycles.
Read-Only Memory (ROM):
Read-Only Memory (ROM) is a non-volatile type of memory used for permanent storage of
data and instructions essential for the basic functionality of a computer. Unlike RAM, the
data in ROM is typically read-only and not easily modified during normal operation.
Types of ROM include:
1. Mask ROM: Contents are permanently written during manufacturing and cannot be
changed. It is cost-effective for large-scale production of devices with fixed software.
2. PROM (Programmable ROM): Can be written once by the user after manufacturing,
after which its contents are fixed.
3. EPROM (Erasable Programmable ROM): Can be erased by exposure to ultraviolet
light and then reprogrammed.
4. EEPROM (Electrically Erasable Programmable ROM): Can be erased and rewritten
electrically, without removing the chip from the circuit.
Flash memory: Flash memory is a type of EEPROM that enables multiple memory cells
to be erased or written in a single operation. It is widely used in USB drives, memory cards,
and solid-state drives (SSDs) due to its speed, durability, and non-volatile nature.
In summary, RAM provides temporary and fast storage for actively used data, while ROM
serves for the permanent storage of critical instructions and data essential for a computer's
basic functions. Flash memory, a type of EEPROM, extends the non-volatile storage
capabilities and finds extensive use in various portable and storage devices. Each type plays a
vital role in the overall memory hierarchy, contributing to the efficient operation of computer
systems.
SET - II
Question 4.
Answers:
(a) Software testing - Software testing is a systematic process of evaluating and verifying
that a computer program or application functions as intended. It involves executing the
software to identify any discrepancies between expected and actual results, ensuring the
product meets specified requirements. The primary goals of testing are to detect defects,
ensure the reliability and quality of the software, and enhance overall user satisfaction.
Testing encompasses various techniques, including functional testing, performance testing,
security testing, and more. It is an integral part of the software development life cycle,
helping developers identify and rectify issues early in the process. Effective testing
contributes to the delivery of robust, reliable, and high-quality software, reducing the
likelihood of defects in the production environment and enhancing the overall success of
software projects.
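The core idea of testing, comparing expected results with actual results, can be sketched in a few lines of Python. The function under test here is a hypothetical example introduced for illustration:

```python
# Minimal sketch of functional testing: compare actual output with the
# expected result for a hypothetical function under test.

def add(a, b):
    """The 'software' being tested (an assumed example function)."""
    return a + b

def test_add():
    assert add(2, 3) == 5    # expected vs. actual result
    assert add(-1, 1) == 0
    assert add(0, 0) == 0

test_add()
print("all tests passed")
```

A failed assertion would flag a discrepancy between the expected and actual result, which is exactly the kind of defect testing aims to detect early.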
(b) Software testing strategy – A software testing strategy is a comprehensive plan that
outlines the systematic approach, methodologies, and activities employed to ensure the
quality and reliability of a software application. It involves a series of steps and
considerations that collectively contribute to the effective verification and validation of the
software against its specified requirements.
• The first critical aspect is to clearly define the objectives and scope of the testing
effort. This involves identifying the features, modules, and integration points that will be
subjected to testing. Establishing clear objectives helps guide the testing process and
ensures alignment with the overall project goals.
• Test levels are a fundamental component of the strategy, encompassing unit testing,
integration testing, system testing, and acceptance testing. Each level targets specific aspects
of the software, ranging from individual units to the entire system, ensuring a comprehensive
evaluation of functionality and performance.
• Test design techniques, such as equivalence partitioning, boundary value analysis,
decision table testing, and state transition testing, contribute to the creation of effective and
efficient test cases. These techniques guide the selection of test inputs and conditions to
maximize test coverage.
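Two of the techniques named above can be illustrated concretely. The sketch below applies equivalence partitioning and boundary value analysis to a hypothetical input range of 1–100 (the range is an assumed example, not taken from the text):

```python
# Illustrative sketch of two test-design techniques for an input
# field that accepts values 1..100 (assumed example range).

low, high = 1, 100

# Equivalence partitioning: pick one representative per partition.
partitions = {
    "below range": 0,    # invalid partition
    "in range": 50,      # valid partition
    "above range": 101,  # invalid partition
}

# Boundary value analysis: test at and around each boundary.
boundary_values = [low - 1, low, low + 1, high - 1, high, high + 1]

print(sorted(partitions.values()))  # [0, 50, 101]
print(boundary_values)              # [0, 1, 2, 99, 100, 101]
```

Together the two techniques cover each class of input with few test cases while concentrating effort where defects cluster, at the boundaries.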
• The testing environment and test data management are critical considerations,
ensuring that the test environment mirrors the production setup and that relevant and diverse
test data is available. A robust test execution plan, along with defect management processes,
helps orchestrate the testing effort and addresses issues as they arise.
• Automation is integrated strategically, targeting repetitive and time-consuming test
cases to enhance efficiency. Reporting mechanisms and metrics provide insights into test
progress and results, facilitating informed decision-making and continuous improvement.
• Risk analysis and mitigation strategies are essential components, allowing teams to
identify potential challenges and proactively address them. Regular review and updating of
the testing strategy accommodate changes in project requirements and ensure its relevance
throughout the software development lifecycle.
In summary, a well-defined testing strategy is indispensable for ensuring the delivery of
high-quality software. It serves as a roadmap, guiding testing activities, mitigating risks, and
facilitating collaboration between development and testing teams for successful software
development and delivery.
Question 5.
Answers:
(a) Operating system – An operating system is a basic piece of software that controls the
computer's hardware and provides an interface to it. It serves as a bridge between users and
computer hardware, making it easier for programs to run and managing operations such as
memory management, file organization, and process scheduling. Common examples include
Windows, macOS, and Linux, each of which lets users interact with the computer, run
applications, and use peripheral devices in a stable, user-friendly environment while ensuring
effective and safe use of the machine's resources.
(b) Components of operating system - An operating system (OS) is composed of several
essential components that work together to manage and facilitate the various functions of a
computer. These components include:
1. Kernel
• The core of the operating system that provides essential services like process and
memory management.
• It interacts directly with the hardware and manages resources such as CPU, memory,
and peripheral devices.
2. File system
• Manages the organization and storage of files on storage devices.
• Provides a hierarchical structure, allowing users and applications to store, retrieve,
and organize data.
3. Device driver:
• Software modules that enable communication between the operating system and
hardware devices.
• They act as intermediaries, translating high-level OS commands into specific
commands understood by the hardware.
4. User interface:
• Allows users to communicate with the computer.
• Interfaces can be graphical user interfaces (GUI) with menus and icons or
command-line interfaces (CLI) where users type commands.
5. Process management:
• Oversees the way processes (programs in execution) are created, scheduled, and
terminated.
6. Memory management:
• Oversees the memory hierarchy of the computer, assigning and releasing memory to
applications.
• Manages virtual memory, enabling methods such as paging and swapping to use
more memory than is physically available.
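The paging mechanism mentioned above can be sketched in a few lines. This is a simplified illustration with an assumed 4 KiB page size and a hypothetical page table, not how any particular operating system implements it:

```python
# Simplified sketch of paging: a virtual address is split into a page
# number and an offset, and the page table maps pages to physical frames.

PAGE_SIZE = 4096                 # assumed 4 KiB pages

page_table = {0: 5, 1: 2, 2: 7}  # hypothetical page -> frame mapping

def translate(virtual_addr):
    page = virtual_addr // PAGE_SIZE    # which page the address falls in
    offset = virtual_addr % PAGE_SIZE   # position inside that page
    frame = page_table[page]            # look up the physical frame
    return frame * PAGE_SIZE + offset   # physical address

# Virtual address 4100 is page 1, offset 4; page 1 maps to frame 2,
# so the physical address is 2 * 4096 + 4 = 8196.
print(translate(4100))  # 8196
```

Because only the pages a process actually touches need to be resident in physical memory, the remaining pages can live on disk and be swapped in on demand, which is how the system appears to have more memory than is physically installed.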