

Dennis Osadebay University

Faculty of Computing

COS111: Introduction to Computing Sciences


3 Credits

LECTURE NOTE WEEK 2

INTRODUCTION

The computer is fast becoming the universal machine of the twenty-first century. Early computers
were large and too expensive to be owned by individuals. Thus, they were confined to laboratories and a few research institutes. They could only be programmed by computer engineers.
The basic applications were confined to undertaking complex calculations in science and
engineering. Today, the computer is no longer confined to the laboratory. Computers, and indeed,
computing have become embedded in almost every item we use. Computing is fast becoming
ubiquitous. Its applications span engineering, communication, space science, aviation, financial institutions, the social sciences, the humanities, the military, transportation, manufacturing, and the extractive industries, to mention but a few.

Definitions of a Computer

A computer is an electronic device that takes raw data as input from the user, processes these data under the control of a set of instructions (called a program), gives the result (output), and saves the output for future use. It can perform both numerical and non-numerical (arithmetic and logical) operations.

Simply put, a computer is an electronic device, operating under the control of instructions stored in its own memory, that can accept data (input), process the data according to specified rules, produce information (output), and store the information for future use.

Computer Components

Generally speaking, any kind of computer consists of HARDWARE and SOFTWARE. These two components are discussed below:

Hardware

Computer hardware is the collection of physical elements that constitutes a computer system. It refers to the physical parts or components of a computer, such as the monitor, mouse, keyboard, computer data storage, hard disk drive (HDD), and system unit (graphics cards, sound cards, memory, motherboard, and chips), all of which are physical objects that can be touched, seen, and felt.


Input Devices

An input device is any peripheral (piece of computer hardware equipment) that provides data and control signals to an information processing system, such as a computer or other information appliance.

Input devices translate data from a form that humans understand into one that the computer can work with; the most common are the keyboard and mouse. Below are some examples of input devices:

Note: The most commonly used keyboard is the QWERTY keyboard. A standard keyboard generally has 104 keys.

Output devices

An output device is any piece of computer hardware equipment used to communicate the results of data processing carried out by an information processing system (such as a computer) by converting the electronically generated information into human-readable form.


Examples of Output Devices:

Note: The basic types of monitors are:

a. Cathode Ray Tube (CRT).
b. Liquid Crystal Display (LCD).
c. Light-Emitting Diode (LED).

Printer types:

a. Laser Printer.
b. Ink Jet Printer.
c. Dot Matrix Printer.

Software

Software is a generic term for organized collections of computer data and instructions, often broken
into two major categories: system software that provides the basic non-task-specific functions of the
computer, and application software which is used by users to accomplish specific tasks.

Software Types

A. System software is responsible for controlling, integrating, and managing the individual
hardware components of a computer system so that other software and the users of the system
see it as a functional unit without having to be concerned with the low-level details such as
transferring data from memory to disk, or rendering text onto a display. Generally, system
software consists of an operating system and some fundamental utilities such as disk formatters, file managers, display managers, text editors, user authentication (login) and management tools,
and networking and device control software.
B. Application software is used to accomplish specific tasks other than just running the computer
system. Application software may consist of a single program, such as an image viewer; a small
collection of programs (often called a software package) that work closely together to
accomplish a task, such as a spreadsheet or text processing system; a larger collection (often
called a software suite) of related but independent programs and packages that have a common
user interface or shared data format, such as Microsoft Office, which consists of a closely integrated word processor, spreadsheet, database, etc.; or a software system, such as a database
management system, which is a collection of fundamental programs that may provide some
service to a variety of other independent applications.
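
As a rough illustration (not part of the original lecture note, and using a hypothetical file name and data), the short Python sketch below shows this division of labour: the application code only states what should be stored, while the operating system, as system software, handles low-level details such as transferring the data from memory to disk.

# Illustrative sketch: application code delegating low-level work to system software.
# The program only describes WHAT to record; the operating system handles HOW the
# bytes reach the disk (file system, device drivers, buffering).

grades = {"COS111": 78, "MTH101": 65}            # hypothetical application data

with open("grades.txt", "w") as report:          # a request handed to the operating system
    for course, score in grades.items():
        report.write(f"{course}: {score}\n")     # the OS transfers the data to disk

print("Report written by the application; storage was handled by the system software.")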

Comparing Application Software and System Software

Functions of a Computer

In broad terms, any digital computer carries out four functions. These functions are:

i. Takes data as input
ii. Stores the data/instructions in its memory and uses them when required
iii. Processes the data and converts it into useful information
iv. Generates the output

The three major steps in the above functions of a computer are shown in the diagram below:
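
As a concrete, simplified illustration of this input-process-output-store cycle (not part of the original note, and using a made-up example of averaging a list of numbers), the following Python sketch walks through the four functions in order:

# A minimal sketch of the four functions of a computer:
# take input, store it, process it, and generate (and save) the output.

def main():
    # i. Takes data as input (raw data supplied by the user).
    raw = input("Enter numbers separated by spaces: ")

    # ii. Stores the data in its memory (here, a Python list).
    numbers = [float(x) for x in raw.split()]

    # iii. Processes the data and converts it into useful information.
    average = sum(numbers) / len(numbers) if numbers else 0.0

    # iv. Generates the output and saves it for future use.
    print("Average:", average)
    with open("result.txt", "w") as f:
        f.write(f"Average: {average}\n")   # stored for later retrieval

if __name__ == "__main__":
    main()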


Brief History of a Computer

A complete history of computing would include a multitude of diverse devices such as the ancient
Chinese abacus, the Jacquard loom (1805), and Charles Babbage’s “analytical engine” (1834). It
would also include a discussion of mechanical, analog, and digital computing architectures. As late
as the 1960s, mechanical devices, such as the Marchant calculator, still found widespread
application in science and engineering. During the early days of electronic computing devices, there
was much discussion about the relative merits of analog vs. digital computers. As late as the 1960s,
analog computers were routinely used to solve systems of finite difference equations arising in oil
reservoir modeling. In the end, digital computing devices proved to have the power, economics, and
scalability necessary to deal with large-scale computations. Digital computers now dominate the
computing world in all areas ranging from the hand calculator to the supercomputer and are
pervasive throughout society. Therefore, this brief sketch of the development of scientific
computing is limited to the area of digital, electronic computers.

The evolution of digital computing is often divided into generations. Each generation is
characterized by dramatic improvements over the previous generation in the technology used to
build computers, the internal organization of computer systems, and programming languages.
Although not usually associated with computer generations, there has been a steady improvement in
algorithms, including algorithms used in computational science. The following history has been
organized using these widely recognized generations as mileposts.

The First Generation (1940 – 1956)


• The first computers used vacuum tubes for circuitry and magnetic drums for memory, and
were often enormous, taking up entire rooms.

• They were very expensive to operate and, in addition to using a great deal of electricity, generated a lot of heat, which was often the cause of malfunctions.

• First generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could only solve one problem at a time.

• Input was based on punched cards and paper tape, and output was displayed on printouts

Vacuum tube

The Second Generation (1956 – 1963)

• Transistors replaced vacuum tubes and ushered in the second generation of computers.

• One transistor replaced the equivalent of 40 vacuum tubes

• Allowing computers to become smaller, faster, cheaper, more energy-efficient and more
reliable

• Still generated a great deal of heat that could damage the computer.

• Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words.

• Second-generation computers still relied on punched cards for input and printouts for output.

• These were also the first computers that stored their instructions in their memory, which
moved from a magnetic drum to magnetic core technology


Transistor

The Third Generation (1964 – 1971)

• The development of the integrated circuit was the hallmark of the third generation of
computers.

• Transistors were miniaturized and placed on silicon chips, called semiconductors, which
drastically increased the speed and efficiency of computers.

• Much smaller and cheaper compared to the second generation computers.

• It could carry out instructions in billionths of a second.

• Users interacted with third generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time, with a central program that monitored the memory.

• Computers for the first time became accessible to a mass audience because they were smaller
and cheaper than their predecessors


Integrated Circuit

The Fourth Generation (1971 – today)

• The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip.

• As these small computers became more powerful, they could be linked together to form
networks, which eventually led to the development of the Internet.

• Fourth generation computers also saw the development of GUIs, the mouse and handheld
devices.

Microprocessor

The Fifth Generation (Today to the future)

• Based on Artificial Intelligence (AI).

• Still in development.

• The use of parallel processing and superconductors is helping to make artificial intelligence a
reality


The goal is to develop devices that respond to natural language input and are capable of learning and
self-organization.

Classification of Computer
Computers can be classified in various ways depending on different factors like size, functionality,
processing power, and purpose. Here’s an overview of the primary classifications:

Classification of Computer based on Size and Power

1. Super-computers: Supercomputers are the most high-performing systems. A supercomputer is a computer with a high level of performance compared to a general-purpose computer. The actual performance of a supercomputer is measured in FLOPS (floating-point operations per second) rather than MIPS (million instructions per second). All of the
world’s fastest 500 supercomputers run Linux-based operating systems. Additional research
is being conducted in China, the US, the EU, Taiwan, and Japan to build even faster, more
high-performing, and more technologically superior supercomputers. Supercomputers play an
important role in the field of computation and are used for intensive computation tasks in
various fields, including quantum mechanics, weather forecasting, climate research, oil and
gas exploration, molecular modeling, and physical simulations. Throughout history, supercomputers have also been essential in the field of cryptanalysis. Examples include PARAM, Jaguar, and Roadrunner.

2. Mainframe computers: Commonly called big iron, these are usually used by big organizations for bulk data processing such as statistics, census data processing, and transaction processing, and are widely used as servers, since these systems have a higher processing capability than the other classes of computers. Most mainframe architectures were established in the 1960s; research and development have continued over the years, and the mainframes of today are far better than the earlier ones in size, capacity, and efficiency. For example: IBM z Series, System z9, and System z10 servers.

3. Mini computers: These computers came onto the market in the mid-1960s and were sold at a much cheaper price than the mainframes. They were designed for control, instrumentation, human interaction, and communication switching, as distinct from calculation and record keeping, and later became very popular for personal use as they evolved. The term arose in the 1960s to describe the smaller computers that became possible with the use of transistors and core memory technologies, minimal instruction sets, and less expensive peripherals such as the ubiquitous Teletype Model 33 ASR. They usually took up one or a few rack cabinets, compared with the large mainframes that could fill up a room.

4. Micro-computers: A microcomputer is a small, relatively inexpensive computer with a microprocessor as its CPU. It includes a microprocessor, memory, and minimal I/O circuitry mounted on a single printed circuit board. The earlier classes of computers, mainframes and minicomputers, were comparatively much larger, harder to maintain, and more expensive. Microcomputers formed the foundation for the present-day personal computers and smart gadgets that we use in our day-to-day lives, for example, tablets and smartwatches.

Classification of Computer based on functionality

1. Servers: Servers are dedicated computers that are set up to offer services to clients. They are named according to the type of service they offer.

2. Workstation: These are computers designed primarily to be used by a single user at a time, although they run multi-user operating systems. They are the ones we use for our day-to-day personal/commercial work.

3. Information Appliances: They are portable devices that are designed to perform a limited
set of tasks like basic calculations, playing multimedia, browsing the internet, etc. They are
generally referred to as mobile devices. They have very limited memory and flexibility and
generally run on an “as-is” basis.

4. Embedded computers: They are computing devices that are used in other machines to serve a limited set of requirements. They follow instructions stored in non-volatile memory and do not require a reboot or reset. The processing units used in such devices are designed to meet those basic requirements only and are different from the ones used in personal computers, better known as workstations.

Classification based on data handling

1. Analog: An analog computer is a form of computer that uses continuously variable aspects of physical phenomena, such as electrical, mechanical, or hydraulic quantities, to model the problem being solved. Any quantity that varies continuously with time can be treated as analog, just as an analog clock measures time by the distance travelled by its hands around the circular dial.

2. Digital: A digital computer performs calculations and logical operations with quantities represented as digits, usually in the binary number system of “0” and “1”; it is capable of solving problems by processing information expressed in discrete form. By manipulating combinations of binary digits, it can perform mathematical calculations, organize and analyze data, control industrial and other processes, and simulate dynamic systems such as global weather patterns (see the short sketch after this list).

3. Hybrid: A hybrid computer processes both analog and digital data; it is a digital computer that accepts analog signals, converts them to digital form, and processes them digitally.
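
As a small, illustrative sketch (not part of the original note), the following Python snippet shows what it means for a digital computer to represent quantities as binary digits and to perform arithmetic and logical operations on them:

# Two ordinary integers and their underlying binary representations.
a, b = 13, 6
print(bin(a), bin(b))        # 0b1101 0b110

# Arithmetic is carried out on the binary representations.
print(a + b, bin(a + b))     # 19 0b10011

# Logical (bitwise) operations act on the individual binary digits.
print(a & b, bin(a & b))     # AND: 4 0b100
print(a | b, bin(a | b))     # OR: 15 0b1111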

Textbooks

1. Brands, Gilbert (2013). Introduction to Computer Science: A Textbook for Beginners in Informatics. CreateSpace Independent Publishing Platform. ISBN-10: 1492827843; ISBN-13: 9781492827849. https://www.abebooks.com/servlet/SearchResults?isbn=9781492827849

2. Donham, Perry (2018). Introduction to Computer Science. Cognella Academic Publishing. ISBN-10: 1516571738; ISBN-13: 9781516571734. https://www.goodreads.com/book/show/42834356-introduction-to-computer-science
