History of Computers
INTRODUCTION
The computer is one of the marvelous inventions of recent times. Necessity is the mother of invention: man has always wanted to count and compute. Modern man can count or calculate small numbers with ease, but his requirements for data processing are so enormous that he needs a machine that can instantly process data captured at the source and provide real-time responses. Early devices such as the abacus and the slide rule cannot keep pace with these needs. The progress of man from pebbles to the personal computer is a great march in the technological history of mankind.
The history of computers dates back to the age when man started using tools for computations. The whole history
of computing can be divided into four periods based on the technology used in computing devices.
Pictographs were turned on their sides (2800 B.C.) and then developed into actual cuneiform symbols
(2500 B.C.)
Around 2000 B.C., Phoenicians created symbols that expressed single syllables and consonants (the first
true alphabet).
The Greeks later adopted the Phoenician alphabet and added vowels; the Romans gave the letters Latin
names to create the alphabet we use today.
1623 – Wilhelm Schickard, a professor at the University of Tübingen, Germany, invents the first mechanical
calculator; it works with six-digit numbers and carries digits across columns.
1642 – Blaise Pascal, a famed French philosopher and mathematician, invented the pascaline, the first
mechanical adding machine. The pascaline is a gear-driven device in which a wheel turns one tenth of the way
around as the wheel to its right completes a full 360 degrees; as the wheels turn, the number written at each digit
position also moves one digit higher (a small simulation of this carry principle follows below).
- like the abacus, the pascaline could only add and subtract.
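The carry principle Pascal mechanized can be sketched in a few lines of Python (a minimal, illustrative simulation; the function and variable names are invented for this sketch, not taken from any historical source):

```python
# Minimal sketch of the pascaline's ripple-carry principle: each "wheel"
# holds one decimal digit, and when a wheel passes 9 it resets to 0 and
# advances the wheel to its left by one.

def add_on_wheels(wheels, digit, position):
    """Add a single digit at the given wheel position (0 = ones place)."""
    wheels[position] += digit
    while position < len(wheels) and wheels[position] > 9:
        wheels[position] -= 10      # this wheel completes a full turn...
        position += 1
        if position < len(wheels):
            wheels[position] += 1   # ...and nudges the next wheel one step

six_digit_machine = [0] * 6         # least significant digit first
add_on_wheels(six_digit_machine, 7, 0)
add_on_wheels(six_digit_machine, 5, 0)   # 7 + 5 = 12, carry into the tens wheel
print(six_digit_machine)            # [2, 1, 0, 0, 0, 0], i.e. 12
```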
1671 – Gottfried Leibniz, a German mathematician, invented a machine called the stepped reckoner, which could
multiply 5-digit and 12-digit numbers to yield results of up to 16 digits. He managed to build a four-function
(addition, subtraction, multiplication and division) calculator.
1801 – the weaving loom invented by Joseph Marie Jacquard ushered in the use of punched-card
technology. The loom was controlled by a series of punched cards; the sequence in which the cards were
arranged dictated the weaving design the loom produced.
1820 – Charles Xavier Thomas de Colmar invented the arithmometer in France. This device performed the
same type of computations as Leibniz’s Stepped Reckoner, but was more reliable.
1821 – Charles Babbage, a Cambridge professor of the mid-1800s, laid down the basic concepts used in
today’s general-purpose computers.
- he designed the difference engine, a steam-powered adding machine capable of computing
mathematical tables, although the engine was never finished.
- while working on the difference engine he also designed the analytical engine, a mechanical, steam-driven
device that would accept punched-card input, automatically perform any arithmetic operation
in any sequence under the direction of a mechanically stored program of instructions, and produce either
punched-card or printed output.
1842 – Lady Augusta Ada Lovelace, a friend of Charles Babbage, forwarded the idea of using
punched cards to direct Babbage’s engine to perform repeated instructions.
- she recognized and described programming concepts such as the use of subroutines, loops and
conditional jumps.
- she is regarded as the first computer programmer.
- the US Department of Defense developed a programming language and named it Ada in her honor.
a. Voltaic Battery
Alessandro Volta – invented the voltaic pile, the first electric battery, in 1800. The voltaic pile consisted of a
stack of alternating discs of zinc and copper or silver separated by felt soaked in brine.
b. Wireless Telegraphy
Guglielmo Marconi (1894) – discovered that electrical waves travel through space and can produce an effect
far from the point at which they originated.
George Boole (1854) – develops binary algebra. This became known as Boolean algebra and became important
in the 20th century, when binary computers were developed.
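Boole’s link to binary computing can be made concrete with a one-bit half adder, the basic building block of binary arithmetic circuits, which amounts to just two Boolean expressions (a minimal sketch; the function name is ours, not Boole’s notation):

```python
# Boolean algebra in action: a one-bit half adder.
# sum = a XOR b, carry = a AND b -- the core of binary addition circuits.

def half_adder(a: int, b: int):
    return a ^ b, a & b    # (sum bit, carry bit)

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> carry {c}, sum {s}")
```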
2. Electromechanical Computing
Pehr and Edvard Scheutz (1853) – developed a tabulating machine, capable of processing fifteen-digit
numbers, printing out results, and rounding off to eight digits.
Dorr Felt (1885) – invented the comptometer, a key-driven adding and subtracting calculator.
Herman Hollerith (1890) – the first person to successfully use punched cards – specifically for census taking.
- Hollerith was a statistician commissioned by the US Census Bureau to assist in completing the
census. It had taken the Bureau over 7 years to manually calculate and analyze its 1880 data, but with his
efforts and his machine the census was completed in only 3 years, saving the Bureau
over 5 million dollars in the process.
1911 – the Tabulating Machine Company merged with other companies to form the Computing-Tabulating-
Recording Company.
Otto Steiger (1893) – a Swiss engineer, invented the Millionaire, the first efficient four-function calculator.
Lee De Forest – developed the vacuum tube in 1906. This was important because it provided an electrically
controlled switch, a necessity for digital electronic computers.
1924 – Thomas J. Watson, the general manager of the Computing-Tabulating-Recording Company, changed
the company’s name to International Business Machines Corporation (IBM).
From the 1920s through the mid-1950s, punched-card technology was developed with greater sophistication,
leading to the EAM (Electromechanical Accounting Machine).
Konrad Zuse (1941) – built the first programmable computer called the Z3. The Z3 was presented on May 12,
1941 to an audience of scientists in Berlin.
Howard Aiken (1942) – a Ph.D. student at Harvard University, built the Mark I, often described as the first
large-scale automatic digital computer. It stood 8 feet tall, stretched 51 feet long and 2 feet thick, weighed
5 tons, used about 750,000 parts and 500 miles of wire, and took 3 to 5 seconds per calculation.
Dr. John Atanasoff (1942) – a professor at Iowa State University, and one of his graduate students, Clifford
E. Berry, created the Atanasoff-Berry Computer (ABC).
Atanasoff-Berry Computer (ABC) – the world’s first automatic electronic digital computer.
The Electronic Numerical Integrator And Computer (ENIAC) – the first fully functional electronic digital
computer. Instead of using electromechanical relay switches, ENIAC contained over 18,000 electronic vacuum
tubes. ENIAC was built by Dr. John Mauchly and J. Presper Eckert Jr. for the US Army to compute trajectory
tables.
The ENIAC weighed 30 tons, occupied 1,500 square feet of space and required a tremendous amount of
electricity to power its thousands of vacuum tubes.
The ENIAC was roughly 1,000 times faster than the Mark I and could perform approximately 5,000 arithmetic
operations a second. It could compute in 90 seconds a problem that would take a skilled technician with a
desk calculator 20 to 24 hours.
The only major concept that the ENIAC lacked was a stored program controlling the sequencing
of instructions.
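The quoted speed figures can be sanity-checked with simple arithmetic; the short computation below only shows that the source’s numbers are mutually consistent, not an exact historical measurement:

```python
# Rough consistency check of the quoted ENIAC comparison.
technician_seconds = 20 * 60 * 60          # 20 hours, the low end of the estimate
eniac_seconds = 90
print(technician_seconds / eniac_seconds)  # 800.0 -- the same order of magnitude
                                           # as the "1,000 times faster" claim
```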
1950 – John von Neumann finished building the first computer to use the binary numbering system of
ones and zeros developed by George Boole. This allowed the computer’s instructions to be
encoded in a form that could be stored in the computer’s memory.
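The stored-program idea, that instructions are just binary numbers and can live in the same memory as data, can be sketched briefly. The tiny 8-bit “instruction format” below (a 4-bit opcode plus a 4-bit operand) is invented purely for illustration and does not correspond to any real machine:

```python
# Toy stored-program sketch: an instruction is just a number, so it can be
# kept in memory exactly like data. The format here is purely illustrative.

OPCODES = {"LOAD": 0b0001, "ADD": 0b0010, "HALT": 0b1111}

def encode(op: str, operand: int) -> int:
    """Pack a 4-bit opcode and a 4-bit operand into one 8-bit word."""
    return (OPCODES[op] << 4) | (operand & 0b1111)

memory = [encode("LOAD", 5), encode("ADD", 3), encode("HALT", 0)]
for word in memory:
    print(f"{word:08b}")   # each instruction stored as plain ones and zeros
```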
UNIVAC I – the first commercially viable electronic digital computer. It was created by the same team
that built the ENIAC. It handled alphabetic characters as well as numbers, and it was sold by the
Remington Rand Company to the US Census Bureau.
1951–1953 – magnetic core memory was developed. This memory consisted of tiny ferrite “donuts”
arranged on a lattice of wires.
1953 – the IBM 701, IBM’s first electronic business computer, marked the company’s initial foray
into the market.
- IBM 650 – IBM’s first commercially successful computer.
1961 – Grace Hopper, the woman who found the first computer bug, finishes developing COBOL (Common
Business Oriented Language).
1964 – IBM unveils the System/360, the first family of computers. The IBM 360 is introduced in April and
quickly becomes the standard institutional mainframe computer.
1965 – Digital Equipment Corporation (DEC), founded by Ken Olsen, releases the first commercially
successful minicomputer, the PDP-8.
1965 – Thomas Kurtz and John Kemeny of Dartmouth College developed BASIC (Beginner’s All-purpose
Symbolic Instruction Code) as a computer language to help teach people how to program.
1958 – Jack Kilby joined Texas Instruments Inc. in Dallas, where he was responsible for integrated-circuit
development and applications. In 1959 he invented the monolithic integrated circuit, which is still widely
used in electronic systems.
1968 – Intel was founded by Robert Noyce (one of the inventors of the integrated circuit) and others.
1972 – The C programming language is developed at AT&T Bell Labs by Dennis Ritchie. The UNIX
operating system, also written at Bell Labs, is rewritten in C, which later makes UNIX one of the most
portable operating systems.
A microprocessor is an integrated circuit built on a tiny piece of silicon. One of the most common tasks
microprocessors perform is to serve as the “brain” inside personal computers, but they deliver “intelligence”
to countless other devices as well.
The use of LSI (Large-Scale Integration) semiconductor circuits for both the logic and the memory circuitry of
the computer is a major technological development of the fourth generation.
1971 – generally regarded as the year that ushered in the fourth generation of computers. The invention of
the microprocessor by M.E. Hoff of Intel prepared the way for the development of small, desk-sized
computers and portable terminals.
1972 – several models of the IBM System/370 computer series became the first electronic computers with their
main memories composed entirely of LSI semiconductor circuits.
1975 – the computer named the Altair 8800 introduced personal computing to individuals and small companies.
April 1976 – Steven Jobs and Steve Wozniak founded Apple Computer.
Apple II personal computer – the first personal computer to come in a plastic case and include color graphics.
1979 – WordStar, the first widely successful microcomputer word processor, is released; it became the
dominant word-processing program of the early 1980s.
1981 – IBM enters the personal computer market with the PC. It came with DOS, an operating system modeled
on CP/M.
1984 – IBM develops a one-million-bit RAM chip. The Apple Macintosh also debuts in 1984.
1985 – the Amiga introduced the world to multimedia. The Amiga was the first multimedia computer, but in
those days it was viewed largely as a games machine, because few people grasped the importance of advanced
graphics and sound combined with a multitasking operating system and a graphical user interface.
1990 – Windows 3.0 was launched. It was still 16-bit, but the user interface was completely revamped to mimic
the look and feel of IBM’s OS/2, with its 3D sculpted buttons.
1993 – Intel introduced the Pentium processor, a microprocessor with 3.1 million transistors.
1994 – Apple announced the Power Mac family. The PowerPC processor allowed Macs to compete with,
and in many cases beat, Intel’s newer processors.
1995 – after at least eighteen months of pre-release hype, Microsoft finally released Windows 95 on August
24, 1995.
Classes of Computers
Classes by size
Microcomputers (personal computers)
Microcomputers are the most common kind of computers in use as of 2014. The term “microcomputer” was
introduced with the advent of systems based on single chip microprocessors. The best-known early system was
the Altair 8800, introduced in 1975. The term "microcomputer" has practically become an anachronism.
These computers include:
Desktop computers – A case and a display, put under and on a desk.
In-car computers (carputers) – Built into a car, for entertainment, navigation, etc.
Game consoles – Fixed computers specialized for entertainment purposes (video games).
Smaller microcomputers are also called mobile devices:
Laptops and notebook computers – Portable and all in one case.
Tablet computer – Like laptops, but with a touch-screen, entirely replacing the physical keyboard.
Smartphones, smartbooks, PDAs and palmtop computers – Small handheld computers with limited hardware.
Programmable calculators – Like small handhelds, but specialized for mathematical work.
Handheld game consoles – The same as game consoles, but small and portable.
Minicomputers (midrange computers)
Minicomputers (colloquially, minis) are a class of multi-user computers that lie in the middle range of the
computing spectrum, in between the smallest mainframe computers and the largest single-user systems
(microcomputers or personal computers). The term superminicomputer or supermini was used to distinguish more
powerful minicomputers that approached mainframes in capability. Superminis were usually 32-bit at a time when
most minicomputers were 16-bit. The contemporary term for minicomputer is midrange computer, such as the
higher-end SPARC, POWER and Itanium-based systems from Oracle Corporation, IBM and Hewlett-Packard.
Mainframe computers
The term mainframe computer was created to distinguish the traditional, large, institutional computer intended to
service multiple users from the smaller, single user machines. These computers are capable of handling and
processing very large amounts of data quickly. Mainframe computers are used in large institutions such as
government, banks and large corporations. They are measured in MIPS (millions of instructions per second)
and can respond to hundreds of users at a time.
Supercomputers
A Supercomputer is focused on performing tasks involving intense numerical calculations such as weather
forecasting, fluid dynamics, nuclear simulations, theoretical astrophysics, and complex scientific computations. A
supercomputer is a computer that is at the front-line of current processing capacity, particularly speed of
calculation. The term supercomputer itself is rather fluid, and the speed of today's supercomputers tends to become
typical of tomorrow's ordinary computer. Supercomputer processing speeds are measured in floating point
operations per second, or FLOPS. An example of a floating point operation is the calculation of mathematical
equations in real numbers. In terms of computational capability, memory size and speed, I/O technology, and
topological issues such as bandwidth and latency, supercomputers are the most powerful. They are also very
expensive and are not cost-effective just for performing batch or transaction processing; such processing is
handled by less powerful computers such as server computers or mainframes.
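A FLOPS figure can be estimated on any machine by timing a known quantity of floating-point work. The sketch below (assuming NumPy is available) times a matrix multiply, which performs about 2·n³ floating-point operations; it is a rough illustration of the unit, not a serious benchmark:

```python
# Rough FLOPS estimate: time a matrix multiply with a known operation count
# (about 2 * n**3 floating-point operations for an n x n product).
import time
import numpy as np

n = 1024
a = np.random.rand(n, n)
b = np.random.rand(n, n)

start = time.perf_counter()
a @ b
elapsed = time.perf_counter() - start

flops = 2 * n**3 / elapsed
print(f"~{flops / 1e9:.1f} GFLOPS")  # ordinary PCs reach gigaFLOPS;
                                     # supercomputers are measured in petaFLOPS
```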
Classes by function