The History of the Computer
THE COMPUTER
JODI WALLACE
TABLE OF CONTENTS
Introduction
Bibliography
INTRODUCTION
Considering the advanced state of the modern computer and of technology in general, it is only fair that one should know exactly how it all began; as the old saying goes, 'you do not know where you are going unless you know where you came from'. The main aim of this project is to shine some light on the history and development of the computer. It is designed to educate and enlighten the reader about how the idea of an electronic computing device came about and who invented these devices.
The truth, however, is that there is no single inventor; hence the statement 'great minds think alike'. In the end there is only one common ground: just as with understanding the full potential of the computer, so it is with understanding who invented it. I have covered the major topics that will assist the reader in understanding how we arrived at where we are today in terms of technological achievement, and the dates and names of the inventors will allow you, the reader, to find more detailed information about these events and creations.
COMPUTER
The word computer is usually applied to electronic machines that process information, or data, at high speed; these devices achieve their speed by controlling the movement of tiny particles of electricity called electrons. With this in mind, it is safe to conclude that a perfectly acceptable definition of the word computer is: an electronic, programmable device operating under the control of instructions stored in its own memory. A computer is a programmable machine that receives input, stores and manipulates data, and provides output in a useful format; in short, a machine for performing calculations automatically. Before electronic computers became commercially available, the term "computer", in use from the mid-17th century, literally meant "one who computes": a person performing mathematical calculations.
A computer has the ability to take instructions, often known as programs, and execute them, which distinguishes a
computer from a mechanical calculator. While both are able to make computations, a calculator responds simply to
immediate input. In fact, most modern calculators are actually computers, with a number of pre-installed programs to aid
in complex tasks. Modern electronic computers are able to add together two very large numbers in a millionth of a second.
The computer is one of the most powerful innovations in human history. With the use of computers, people are suddenly
able to perform a staggering amount of computations at miraculous speeds. Information can be compiled, organized, and
displayed in the blink of an eye. As technology continues to advance, the computer may become more ubiquitous still.
HISTORY OF THE COMPUTER
Since the invention of the computer, there has been a tremendous increase in its use. People have become so reliant on these devices, and yet many do not have the slightest idea of how it all began: who invented it, when, where, and so on. With this in mind, we would like to explore 'the history of the computer'.
Based on the computer history timeline, it is said that in 500 BC the forerunner to modern language theory was introduced by Panini; shortly after, Pingala created the binary number system. These were followed by the Antikythera Mechanism, a device used to track the movement of the stars. It is said to have been the first complex analog calculating device, used to compute astronomical positions, and its invention is attributed to Hipparchus.
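For the curious reader, here is a small worked illustration (a modern one, not Pingala's own notation) of the idea behind a binary number system: a value is written using only the digits 0 and 1, with each position standing for a power of two.

\[ 1101_2 = 1\cdot2^3 + 1\cdot2^2 + 0\cdot2^1 + 1\cdot2^0 = 8 + 4 + 0 + 1 = 13 \]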
In 60 AD, Heron of Alexandria, a Greek mathematician, invented a machine that operated on a series of instructions. Then, in 724, the first fully mechanical clock was made by a Chinese engineer, Liang Ling-Can: a water-driven contraption whose moving parts made the clock produce a ticking sound.
The series of inventions continued. Around 1492, Leonardo da Vinci made drawings revealing inventions such as flying machines (including a helicopter), one of the first programmable robots, and a mechanical clock. Decades later, in 1614, the Scottish mathematician John Napier came up with a system of movable rods used for calculations such as addition. These inventions continued over the years, each improving on the last: the logarithmic slide rule by William Oughtred, and the calculating clock of 1623 by Wilhelm Schickard, a German astronomer, which could add and subtract six-digit numbers.
In 1642, Blaise Pascal invented the Pascaline, a mechanical adding machine. Gottfried Leibniz, known as one of the founders of calculus, later improved on Pascal's design with a calculating machine of his own. The Arithmometer, the first mass-produced calculator, was invented by Charles Xavier Thomas de Colmar in 1820.
Figure 2: The Pascaline, a mechanical adding machine
Considering what counts as a computer at present, with its definition reaching as far back as these dates, one could easily be confused. The history of the modern computer, as laid out in its timeline, begins with two separate technologies: automated calculation and programmability. Based on the research done, no single device from these earlier times can be identified as the 'first' computer. Here is one of the reasons: it is believed that an inventor named J.H. Smith conceived the idea around 1782, but the idea was left untouched until the 1800s, 1822 to be exact, when an English mathematician by the name of Charles Babbage resurrected it. Babbage has been referred to as "the pioneer of the computer". It was Babbage who first realized that a really useful calculating machine would need to store inside itself the list of sums to be done, so that it could complete one sum and move on to the next without waiting for an operator to turn its handles.
Babbage designed his Analytical Engine in 1833, but the creation was never completed, because the engineers of his time lacked the resources and tools needed to build it. Later, between about 1834 and 1869, Babbage refined his ideas into the design known as the "Difference Engine No. 2".
Fig. 7: More recent models of computers
Looking at modern computers, many would say that the "Difference Engine No. 2" is in no way connected to the first electronic computer. Adding to the confusion, it has also been said that the first computer was designed by an unknown inventor: the device known as the Antikythera Machine, one of the objects found in the remains of a shipwreck in the Mediterranean Sea and dating from approximately 250 BC. It is also said that a German engineer, Konrad Zuse, invented the first computer in 1941.
After the theory behind Charles Babbage's invention there were follow-ups, such as Samuel Morse's invention, around 1835, of Morse Code, a telegraphy system used to transmit messages, and Boolean algebra, introduced in 1848 by George Boole, a system used to reduce logical expressions in mathematics. These are just a few of the nineteenth-century inventions.
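As a small illustration of the kind of reduction Boolean algebra makes possible (a modern textbook example, not one of Boole's own), the absorption law collapses a longer expression into a shorter one:

\[ A \lor (A \land B) = A \]

Whatever the value of B, the whole expression is true exactly when A is true, so the B term can be dropped.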
The Tabulating Machine was introduced by Georg Scheutz and his son Edvard in 1853; in 1869 William Stanley Jevons designed a practical logic machine; Ramon Verea invented a fast calculator with an internal multiplication table; and the Photophone, a device for transmitting speech on a beam of light, was invented by Alexander Graham Bell in 1880, to mention just a few. It is also said that Alan Turing developed the concept of a theoretical computing machine in 1937.
The main objective of the computer history timeline is to give a basic understanding of the generations of computers: past, present, and future. It describes the events relating to the history and step-by-step development of early computers, leading up to modern-day creations. Over the years, computers have gone through various changes and improvements, across distinct stages and generations. At present, there are five generations of modern computers. The first generation of computers was built with vacuum tubes for circuitry and magnetic drums for memory; these devices were not cost-effective and could only solve one problem at a time.
During the early stages of World War II, computer development moved at a faster pace because governments were funding the work, intending to use the new machines for strategic advantage. A German engineer by the name of Konrad Zuse developed a computer named the Z3, which was used to design airplanes and missiles.
Then, in 1942, John Atanasoff and Clifford Berry completed the ABC computer, the first electronic digital computer. Shortly after, in 1943, a device called Colossus was built to decode German messages, but this breakthrough was kept secret.
By 1944, Howard H. Aiken, a Harvard engineer, together with Grace Hopper, had succeeded in making a large-scale automatic calculator. This device was used to create ballistic charts for the U.S. Navy and was also called the Harvard Mark I computer. It was about half as long as a football field and contained about 500 miles of wiring. The Harvard-IBM Automatic Sequence Controlled Calculator, or Mark I for short, was an electromechanical relay computer. Another computer development spurred by the war was the Electronic Numerical Integrator and Computer (ENIAC), invented by John Presper Eckert and John W. Mauchly and completed in 1946. This invention was unlike its predecessors, the Mark I and the Colossus: it was a general-purpose computer, over 1000 times faster than the Mark I.
John von Neumann, in 1945, designed the Electronic Discrete Variable Automatic Computer (EDVAC), which had a memory designed to hold both a stored program and data, and featured conditional control transfer that allowed the computer to be stopped and resumed at any point. What stood out in von Neumann's creation was the central processing unit, which allowed all computer functions to be coordinated through a single source. In 1951, the UNIVAC I (Universal Automatic Computer) became one of the first commercially available computers to take advantage of these advances.
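To make the stored-program idea concrete, here is a minimal sketch of a toy machine in Python. The instruction set is invented for illustration and is not EDVAC's actual design; the point is only that instructions and data share one memory, and that a conditional jump gives the machine control over its own flow.

def run(memory):
    # The same list holds instructions (tuples) and data (numbers),
    # which is the heart of the stored-program design.
    acc = 0   # accumulator register
    pc = 0    # program counter
    while True:
        op, arg = memory[pc]
        pc += 1
        if op == "LOAD":       # copy a value from memory into the accumulator
            acc = memory[arg]
        elif op == "ADD":      # add a memory value to the accumulator
            acc += memory[arg]
        elif op == "STORE":    # write the accumulator back into memory
            memory[arg] = acc
        elif op == "JUMPZ":    # conditional control transfer: jump if acc == 0
            if acc == 0:
                pc = arg
        elif op == "HALT":
            return memory

# Program: add the values in cells 6 and 7, store the result in cell 8.
program = [
    ("LOAD", 6),
    ("ADD", 7),
    ("STORE", 8),
    ("HALT", 0),
    None, None,   # padding so the data lands in cells 6-8
    2, 3, 0,      # data: two addends and a result cell
]
print(run(program)[8])   # prints 5

Because the program lives in the same memory as the data, a machine like this can be given a new task simply by loading a new program, rather than by being rewired.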
The transistor was invented in 1947. It went on to replace the vacuum tube in televisions, radios, and computers, which in turn contributed to the steady downsizing of the machines themselves. By 1956, the second generation of computers was getting smaller, faster, and even more reliable. Second-generation computers also moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words.
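To see what "specifying instructions in words" buys the programmer, here is a minimal sketch of an assembler in Python, reusing the invented mnemonics from the earlier sketch (again, not any real machine's format): each symbolic name is translated into a numeric opcode, and each instruction becomes one 8-bit machine word.

OPCODES = {"LOAD": 0b0001, "ADD": 0b0010, "STORE": 0b0011, "HALT": 0b1111}

def assemble(source):
    # Translate lines like "ADD 7" into 8-bit words:
    # a 4-bit opcode followed by a 4-bit operand.
    words = []
    for line in source.strip().splitlines():
        parts = line.split()
        mnemonic = parts[0]
        operand = int(parts[1]) if len(parts) > 1 else 0
        words.append((OPCODES[mnemonic] << 4) | operand)
    return words

program = """
LOAD 6
ADD 7
STORE 8
HALT
"""
print([f"{w:08b}" for w in assemble(program)])
# prints ['00010110', '00100111', '00111000', '11110000']

The programmer writes the words on the left; the machine runs the bit patterns on the right. First-generation programmers had to write those bit patterns directly.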
One notable second-generation effort came from the Stanford Research Institute, Bank of America, and General Electric: the first banking-industry computer (known as ERMA), which also used MICR (Magnetic Ink Character Recognition) for reading checks.
With the help of technology, punch cards and printouts gave way to keyboards and monitors, which made computers much easier to use. The problem of overheating, however, was not eliminated until the quartz rock finally solved it: in 1958, Jack Kilby and Robert Noyce developed the integrated circuit (IC), which combined three electronic components onto one small disc made from quartz. It was during this time that operating systems came into use, allowing different programs to run at once.
The microprocessor ushered in the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What filled an entire room in the first generation could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, took the IC a step further by locating all the components of a computer on a single chip. In the years that followed the 1970s came things like the IBM PC, a home computer released in 1981 running MS-DOS, and then, in 1983, Apple's Lisa, the first home computer with a graphical user interface (GUI).
There is not very much to say about the fifth generation, as it is still in its developing stage. The most common example of a fifth-generation computer is the fictional HAL 9000 from Arthur C. Clarke's novel 2001: A Space Odyssey. There are, though, some applications, such as voice recognition, that are already in use today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. The main aim of this generation is to keep these technologies advancing on a consistent path.
BIBLIOGRAPHY
Raúl Rojas and Ulf Hashagen. "Timeline of Computer History". http://books.google.com/books?hl. Accessed October 02, 2010.