History_of_computer_and_its_generations
By:
ISHAQ ZAKARI
BSc. Computer Science
Umaru Musa Yar'adua University
Question:
The world is becoming a global village through the use of computers; thus there is a need for everyone to be computer literate.
What is a Computer?
iii. Process: to calculate, compare, and arrange.
This progressed up to the period 1760 – 1830, the period of the industrial revolution in Great Britain, where the use of machines for production altered British society and the Western world. During this period, Joseph Jacquard invented the weaving loom (a machine used in the textile industry).
The computer was born not for entertainment or email but out of a need to solve a serious number-crunching crisis. By 1880, the United States (U.S.) population had grown so large that it took more than seven years to tabulate the U.S. Census results. The government sought a faster way to get the job done, giving rise to punch-card-based computers that took up entire rooms. Today, we carry more computing power on our smartphones than was available in these early models. The following brief history of computing is a timeline of how computers evolved from their humble beginnings to the machines of today that surf the Internet, play games and stream multimedia in addition to crunching numbers. The following are key historical events in the development of the computer.
1623: Wilhelm Schickard designed and constructed the first working mechanical
calculator.
1801: In France, Joseph Marie Jacquard invents a loom that uses punched
wooden cards to automatically weave fabric designs. Early computers would
use similar punch cards.
1885: Herman Hollerith invented the tabulator, which used punched cards to
process statistical information; eventually his company became part of IBM.
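The essence of Hollerith-style tabulation is counting how often each category appears across many punched records. The sketch below illustrates the idea in Python; the record fields and values are hypothetical, purely for illustration, not an actual census card layout.

```python
# Minimal sketch of tabulation: tally categories across punched records.
from collections import Counter

# Each "card" stands for one census record; each field for one punched
# column group. The field names here are made up for the example.
cards = [
    {"state": "NY", "occupation": "farmer"},
    {"state": "PA", "occupation": "clerk"},
    {"state": "NY", "occupation": "clerk"},
]

# The tabulator advances a counter each time it senses a punched hole;
# Counter does the same job over the whole deck at once.
by_state = Counter(card["state"] for card in cards)
print(by_state["NY"])  # 2
```

The point of the machine was exactly this mechanical tallying, done electrically over millions of cards instead of by hand.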
1890: Herman Hollerith designs a punch card system to tabulate the 1890
census, accomplishing the task in just three years and saving the government $5
million. He establishes a company that would ultimately become IBM.
1936: Alan Turing presents the notion of a universal machine, later called the
Turing machine, capable of computing anything that is computable. The central
concept of the modern computer was based on his ideas.
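Turing's universal machine is just a finite set of states, a tape, and a transition table. The toy machine below is an illustrative example (the state names and table are invented for this sketch): it flips every bit on its tape and halts.

```python
# A minimal one-tape Turing machine: repeatedly read the symbol under
# the head, consult the transition table, write, move, change state.

def run_turing_machine(tape, transitions, state="start", blank="_"):
    """Run the machine until it enters the 'halt' state; return the tape."""
    tape = list(tape)
    head = 0
    while state != "halt":
        symbol = tape[head] if head < len(tape) else blank
        state, write, move = transitions[(state, symbol)]
        if head < len(tape):
            tape[head] = write
        else:
            tape.append(write)
        head += 1 if move == "R" else -1
    return "".join(tape).rstrip(blank)

# Transition table: (state, read symbol) -> (next state, write, head move).
# This hypothetical machine inverts every bit, then halts on blank.
flip_bits = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}

print(run_turing_machine("10110", flip_bits))  # 01001
```

The significance of Turing's result is that one such machine, given a suitable table, can simulate any other; that is the "central concept of the modern computer" the entry refers to.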
1937: One hundred years after Babbage's impossible dream, Howard Aiken
convinced IBM, which was making all kinds of punched card equipment and
was also in the calculator business, to develop his giant programmable
calculator, the ASCC/Harvard Mark I, based on Babbage's Analytical Engine,
which itself used cards and a central computing unit. When the machine was
finished, some hailed it as "Babbage's dream come true".
1941: John Atanasoff and his graduate student, Clifford Berry, design a computer
that can solve 29 equations simultaneously. This marks the first time a computer
is able to store information in its main memory.
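"Solving 29 equations simultaneously" means solving a system of simultaneous linear equations. A minimal sketch of that task, using textbook Gaussian elimination with partial pivoting (an illustrative method, not the ABC's actual mechanism):

```python
# Solve A x = b by Gaussian elimination with partial pivoting.

def solve_linear_system(a, b):
    """Solve A x = b for x, where A is an n x n matrix (list of rows)."""
    n = len(a)
    # Augmented matrix [A | b] so row operations touch both sides at once.
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        # Partial pivoting: bring the largest entry in this column up.
        pivot = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[pivot] = m[pivot], m[col]
        # Eliminate this column from every row below the pivot.
        for r in range(col + 1, n):
            factor = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= factor * m[col][c]
    # Back-substitution from the last row upward.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(m[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (m[r][n] - s) / m[r][r]
    return x

# Example system: 2x + y = 5 and x + 3y = 10, whose solution is x = 1, y = 3.
print(solve_linear_system([[2, 1], [1, 3]], [5, 10]))
```

The ABC handled systems of up to 29 such equations in 29 unknowns, which was far beyond practical hand computation at the time.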
1943-1944: Two University of Pennsylvania professors, John Mauchly and J.
Presper Eckert, build the Electronic Numerical Integrator and Computer
(ENIAC). Considered the grandfather of digital computers, it fills a 20-foot by
40-foot room and has 18,000 vacuum tubes.
1946: Mauchly and Eckert leave the University of Pennsylvania and receive
funding from the Census Bureau to build the UNIVAC, the first commercial
computer for business and government applications.
1947: William Shockley, John Bardeen and Walter Brattain of Bell Laboratories
invent the transistor. They discovered how to make an electric switch with solid
materials and no need for a vacuum.
1953: Grace Hopper develops the first computer language, which eventually
becomes known as COBOL. Thomas Johnson Watson Jr., son of IBM CEO
Thomas Johnson Watson Sr., conceives the IBM 701 EDPM to help the United
Nations keep tabs on Korea during the war.
1958: Jack Kilby and Robert Noyce unveil the integrated circuit, known as the
computer chip. Kilby was awarded the Nobel Prize in Physics in 2000 for his
work.
1969: A team of developers at Bell Labs produces UNIX, an operating system that
was portable across multiple platforms and became the operating system
of choice among mainframes at large companies and government entities. Due
to the slow nature of the system, it never quite gained traction among home PC
users.
1970: The newly formed Intel unveils the Intel 1103, the first Dynamic
Random Access Memory (DRAM) chip.
1971: Alan Shugart leads a team of IBM engineers who invent the "floppy
disk," allowing data to be shared among computers.
1973: Robert Metcalfe, a member of the research staff for Xerox, develops
Ethernet for connecting multiple computers and other hardware.
1974 – 1977: A number of personal computers hit the market, including the Scelbi,
the Mark-8, the Altair, the IBM 5100, Radio Shack's TRS-80 — affectionately known as
the "Trash 80" — and the Commodore PET.
1975: The January issue of Popular Electronics magazine features the Altair
8800, described as the "world's first minicomputer kit to rival commercial
models." Two "computer geeks," Paul Allen and Bill Gates, offer to write
software for the Altair, using the Beginner's All-purpose Symbolic
Instruction Code (BASIC) language. On April 4, after the success of this first
endeavor, the two childhood friends form their own software company,
Microsoft.
1976: Steve Jobs and Steve Wozniak start Apple Computers on April Fool's
Day and roll out the Apple I, the first computer with a single-circuit board,
according to Stanford University.
1977: Radio Shack's initial production run of the TRS-80 was just 3,000. It sold
like crazy. For the first time, non-geeks could write programs and make a
computer do what they wished.
1977: Jobs and Wozniak incorporate Apple and show the Apple II at the first
West Coast Computer Faire. It offers color graphics and incorporates an audio
cassette drive for storage.
1983: Apple's Lisa is the first personal computer with a graphical user interface
(GUI). It also features a drop-down menu and icons. It flops but eventually
evolves into the Macintosh. The Gavilan SC is the first portable computer with
the familiar flip form factor and the first to be marketed as a "laptop."
1986: Compaq brings the "Deskpro 386" to market. Its 32-bit architecture
provides speed comparable to that of mainframes.
1993: The Pentium microprocessor advances the use of graphics and music on
PCs.
1994: PCs become gaming machines as "Command & Conquer," "Alone in the
Dark 2," "Theme Park," "Magic Carpet," "Descent" and "Little Big Adventure"
are among the games to hit the market.
1996: Sergey Brin and Larry Page develop the Google search engine at
Stanford University.
1997: Microsoft invests $150 million in Apple, which was struggling at the
time, ending Apple's court case against Microsoft in which it alleged that
Microsoft copied the "look and feel" of its operating system.
1999: The term Wi-Fi becomes part of the computing language and users begin
connecting to the Internet without wires.
2001: Apple unveils the Mac OS X operating system, which provides protected
memory architecture and pre-emptive multi-tasking, among other benefits. Not
to be outdone, Microsoft rolls out Windows XP, which has a significantly
redesigned graphical user interface (GUI).
2003: The first 64-bit processor, AMD's Athlon 64, becomes available to the
consumer market.
2006: Apple introduces the MacBook Pro, its first Intel-based, dual-core mobile
computer, as well as an Intel-based iMac. Nintendo's Wii game console hits the
market.
2007: The iPhone brings many computer functions to the smart phone.
2010: Apple unveils the iPad, changing the way consumers view media and
jumpstarting the dormant tablet computer segment.
2011: Google releases the Chromebook, a laptop that runs the Google Chrome
OS.
2015: Apple releases the Apple Watch. Microsoft releases Windows 10.
2016: The first reprogrammable quantum computer was created. "Until now,
there hasn't been any quantum-computing platform that had the capability to
program new algorithms into their system. They're usually each tailored to
attack a particular algorithm," said study lead author Shantanu Debnath, a
quantum physicist and optical engineer at the University of Maryland, College
Park.
First Generation of Computer (1937 – 1946):
In 1937 the first electronic digital computer was built by Dr. John V.
Atanasoff and Clifford Berry. It was called the Atanasoff-Berry Computer
(ABC). In 1943 an electronic computer named the Colossus was built for the
military. Other developments continued until, in 1946, the first general-purpose
digital computer, the Electronic Numerical Integrator and Computer (ENIAC),
was built. It is said that this computer weighed 30 tons and had 18,000 vacuum
tubes, which were used for processing. When this computer was turned on for the
first time, lights dimmed in sections of Philadelphia. Computers of this generation
could only perform a single task, and they had no operating system.
Characteristics:
Examples:
Second Generation of Computer (1947 – 1962):
Characteristics:
i. The computers were still large, but smaller than the first generation of
computers.
ii. They used transistors in place of vacuum tubes to perform calculations.
iii. They were produced at a reduced cost compared to the first generation
of computers.
iv. They used magnetic tape for data storage.
v. They used punched cards for the input and output of data and
information. The use of the keyboard as an input device was also
introduced.
vi. These computers still generated a lot of heat, so air conditioning
was needed to maintain a low temperature.
vii. They had about one thousand circuits per cubic foot.
Example:
Third Generation of Computer:
Characteristics:
i. They used large-scale integrated circuits, which were used for both
data processing and storage.
ii. Computers were miniaturized, that is, reduced in size compared
to the previous generation.
iii. The keyboard and mouse were used for input, while the monitor was
used as the output device.
iv. Programming languages like COBOL and FORTRAN were developed
and used.
v. They had about one hundred thousand circuits per cubic foot.
Examples:
Fourth Generation of Computer:
Transistors on a single chip were capable of performing all the functions of a
computer's central processing unit.
Characteristics:
Examples:
Fifth Generation of Computer:
Characteristics:
v. Ability of computers to mimic human intelligence, e.g. voice
recognition, facial recognition, thumbprint recognition.
vi. Satellite links, virtual reality.
vii. They have billions of circuits per cubic foot.
Examples:
i. Supercomputers
ii. Robots
iii. Facial recognition systems
iv. Thumbprint recognition systems.
Conclusion:
The earliest foundations of what would become computer science predate the
invention of the modern digital computer. Machines for calculating fixed
numerical tasks, such as the abacus, have existed since antiquity. Charles
Babbage is sometimes referred to as the "father of computing", and Ada Lovelace
is often credited with publishing the first algorithm intended for processing on a
computer.
In 1980 the Microsoft Disk Operating System (MS-DOS) was born, and in 1981
IBM introduced the personal computer (PC) for home and office use. Three
years later Apple gave us the Macintosh computer with its icon-driven interface,
and the 1990s gave us the Windows operating system. As a result of the various
improvements to the development of the computer, we have seen the computer
being used in all areas of life. It is a very useful tool that will continue to
experience new developments as time passes.
REFERENCES
Keates, Fiona (June 25, 2012). "A Brief History of Computing". The
Repository. The Royal Society.
Cohen, I. Bernard (2000), p. 44: "In this sense Aiken needed IBM, whose
technology included the use of punched cards, the accumulation of numerical
data, and the transfer of numerical data from one register to another."