Generation of Computers
The history of computer development is often described in terms of the different generations of computing devices. Each generation is characterized by a major technological development that fundamentally changed the way computers operate, resulting in increasingly smaller, cheaper, more powerful, and more efficient and reliable devices.
The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often
enormous, taking up entire rooms. They were very expensive to operate and in addition to using a
great deal of electricity, generated a lot of heat, which was often the cause of malfunctions.
First generation computers relied on machine language, the lowest-level programming language
understood by computers, to perform operations, and they could only solve one problem at a time.
Input was based on punched cards and paper tape, and output was displayed on printouts.
The UNIVAC and ENIAC computers are examples of first-generation computing devices. The UNIVAC was the first commercially produced computer; its first unit was delivered to the U.S. Census Bureau in 1951.
Transistors replaced vacuum tubes and ushered in the second generation of computers. The
transistor was invented in 1947 but did not see widespread use in computers until the late 1950s.
The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster,
cheaper, more energy-efficient and more reliable than their first-generation predecessors. Though
the transistor still generated a great deal of heat that subjected the computer to damage, it was a
vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output. They moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers that stored their instructions in their memory, which moved from a magnetic drum to magnetic core technology.
The first computers of this generation were developed for the atomic energy industry.
The development of the integrated circuit was the hallmark of the third generation of computers.
Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers.
Instead of punched cards and printouts, users interacted with third-generation computers through
keyboards and monitors and interfaced with an operating system, which allowed the device to run
many different applications at one time with a central program that monitored the memory.
Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.
The microprocessor brought the fourth generation of computers, as thousands of integrated circuits
were built onto a single silicon chip. What in the first generation filled an entire room could now
fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of
the computer—from the central processing unit and memory to input/output controls—on a single
chip.
In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the
Macintosh. Microprocessors also moved out of the realm of desktop computers and into many
areas of life as more and more everyday products began to use microprocessors.
As these small computers became more powerful, they could be linked together to form networks,
which eventually led to the development of the Internet. Fourth-generation computers also saw the development of GUIs, the mouse, and handheld devices.
Fifth-generation computing devices, based on artificial intelligence, are still in development, though some applications, such as voice recognition, are being used today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality.
Quantum computation and molecular and nanotechnology will radically change the face of
computers in years to come. The goal of fifth-generation computing is to develop devices that
respond to natural language input and are capable of learning and self-organization.
Places Where Computers Are Used
Computers feature in the day-to-day activities of nearly everyone; it is safe to say that computers are used in virtually all spheres of life. Some of these are listed below:
1. Educational institutions
2. Transport
3. Communication
4. Entertainment