IT Activities 2 & 3
1. Research Assignment:
o Use the internet to research the history of computers. Focus on the
following key milestones:
The world's first electronic digital computer was the Atanasoff-Berry Computer
(ABC). It was built at Iowa State University between 1937 and 1942 by John
Vincent Atanasoff, a professor of physics and mathematics at the university, and
Clifford Berry, a graduate student in physics and electrical engineering.
The ABC was not like modern computers at all: it weighed 750 pounds, was
the size of a large desk, and included glowing vacuum tubes, rotating drums for
memory, and a read/write mechanism that recorded numbers by burning markings
on cards.
However, the device also pioneered a number of innovations that are still
present in modern computers today: a binary arithmetic system, independent
memory and computing functions, regenerative memory, parallel processing, on-off
switches made of electronic amplifiers, logical addition and subtraction circuits,
clocked control of electronic operations, and a modular design.
▪ Key figures in computer history (e.g., Charles Babbage, Alan Turing, Bill
Gates, Steve Jobs).
Charles Babbage
Charles Babbage designed the Difference Engine and, later, the programmable
Analytical Engine, which is why he is often called the "father of the computer."
He passed away on October 18, 1871.
George Boole
With George Boole's invention of binary algebra, any logical statement could be
reduced to either "true" or "false". Processors use this basic language, known as
Boolean logic, to perform billions of operations every second, which arguably
makes Boole one of the most significant figures in computer history. He passed
away on December 8, 1864.
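To make this concrete, the short sketch below (written in Python purely as an
illustration) builds binary addition out of nothing more than Boole's two-valued
operations AND, OR, and XOR, the same principle at work in the ABC's
add-subtract circuits and in a modern processor's arithmetic units.

```python
# A minimal sketch of Boolean logic in Python: every value is either
# True or False, and three gate operations (AND, OR, XOR) are enough
# to build binary addition, the principle behind a processor's
# arithmetic circuits.

def half_adder(a: bool, b: bool):
    """Add two bits; returns (sum_bit, carry_bit)."""
    return a ^ b, a and b        # XOR gives the sum, AND gives the carry

def full_adder(a: bool, b: bool, carry_in: bool):
    """Add two bits plus an incoming carry bit."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 or c2          # OR merges the two possible carries

# Add 5 (binary 101) and 3 (binary 011), least significant bit first.
a_bits = [True, False, True]     # 5, stored LSB first
b_bits = [True, True, False]     # 3, stored LSB first
carry = False
result = []
for a, b in zip(a_bits, b_bits):
    s, carry = full_adder(a, b, carry)
    result.append(s)
result.append(carry)             # the final carry becomes the top bit

value = sum(2 ** i for i, bit in enumerate(result) if bit)
print(value)                     # prints 8, i.e. 5 + 3
```

Chaining one full adder per bit in this way is exactly how a hardware
ripple-carry adder is organized.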
Walter Brattain
Walter Brattain, together with John Bardeen and William Shockley, invented the
transistor at Bell Labs in 1947. It proved essential to the development of
computers as well as to all contemporary electronics and telecommunications
networks. A modern computer chip can contain billions of transistors, yet the
original transistor was only about a centimeter in size. He passed away on
October 13, 1987.
Alan Turing
Alan Turing formalized the concepts of algorithm and computation with his
universal machine (see the 1936 entry below) and helped break German ciphers at
Bletchley Park during World War II. He passed away on June 7, 1954.
Steve Jobs
Steve Jobs, Ronald Wayne, and Stephen Wozniak founded Apple in 1976.
Despite the company's early success with the Apple I and especially the Apple II
computers, the original Macintosh (1984) was revolutionary as the first
mass-market computer to replace the command-line interface with a mouse and a
graphical user interface. Jobs passed away on October 5, 2011.
Bill Gates
Bill Gates co-founded Microsoft with Paul Allen in 1975, starting with a BASIC
interpreter for the Altair 8800 (see the 1975 entry below); MS-DOS and Windows
later made Microsoft the dominant supplier of PC software.
1936 – Alan Turing devises a universal machine, later known as the Turing
machine, that can compute anything that is computable.
1939 – Bill Hewlett and David Packard found Hewlett-Packard in a Palo Alto,
California garage.
1941 – German engineer Konrad Zuse completes the Z3, the first working
programmable, fully automatic digital computer. The machine was destroyed when
bombs were dropped on Berlin during World War II.
1941 – In collaboration with graduate student Clifford Berry, J.V. Atanasoff
builds a computer that can solve 29 equations simultaneously, the first time a
computer is able to store information in its main memory.
1945 – John Mauchly and J. Presper Eckert, two professors at the University of
Pennsylvania, develop the Electronic Numerical Integrator and Computer (ENIAC).
Often called the "grandfather of computers," it was Turing-complete and could be
reprogrammed to solve "a vast class of numerical problems."
1946 – Mauchly and Eckert begin work on the UNIVAC I (Universal Automatic
Computer), the first general-purpose electronic digital computer designed in the
United States for business use; it was delivered in 1951.
1949 – A University of Cambridge team creates the Electronic Delay Storage
Automatic Calculator (EDSAC), considered the "first practical stored-program
computer."
1950 – The first completed stored-program computer in the United States was the
Standards Eastern Automatic Computer (SEAC), constructed in Washington, DC.
1958 – Jack Kilby and Robert Noyce independently invent the integrated circuit,
also referred to as the computer chip.
1962 – The Atlas computer debuts. The fastest computer in the world at the time,
it introduced the concept of "virtual memory."
1970 – Intel introduces the Intel 1103, the first commercially available Dynamic
Random-Access Memory (DRAM) chip.
1971 – Alan Shugart and a team of IBM engineers develop the floppy disk. The
same year, Xerox creates the first laser printer, ushering in a new era of
computer printing and eventually generating billions of dollars in revenue.
1973 – Robert Metcalfe, a researcher at Xerox, invents Ethernet, which is used
to connect multiple computers and other devices.
1974 – Personal computers reach the market. Among the first were the Scelbi, the
Mark-8, the Altair, the IBM 5100, and Radio Shack's TRS-80.
1975 – In its January issue, Popular Electronics magazine hails the Altair 8800
as the first minicomputer kit. Bill Gates and Paul Allen offer to write software
for the Altair in the BASIC language.
1976 – Steve Jobs and Steve Wozniak found Apple Computers and introduce the
world to the Apple I, the first computer with a single circuit board.
1977 – Jobs and Wozniak unveil the Apple II at the inaugural West Coast Computer
Faire. It offers colour graphics and an audio cassette drive for storage.
1981 – IBM introduces its first personal computer, code-named "Acorn," featuring
an Intel CPU, two floppy disk drives, and an optional colour monitor. It runs
Microsoft's MS-DOS operating system.
1983 – The CD-ROM, able to hold 550 megabytes of pre-recorded data, hits the
market. The same year, the Gavilan SC debuts as the first flip-form portable
computer to be marketed as a "laptop."
1984 – Apple unveils the Macintosh with a commercial during Super Bowl XVIII. It
cost $2,500.
1985 – Microsoft announces Windows, which offers a graphical user interface and
multitasking. The same year, the C++ programming language is released.
1990 – English scientist and programmer Tim Berners-Lee invents HyperText
Markup Language (HTML). Along with HTML, he creates URLs, the first web server,
and the first browser, which he names "WorldWideWeb."
1993 – The Pentium CPU enhances the use of music and graphics on home
computers.
1995 – Microsoft releases the Windows 95 operating system, backed by a $300
million marketing campaign. Sun Microsystems introduces Java 1.0, and Netscape
Communications follows with JavaScript.
1996 – The Google search engine was developed by Larry Page and Sergey Brin at
Stanford University.
1999 – The term Wi-Fi, popularly glossed as "wireless fidelity," is coined;
early Wi-Fi offered a coverage range of up to 300 feet.
21st Century
2000 – The USB flash drive debuts, faster and with greater storage capacity than
earlier storage media.
2001 – Apple launches Mac OS X, the successor to its classic Mac operating
system. It was subsequently renamed OS X and then simply macOS.
2003 – AMD's Athlon 64, the first 64-bit processor designed for consumer PCs,
becomes available.
2006 – Apple releases the MacBook Pro, the company's first Intel-based,
dual-core mobile computer.
2007 – Apple releases the first iPhone, putting a range of computer functions in
our hands. The same year, Amazon introduces the Kindle, one of the earliest
electronic reading devices.
2014 – The smallest computer in the world, the University of Michigan Micro Mote
(M3), was built.
2015 – Apple releases the Apple Watch, and Microsoft releases Windows 10.
2. Create a Timeline:
o Create a timeline that includes at least 10 significant events in the history of
computers.
o For each event, include the date, a brief description, and its significance.
• 1943–1944: Colossus, the first electronic codebreaking computers
The Colossus machines, initially designed by Tommy Flowers, were among the
earliest electronic digital computers; the British used them to decipher
encrypted German messages during World War II.
• 1954: First prototype of desktop calculators
IBM introduced the first electronic calculator in the United States in 1954.
It was a large device costing around $80,000, and it was built using
transistors, which was highly innovative for the time. Within a short period,
however, more commercial models followed that were more dependable and more
reasonably priced.
• 1971: The first network email and the "@" symbol
To distinguish the name of the user from that of their machine, Ray Tomlinson
of BBN first used the "@" symbol in a message transmitted across the network in
1971. He sent the message between two Digital Equipment Corporation PDP-10
computers that were placed next to each other.
• 1981: The IBM Personal Computer
By introducing the personal computer, or PC, to the general public, IBM reached
a significant milestone in the history of computing. The PC transformed
computing from a largely unexplored field of study into something helpful and
practical for all.
• 1991: The World Wide Web
Tim Berners-Lee's first website goes online at CERN.
Link: http://info.cern.ch/hypertext/WWW/TheProject.html
• 1998: Google Inc.
Larry Page and Sergey Brin first crossed paths at Stanford in 1995. While still
graduate students, they worked together on a search engine called BackRub,
which ran on Stanford's computers until its bandwidth requirements became too
great for the university to handle. In 1997, having decided that BackRub needed
a makeover, they explored several ideas for a new name; the winner was a play
on "googol," the mathematical term for the number 1 followed by 100 zeros. In
1998, Andy Bechtolsheim, a co-founder of Sun, wrote a cheque for roughly
$100,000 to a freshly registered business that would bring us all joy: Google
Inc.
Activity 3: Understanding Netiquette
1. Research Assignment:
o Research the term "netiquette" and understand its importance in online
communication.
o Identify the main principles of netiquette, such as:
Social media sites have privacy settings that let users choose who sees their
content and how much of it. These settings are crucial because disclosing
excessive amounts of personal information online can jeopardize your privacy,
particularly when you have little control over who sees your posts and stories.
By the same token, you should value other people's online privacy just as much
as you want others to respect yours.
Avoid saying and doing things on the internet that you wouldn't say or do in
person. Using derogatory and hurtful language about someone or something online
can amount to cyberbullying.
To many people, typing everything in uppercase reads as yelling or anger.
Refrain from doing this when talking online with your parents, the elderly, or
your teachers; be courteous and formal instead.
2. Scenario Analysis:
o Write down five different online scenarios where netiquette should be
applied.
o For each scenario, explain the appropriate netiquette behaviour.
Because the internet is a massive, global space and whatever you post
affects your reputation and digital identity, people need to realize that
the images and information they share online can remain there forever.
Since you cannot completely control your audience, sharing unnecessary
information online may compromise your privacy. That is why you need to be
careful about what you post, just as the saying goes: "Think before you
click."
Fake news is very prevalent nowadays and can look true to the inexperienced
eye. Before sharing information online, fact-check it first and read reliable,
credible sources to avoid taking part in spreading false information.
Act as you would in person when you're online. If you behave well in real
life but post inappropriate content online, it can damage your reputation
and the way other people perceive you, particularly among those who know you
both in person and online. Take your audience into account as well: your
posts may be viewed by minors, or by your boss, which could put your job at
risk. Try to post relevant, helpful information rather than useless content.