IT Act 2 & 3

Activity 2: History of Computers

1. Research Assignment:
o Use the internet to research the history of computers. Focus on the
following key milestones:

▪ The invention of the first computer.

English mathematician, philosopher, and inventor Charles Babbage first proposed the idea of a computer in 1822, although he called it the Difference Engine. He went on to design the Analytical Engine, a general-purpose computer, in 1833. It incorporated the ideas of integrated memory, an arithmetic logic unit (ALU), and basic flow control.

▪ Development of the first electronic digital computer.

Electronic digital computers were first introduced to the world with the
Atanasoff-Berry Computer (ABC). The machine was built at Iowa State University
between 1937 and 1942 by John Vincent Atanasoff, a professor of physics and
mathematics at the university, and Clifford Berry, a graduate student in physics
and electrical engineering.

The ABC was nothing like modern computers: it weighed 750 pounds, was
the size of a large desk, and included glowing vacuum tubes, rotating drums for
memory, and a read/write mechanism that recorded numbers by burning marks
onto cards.

However, the device also pioneered a number of innovations that are still
present in modern computers today: a binary arithmetic system, independent
memory and computing functions, regenerative memory, parallel processing, on-off
switches made of electronic amplifiers, logical addition and subtraction circuits,
clocked control of electronic operations, and a modular design.
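
To make these ideas more concrete, here is a small illustrative sketch in Python (an added example; it does not reproduce the ABC's actual circuitry). It shows how a handful of logical operations, the kind the ABC's addition and subtraction circuits performed electronically, are enough to add two binary numbers.

# Illustrative only: adding binary numbers using nothing but logical operations.

def full_adder(a, b, carry_in):
    # Add three single bits (0 or 1) using XOR, AND and OR.
    s = a ^ b ^ carry_in                          # sum bit
    carry_out = (a & b) | (carry_in & (a ^ b))    # carry bit
    return s, carry_out

def add_binary(x_bits, y_bits):
    # Add two equal-length lists of bits, least significant bit first.
    result, carry = [], 0
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    result.append(carry)                          # final carry becomes the top bit
    return result

# 6 (binary 110) + 3 (binary 011) = 9 (binary 1001), bits listed least significant first
print(add_binary([0, 1, 1], [1, 1, 0]))           # prints [1, 0, 0, 1]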

▪ Introduction of personal computers (PCs).

A breakthrough that made computers accessible to everyone was the
introduction of the personal computer in 1975. At the time, computers were used
mostly by major companies and were enormous and costly. The first modern
computers, built on an extensive theoretical and technical foundation, had been
developed in the 1950s. Computer use has significantly shaped our society and
the ways in which we interact, conduct business, learn, and play. Like other
communication networks, it has spread to every literate region of the planet.

▪ Key figures in computer history (e.g., Charles Babbage, Alan Turing, Bill
Gates, Steve Jobs).

Charles Babbage

The concept of a mechanical computer was first conceived by Charles
Babbage. Babbage designed the Difference Engine, a device that could automatically
calculate the values of polynomial functions, and the Analytical Engine, the first
design for a programmable computer. He even drew up blueprints for the first
printer. He died on October 18, 1871.

George Boole

With George Boole's invention of a binary algebraic system, any logical
statement could be expressed as simply "true" or "false". Processors use this basic
language, known as Boolean logic, to perform billions of operations every second.
He is arguably one of the most significant figures in computer history. He died on
December 8, 1864.
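
As a simple illustration (an added example, not part of the original assignment text), the short Python snippet below shows Boolean logic in action: a statement reduces to either True or False, and AND, OR and NOT combine such values, which is essentially what a processor's logic gates do billions of times per second.

# Every expression below evaluates to either True or False.
statement = (2 + 2 == 4)
print(statement)                      # True

# Truth table for the three basic Boolean operations.
for a in (False, True):
    for b in (False, True):
        print(a, b, a and b, a or b, not a)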

Walter Brattain

Walter Brattain, together with John Bardeen and William Shockley at Bell Labs,
invented the transistor in 1947. It proved essential to the development of today's
computers, as well as to all contemporary electronic products and telecommunications
networks. A modern computer chip can contain tens of millions of transistors, whereas
the original transistor was little more than a centimeter across. He died on October 13, 1987.

Alan Turing

Alan Turing's primary field of expertise was mathematics. He designed the
Bombe, an electromechanical device that helped codebreakers work out the daily key
settings the Germans used on their Enigma machines, and he had earlier described
the abstract Turing machine, a model of a universal computer. Turing, who gained
prominence after the war, also devised the Turing Test, a technique for evaluating
artificial intelligence. He died on June 7, 1954.

Steve Jobs

Steve Jobs, Steve Wozniak, and Ronald Wayne founded Apple in 1976.
Following the company's early success with the Apple I and especially the Apple II
computers, the original Macintosh (1984) was revolutionary as one of the first
commercially successful computers to use a mouse and a graphical user interface in
place of a command-line interface. He passed away on October 5, 2011.

Bill Gates

Bill Gates is best known for co-founding Microsoft with Paul Allen. Microsoft
went on to produce Windows, now the most widely used desktop operating system
in the world.

▪ Evolution of computer technology from the 20th century to the present.

Early 20th Century


1930 – Vannevar Bush devised and constructed the first large-scale automatic
general-purpose mechanical analogue computer, known as the Differential Analyzer.

1936 – Alan Turing describes a universal machine, later named the Turing machine,
capable in principle of computing anything that is computable.

1939 – Bill Hewlett and David Packard founded Hewlett-Packard in a Palo Alto,
California garage.

1941 – German engineer and inventor Konrad Zuse completed the Z3, the first
programmable, fully automatic digital computer. The machine was later destroyed
when Berlin was bombed during World War II.

1941 – In collaboration with graduate student Clifford Berry, J.V. Atanasoff built a
computer that could solve 29 equations simultaneously. It was the first time a
computer could store information in its main memory.

1945 – John Mauchly and J. Presper Eckert, two professors at the University of
Pennsylvania, develop the Electronic Numerical Integrator and Computer (ENIAC).
Often called the "grandfather of computers," it was Turing-complete and could be
reprogrammed to solve "a vast class of numerical problems."

1946 – Mauchly and Eckert begin work on the UNIVAC I (Universal Automatic
Computer), which became the first general-purpose electronic digital computer
produced in the United States for commercial use.
1949 – The University of Cambridge team created the Electronic Delay Storage
Automatic Calculator (EDSAC), which is considered the "first practical stored-
program computer."

1950 – The first completed stored-program computer in the United States was the
Standards Eastern Automatic Computer (SEAC), constructed in Washington, DC.

Late 20th Century

1953 – Computer scientist Grace Hopper develops the first computer language,
which eventually became known as COBOL (COmmon Business-Oriented Language).
Instead of using numbers to communicate commands to the computer, a user could
now do so using English-like phrases.

1954 – The FORTRAN programming language, short for FORmula TRANslation, was
developed by John Backus and a team of IBM programmers. IBM also introduced the
650, one of the first mass-produced computers.

1958 – Jack Kilby and Robert Noyce invented the integrated circuit, also referred to
as the computer chip.

1962 – The Atlas computer makes its debut. The fastest computer in the world at the
time, it introduced the concept of "virtual memory."

1964 – Douglas Engelbart proposes a prototype of the modern computer that combines
a mouse with a graphical user interface (GUI).

1969 – Bell Labs developers, led by Ken Thompson and Dennis Ritchie, unveil UNIX,
an operating system designed to address software-compatibility problems; it was
later rewritten in the C programming language.

1970 – Intel introduces the Intel 1103, the first Dynamic Random-Access Memory
(DRAM) chip.

1971 – Alan Shugart and a group of IBM engineers developed the floppy disk. A new
era in computer printing began the same year when Xerox created the first laser
printer, a technology that went on to generate billions of dollars in revenue.
1973 – Ethernet, used to connect multiple computers and other devices, was invented
by Robert Metcalfe, a researcher at Xerox.

1974 – Personal computers arrive on the market. Among the first were the Scelbi,
the Mark-8, the Altair, the IBM 5100, and Radio Shack's TRS-80.

1975 – In January, Popular Electronics magazine declared the Altair 8800 to be the
first minicomputer kit ever made. Bill Gates and Paul Allen offer to develop software
for the Altair in the BASIC language.

1976 – The world was introduced to the Apple I, the first computer with a single
circuit board, by Steve Jobs and Steve Wozniak, who also founded Apple
Computers.

1977 – The Apple II, featuring colour graphics and a cassette drive for storage, is
unveiled by Jobs and Wozniak at the inaugural West Coast Computer Faire.

1978 – VisiCalc, the first computerized spreadsheet program, is released.

1979 – MicroPro International releases WordStar, a word processing program.

1981 – IBM introduces its first personal computer, code-named "Acorn," featuring two
floppy disk drives, a colour display, and an Intel CPU. It runs Microsoft's MS-DOS
operating system.

1983 – The CD-ROM, capable of holding 550 megabytes of pre-recorded data, hits
the market. The same year, the Gavilan SC debuts as the first flip-form portable
computer to be marketed as a "laptop."

1984 – Apple unveils the Macintosh in an ad aired during Super Bowl XVIII. It cost
$2,500.

1985 – Microsoft introduces Windows, which offers a graphical user interface and
multitasking. The C++ programming language is also released.
1990 – HyperText Markup Language, or HTML, was invented by English scientist
and programmer Tim Berners-Lee. Along with HTML, he created URLs, the first web
server, and the first browser, which he called "WorldWideWeb."

1993 – The Pentium CPU enhances the use of music and graphics on home
computers.

1995 – The Windows 95 operating system from Microsoft was introduced. To spread
the word, a $300 million marketing campaign was started. Java 1.0 is introduced by
Sun Microsystems, and JavaScript by Netscape Communications follows.

1996 – The Google search engine was developed by Larry Page and Sergey Brin at
Stanford University.

1998 – Apple introduces the iMac, an all-in-one desktop Macintosh computer. These
$1,300 machines included a 15-inch monitor, a CD-ROM drive, a 4GB hard drive,
and 32MB of RAM.

1999 – The term Wi-Fi, often expanded as "wireless fidelity," is coined; early Wi-Fi
had a coverage range of about 300 feet.

21st Century

2000 – The USB flash drive debuts. Compared with earlier types of storage media,
flash drives were faster and offered greater capacity for storing data.

2001 – Apple launches Mac OS X, which is the replacement for its traditional Mac
operating system. It was subsequently renamed OS X and then just macOS.

2003 – AMD's Athlon 64, the first 64-bit CPU designed for consumer PCs, becomes
available to consumers.

2004 – Facebook started off as a social media platform.

2005 – Android, a Linux-based mobile operating system, is bought by Google.

2006 – Apple's MacBook Pro becomes available. The Pro was the company's first
Intel-based, dual-core mobile computer.
2007 – Apple released the first iPhone, putting a variety of computer functions in our
hands. Amazon also introduced the Kindle, one of the earliest electronic reading
devices, that same year.

2009 – Microsoft launched Windows 7.

2011 – Google releases the Chromebook, which runs Google Chrome OS.

2014 – The smallest computer in the world, the University of Michigan Micro Mote
(M3), was built.

2015 – Apple makes the Apple Watch available. Microsoft released Windows 10 as
well.

2016 – The first reprogrammable quantum computer ever created is constructed.

2. Create a Timeline:
o Create a timeline that includes at least 10 significant events in the history of
computers.
o For each event, include the date, a brief description, and its significance.

• 1834: The Analytical Engine is announced by Charles Babbage.

Charles Babbage (1791–1871), a British computing pioneer, designed this analytical
engine, the first fully automatic calculating machine. Babbage first conceived of an
advanced calculator in 1812, one that could compute and print mathematical tables.
In 1834 he conceived this machine, designed to evaluate any mathematical formula
and to have even greater analytical powers than his Difference Engine of the 1820s.
Only a portion of the machine, a trial piece, had been completed before Babbage's
death in 1871.

• 1943: The birth of the Colossus Mark 1

The Colossus machines were among the first electronic digital computers.
Designed by Tommy Flowers, Colossus was used by the British to help decipher
encrypted German messages during World War II.
• 1954: First prototype of desktop calculators

The first electronic calculator was introduced by IBM in the United States in
1954. It was a large machine costing around $80,000, and it was built using
transistors, a very innovative technology for the time. Within a short period, however,
more commercial models were introduced that were more dependable and more
reasonably priced.

• 1969: The creation of ARPANET

ARPANET, a modest computer network, was built for the US Department of
Defense as a means of communication between different agencies across the
country. It is the prototype network that would eventually grow into the Internet
we know today. ARPANET was finally shut down in 1990.

• 1971: The first e-mail is sent

In 1971, Ray Tomlinson of BBN sent the first message across the network,
using the "@" symbol to separate the user's name from the name of their machine.
He transmitted the message between two Digital Equipment Corporation PDP-10
computers that were sitting right next to each other.

• 1981: IBM launches a PC

With the introduction of the personal computer, or PC, to the general public,
IBM reached a significant milestone in the history of computing. It helped transform
computing from a largely specialized field of study into something useful and
practical for everyone.

• 1990: Tim Berners-Lee writes the first website

Link: http://info.cern.ch/hypertext/WWW/TheProject.html

British scientist Tim Berners-Lee uploaded this page to CERN's servers on
December 20, 1990, with the intention of explaining the fundamental ideas behind
the modern web.

• 1991: The World Wide Web

The World Wide Web was opened to the public on August 6, 1991, making it
possible to transmit information around the world in a matter of seconds. Countless
websites now rely on it, and it supports a wide range of applications.

• 1997: The machine defeats the man at chess

In 1997, IBM's Deep Blue achieved a feat no machine had achieved before: in
May of that year it became the first computer program to beat a reigning world chess
champion, Garry Kasparov, in a match played under conventional tournament rules.

• 1998: Google was founded

Larry Page and Sergey Brin first crossed paths at Stanford in 1995. While still
graduate students, they worked together on a search engine called BackRub, which
ran on Stanford's computers until its bandwidth requirements became too great for
the university to handle. In 1997, having decided that the BackRub search engine
needed a new identity, they explored several ideas. One of them played on "googol,"
a mathematical term for the number 1 followed by 100 zeros. In 1998, Andy
Bechtolsheim, then a co-founder of Sun, wrote a cheque for roughly $100,000 to a
freshly registered business: Google Inc.
Activity 3: Understanding Netiquette

1. Research Assignment:
o Research the term "netiquette" and understand its importance in online
communication.
o Identify the main principles of netiquette, such as:

▪ Respecting others' privacy.

Social media sites have privacy settings that let users choose who sees their
content and how much of it. This is crucial because disclosing excessive amounts of
personal information online can jeopardize your privacy, particularly if you have little
control over who sees your posts and stories. It also means that you should value
other people's online privacy just as much as you want others to respect yours.

▪ Being polite and respectful.

Avoid saying or doing things on the internet that you wouldn't say or do in
person. Using derogatory and hurtful language about someone or something online
can lead to cyberbullying.

▪ Avoiding the use of ALL CAPS (considered shouting).

For some people, typing everything in uppercase may indicate that you're
yelling or furious. Refrain from doing this when talking with your parents, the elderly,
and teachers online. Be courteous and formal.

▪ Properly citing sources and giving credit.

Borrowing ideas from other individuals is acceptable as long as you cite your
sources and give them due credit. Plagiarism is when you simply take or use
someone else's work and pass it off as your own.

▪ Being cautious with humour and sarcasm to avoid misunderstandings.

The humour of Generation Z is highly evolved and may be difficult to grasp if
you are not up to date on the latest internet memes and trends. Introducing sarcasm
and the newest memes into discussions that call for formality can come across as
rude and cause misunderstandings. It is better to save jokes and sarcasm for
conversations with friends and other people you can be casual with.

2. Scenario Analysis:
o Write down five different online scenarios where netiquette should be
applied.
o For each scenario, explain the appropriate netiquette behaviour.

1. Sharing too much personal information and pictures online (school,
birthday, day-to-day happenings, alcohol, sexy pictures).

Because the internet is a massive, global space and whatever you post
affects your reputation and digital identity, people need to realize that the
images and information they share online will remain there forever. Because
you cannot completely control your audience, sharing unnecessary
information online may compromise your privacy. That is why you need to be
careful about what you post online; as the saying goes, "Think before you
click."

2. When you start typing in uppercase letters to show your excitement while
discussing your birthday with your mother, but she interprets it as you being
angry at her.

Some people interpret typing in all caps as an indication of rage or anger.
When conversing online with your parents, avoid typing in all caps and use
respectful language such as "po" and "opo" to show courtesy.

3. Spamming your family and friends with unwanted messages out of
boredom.

People often receive spam from unknown numbers, scammers, and
companies, but some also spam their own relatives and friends. Spamming
others can cause them to miss an important message because your
messages have piled up in their chats, or the app may start to misbehave
because it is being bombarded with so many notifications and messages.
Many such situations can arise from spamming, and some people may see it
as rude and insensitive. That is why it is better to send a message only when
it is urgent or about something that concerns the other party.
4. Resharing a bogus news video or website.

Fake news is very prevalent nowadays and can look genuine to the
inexperienced eye. Before sharing information online, fact-check it first and read
from reliable, credible sources so that you do not take part in spreading false
information.

5. Sharing inappropriate and explicit content online without taking your
audience into consideration (underage mutuals, colleagues, workmates,
relatives, etc.).

Act as you would in person when you are online. If you behave well in real
life but post inappropriate content online, it can damage your reputation and
the way other people perceive you, particularly those who know you both in
person and online. Take your audience into account as well: your posts may
be viewed by minors, or by your boss, which can put your job at risk. Try
posting relevant information online that could be helpful rather than useless
content.
