
Laboratory Exercise No. 1
Name: Lammatao, Trizia Nicole M. Date: August 3, 2021
Course & Year: 4th Year – Introduction to Cyber Crime Lab.
OVERVIEW OF CYBERCRIME
Objectives:
1. Describe the development of the computer and cybercrime.
Procedure:
1. Provide a timeline of the beginning of the computer and its development; and
2. Describe the first recorded cybercrime in history and how it was committed.

The computer was created to solve a serious number-crunching problem, not for entertainment or email. By 1880, the US population had grown so large that tabulating the census results took more than seven years. The government needed a faster way to get the job done, giving rise to punch-card computers that took up entire rooms.

Our smartphones now carry more computing power than these early machines. The following timeline shows how computers evolved from simple calculators into machines that can surf the internet, play games, and stream multimedia.
1801: Joseph Marie Jacquard invents a loom that weaves fabric designs using punched
wooden cards. Early computers used punch cards.

1822: English scientist Charles Babbage devises a steam-powered calculating engine capable of computing tables of numbers. The project, funded by the English government, is a failure, and more than a century passes before the world's first computer is actually built.

1890: Herman Hollerith creates a punch card technique to calculate the 1880 census,
saving the government $5 million. He founds what would become IBM.

1936: Alan Turing proposes the Turing machine, a universal machine capable of computing anything that is computable. His ideas shaped the core concept of the modern computer.

1937: J.V. Atanasoff, of Iowa State University, builds the first computer without gears,
cams, belts, or shafts.

1939: David Packard and Bill Hewlett found Hewlett-Packard in a Palo Alto garage, according to the Computer History Museum.

1941: Atanasoff and his graduate student Clifford Berry design a computer that can solve 29 equations simultaneously. It is the first machine able to store information in its main memory.
1943-1944: John Mauchly and J. Presper Eckert of the University of Pennsylvania build the Electronic Numerical Integrator and Calculator (ENIAC). It has 18,000 vacuum tubes and fills a 20-by-40-foot room.

1946: Mauchly and Eckert leave the University of Pennsylvania to build the UNIVAC, the first commercial computer for business and government.

1947: William Shockley, John Bardeen, and Walter Brattain of Bell Laboratories invent the transistor. They discover how to make an electric switch with solid materials, with no need for a vacuum tube.

1953: Grace Hopper develops the first computer language, which eventually becomes known as COBOL. Thomas Johnson Watson Jr., son of IBM CEO Thomas Johnson Watson Sr., conceives the IBM 701 EDPM.

1954: A team of IBM programmers led by John Backus develops the FORTRAN
programming language.

1958: Jack Kilby and Robert Noyce unveil the integrated circuit, known as the computer chip. Kilby was awarded the Nobel Prize in Physics in 2000 for his work.

1964: Douglas Engelbart shows a prototype of the modern computer, with a mouse and a graphical user interface (GUI). This marks the transition of the computer from a specialized tool for scientists and mathematicians to a more universal machine.

1969: Bell Labs releases the first version of UNIX to address compatibility difficulties. Written in the C programming language, UNIX is portable across platforms and becomes the operating system of choice among large corporate and government mainframes. The system's slowness keeps it from gaining momentum among home PC users.

1970: Intel introduces the Intel 1103, the first dynamic RAM (DRAM) chip.

1971: Alan Shugart leads a team of IBM engineers who invent the "floppy disk," allowing data to be shared among computers.

1973: Xerox researcher Robert Metcalfe develops Ethernet for connecting computers and
other gear.

1974-1977: A number of personal computers hit the market, including the Scelbi & Mark-8 Altair, the IBM 5100, Radio Shack's TRS-80 (nicknamed the "Trash 80"), and the Commodore PET.

1975: The January issue of Popular Electronics magazine features the "World's First Minicomputer Kit to Rival Commercial Models." Paul Allen and Bill Gates offer to write BASIC software for the Altair. On April 4, after the success of this first venture, the two boyhood friends found Microsoft.
1976: Steve Jobs and Steve Wozniak launch Apple Computers on April 1, releasing the Apple I, the first computer with a single circuit board.

1977: Radio Shack's initial production run of the TRS-80 is just 3,000 units, but it sells like crazy. For the first time, non-geeks can write programs and make a computer do what they want.

Also in 1977, the Apple II is introduced at the first West Coast Computer Faire. It offers color graphics and an audio cassette drive for storage.

1978: VisiCalc, the first computerized spreadsheet program, is released.

1979: MicroPro International launches WordStar. In 2000, creator Rob Barnaby told Mike Petrie that adding margins and word wrap were the defining changes: "The command mode was removed and a print function was added. Technically, I was the brains; I worked it out, did it, and documented it."
1981: IBM releases its first personal computer, code-named "Acorn." It runs Microsoft's MS-DOS, uses an Intel chip, has two floppy disk drives, and offers an optional color monitor. The machines are sold by Sears and Computerland, marking the first time a computer is sold through outside distributors. The Acorn also popularizes the term "PC."

1983: Apple's Lisa is the first personal computer with a GUI. It also features drop-down menus and icons. It fails commercially but evolves into the Macintosh. The Gavilan SC is the first portable computer with the familiar flip form factor to be marketed as a "laptop."

1985: Microsoft announces Windows, its answer to Apple's GUI, according to Encyclopaedia Britannica. Commodore introduces the Amiga 1000, a multimedia computer.

On March 15, 1985, the first dot-com domain name is registered, years before the World Wide Web. Symbolics Computer Company of Massachusetts registers Symbolics.com. Two years later, only 100 dot-coms had been registered.

1986: Compaq releases the Deskpro 386. Its 32-bit architecture rivals mainframe speed.

1990: Tim Berners-Lee, a researcher at CERN in Geneva, creates HTML, giving rise to the World Wide Web. By 1993, improved graphics and sound make the PC a multimedia machine, and in 1994 it becomes a gaming machine as Command & Conquer, Alone in the Dark 2, Theme Park, Magic Carpet, Descent, and Little Big Adventure hit the market.

1996: Sergey Brin and Larry Page develop the Google search engine at Stanford University.

1997: Microsoft invests $150 million in struggling Apple, terminating Apple's lawsuit
accusing Microsoft of copying its operating system's "look and feel."
1999: The term Wi-Fi enters the computing lexicon as users begin connecting to the Internet without wires.

2001: Apple releases Mac OS X, which has protected memory architecture and preemptive
multi-tasking. Microsoft responds with Windows XP, which has a completely revamped
GUI.

2003: AMD's Athlon 64 CPU becomes available to consumers.

2004: Mozilla's Firefox 1.0 challenges Internet Explorer, the dominant web browser. Facebook, a social networking site, launches. 2005: YouTube, a video-sharing service, is founded. Google buys Android, a Linux-based mobile phone operating system.

2006: Apple releases the MacBook Pro, its first Intel-based dual-core mobile computer, as well as an Intel-based iMac. Nintendo's Wii game console also hits the market. 2007: The iPhone brings many computer functions to the smartphone.

2009: Microsoft releases Windows 7, which includes taskbar pinning and improved touch and handwriting recognition, among other improvements. 2010: Apple debuts the iPad, giving the stagnant tablet computer category a new lease of life.

2011: Google releases the Chromebook, a Chrome OS laptop.

2015: Apple Watch released. Windows 10 is released.

2016: The first reprogrammable quantum computer is created. "Until today, no quantum-computing platform could program new algorithms into its system; they're usually designed to target specific algorithms," said Shantanu Debnath, a quantum physicist and optical engineer at the University of Maryland, College Park, who led the work.

2017: DARPA is developing a "Molecular Informatics" program that uses molecules as computers. Anne Fischer, program manager in DARPA's Defense Sciences Office, said chemistry offers a wealth of properties that might be used to store and process data quickly and efficiently: "There are millions of molecules, each with its own three-dimensional atomic structure and variable shape, size, and color. This diversity opens up new possibilities for encoding and processing data beyond the binary 0s and 1s of present logic-based digital architectures."
The earliest recorded instance of cybercrime occurred in the year 1820. The abacus, regarded as the earliest form of computer, had existed in India, Japan, and China since around 3500 B.C., but the era of modern computers began with Charles Babbage's analytical engine. In 1820, Joseph Marie Jacquard of France created the loom, a mechanism that allowed a series of steps in the weaving of special fabrics to be repeated. Jacquard's employees feared losing their traditional jobs and livelihoods, so they sabotaged the new technology to discourage its further use. This act of sabotage is considered the first recorded cybercrime.
