
COMPUTER ORGANIZATION
Name: Johann B. Tigno

Year/section: BSIT-1B

Instructor: Rodolfo R. Raborar

Activity:

1. What has been the trend in computing from the following points of view?

A. Cost of Hardware

Chip          Area (mm²)   Mfg. cost   Price    Multiplier   Comment
386DX         43           $9          $31      3.4          Intense competition
486DX2        81           $35         $245     7.0          No competition
PowerPC 601   121          $77         $280     3.6
DEC Alpha     234          $202        $1231    6.1          Recoup R&D?
Pentium       296          $473        $965     2.0          Early in shipments

B. Size of Memory

Technology - DRAM & SDRAM


Nature- Static & Dynamic
Cost- 8-16 times of a DRAM
-less
Transistors/bit- 4 to 6
-1
Refreshing- Not needed
- 2ms.typically

Capacity- less
- 4-8 times of a SDRAM

C. Speed of Hardware
A supercomputer is a computer with a high level of performance compared to a general-purpose
computer. The performance of a supercomputer is commonly measured in floating-point
operations per second (FLOPS) instead of million instructions per second (MIPS). Since 2017,
there have been supercomputers that can perform over a hundred quadrillion FLOPS (100 petaFLOPS).
Since November 2017, all of the world's 500 fastest supercomputers have run Linux-based operating
systems. Additional research is being conducted in China, the United States, the European Union,
Taiwan, and Japan to build faster, more powerful, and technologically superior exascale
supercomputers.
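
As a rough illustration of how a FLOPS figure can be estimated in practice, the minimal Python sketch below times a dense matrix multiplication (about 2*n^3 floating-point operations for n x n matrices) and divides the operation count by the elapsed time. The matrix size and the use of NumPy are assumptions chosen only for illustration, not part of the original activity.

    # Minimal sketch (assumes NumPy is installed): estimate achieved FLOPS
    # by timing a dense matrix multiplication, which performs roughly
    # 2 * n**3 floating-point operations for n x n matrices.
    import time
    import numpy as np

    n = 1000                                  # hypothetical matrix size
    a = np.random.rand(n, n)
    b = np.random.rand(n, n)

    start = time.perf_counter()
    c = a @ b                                 # ~2 * n**3 floating-point operations
    elapsed = time.perf_counter() - start

    flops = 2 * n**3 / elapsed
    print(f"Achieved roughly {flops / 1e9:.1f} GFLOPS")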

D. Number of processing elements


The Intel Pentium 4 was the first modern desktop processor to implement simultaneous
multithreading (SMT), starting with the 3.06 GHz model released in 2002, and the feature has
since been introduced into a number of Intel processors. Intel calls this functionality
Hyper-Threading Technology (HTT); it provides a basic two-thread SMT engine, and Intel claims
up to a 30% speed improvement compared with an otherwise identical, non-SMT Pentium 4.
Parallel computing, on the other hand, uses multiple processing elements simultaneously to solve
a problem. This is accomplished by breaking the problem into independent parts so that each
processing element can execute its part of the algorithm simultaneously with the others. The
processing elements can be diverse and include resources such as a single computer with multiple
processors, several networked computers, specialized hardware, or any combination of the above,
as illustrated by the sketch below.
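
As a minimal illustration of this idea, the Python sketch below splits a problem (summing the squares of a range of numbers) into independent chunks and hands each chunk to a separate worker process. The pool size and chunk boundaries are assumptions chosen only for illustration.

    # Minimal sketch of parallel computing: break a problem into independent
    # parts and let multiple processing elements work on them simultaneously.
    from multiprocessing import Pool

    def sum_of_squares(chunk):
        # Each worker handles one independent part of the problem.
        lo, hi = chunk
        return sum(i * i for i in range(lo, hi))

    if __name__ == "__main__":
        # Split the range 0..1,000,000 into four independent chunks (assumed sizes).
        chunks = [(0, 250_000), (250_000, 500_000),
                  (500_000, 750_000), (750_000, 1_000_000)]
        with Pool(processes=4) as pool:
            partial_sums = pool.map(sum_of_squares, chunks)
        print("Total:", sum(partial_sums))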

E. Geographical location of system components


The TatukGIS Developer Kernel (DK) is a professional-grade, general-purpose GIS software
development kit (SDK) used by customers in a wide range of industries to develop custom GIS
applications or add GIS functionality to existing products. A Developer Kernel product edition is
available for just about every development platform, providing the means to develop GIS
applications for just about any operating system.
2. Give the trend in computing over the last 20 years. What are your predictions for the future of
computing?

1. Big data- Big data begins at the point when your organization's data grows faster than your IT
department's ability to manage it. Computer department staffers used to leave work on time,
except maybe when they were extinguishing a fire or writing code. Now, data management is a
specialty field.
2. The Ericsson R380, the first phone running the Symbian OS, was released on March 18, 1999.
3. Transmeta released the Crusoe microprocessors on January 19, 2000. The Crusoe was
intended for laptops and consumed significantly less electricity than most microprocessors of the
time, while providing performance comparable to mid-range Pentium II microprocessors.
4. On February 8, 1999, VMware introduced the VMware Virtual Platform for the Intel IA-32
architecture.
5.
