Computer Processor History
3. GlobalFoundries -- Sanjay Kumar Jha: Sanjay Kumar Jha, who hails from Bihar, is now the CEO of GlobalFoundries, the world's first full-service semiconductor foundry. He was earlier the chief executive officer of Motorola's mobile devices business, and before that he served as the chief operating officer at Qualcomm. He has a PhD in electronics engineering from the University of Strathclyde, Scotland.
4. Adobe -- Shantanu Narayen: Shantanu Narayen, who is from Hyderabad, heads one of the world's leading software companies, Adobe. Narayen holds an electronics engineering degree from Osmania University, an MBA from the University of California, Berkeley, and a Master's in computer science from Bowling Green State University, Ohio.
5. SoftBank Internet and Media Inc -- Nikesh Arora: Nikesh was a telecom analyst at Putnam Investments. He graduated as an electrical engineer from IIT-BHU, and holds an MBA from Northeastern University and a Master's degree in Finance from Boston College.
6. Cognizant -- Francisco D'Souza: Francisco, who hails from Goa, is a member of the Board of Directors of General Electric Company. He serves on the Board of Trustees of Carnegie Mellon University, the Board of Trustees of The New York Hall of Science, and the Board of Trustees of the US-India Business Council. He received his Bachelor's degree in Business Administration from the University of East Asia and his MBA from Carnegie Mellon University.
7. Harman International -- Dinesh Paliwal: Dinesh is from Agra and earned his BE from the Indian Institute of Technology, Roorkee, followed by an MS in Applied Science and Engineering and an MBA in Finance from Miami University (Oxford, Ohio).
8. SanDisk Corporation -- Sanjay Mehrotra: Mehrotra received his Bachelor's and Master's degrees in Electrical Engineering and Computer Sciences from the University of California, Berkeley, and is a graduate of the Stanford Graduate School of Business Executive Program (SEP).
9. Nokia -- Rajeev Suri: Rajeev has a Bachelor of Engineering (Electronics and Communications) from Manipal Institute of Technology, India. Born in India in 1967, he is a Singaporean citizen and is based in Espoo, Finland.
10. NetApp -- George Kurian: Prior to joining NetApp, George was vice president and general manager of the Application Networking and Switching Technology Group at Cisco Systems. He did his schooling at St Joseph's Boys High School, Bengaluru, and then joined IIT Madras, but left to pursue a degree in electrical engineering from Princeton University and a master's degree in business administration from Stanford University.
As computational power rises exponentially, not linearly, so does the rate of change -- and that means the next 10
years should pack in far more technological change than the last 10.
Disruptive technology is, by its very nature, unpredictable, but it is still possible to look at the work being done by
R&D labs around the world and see clues as to what the future holds. That's the full-time job of Dave Evans,
Cisco's chief futurist and chief technologist for the Cisco Internet Business Solutions Group (IBSG).
At Cisco Live, Evans outlined what he believed to be the top 10 trends that will change the world in 10 years. Here
is his list, with commentary augmented by yours truly based on interviews in the past year with numerous other
industry analysts and visionaries.
No. 1: The Internet of Things
We have passed the threshold where more things are connected to the Internet than people. The transition to IPv6
also supports seemingly limitless connectivity. Cisco IBSG predicts the number of Internet-connected things will
reach 50 billion by 2020, which equates to more than six devices for every person on Earth. Many of us in the
developed world already have three or more full-time devices connected to the Internet when factoring in PCs,
smartphones, tablets, television devices and the like. Next up are sensor networks, using low-power sensors that
"collect, transmit, analyze and distribute data on a massive scale," says Evans.
Such sensors, based on standards like Zigbee, 6LoWPAN and Z-Wave, are currently being used in both predictable
and surprising ways. Zigbee is being embedded in smart appliances and smart meters. 6LoWPAN (over IPv6) is
used by Vint Cerf for his wine cellar climate-monitoring system. Z-Wave is the basis for Verizon's smart home
automation service. But more creative uses are emerging, too. Sparked, a Dutch startup, implants sensors in the ears
of cattle to monitor the cows' health and whereabouts. Sensors are being embedded in shoes, in medical devices such as asthma inhalers, and in exploratory surgical instruments. There's even a tree in Sweden wired with sensors that tweets its
mood and thoughts, with a bit of translation help from an interpretive engine developed by Ericsson
(@connectedtree or #ectree).
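To make the "collect, transmit, analyze and distribute" pattern concrete, here is a minimal, purely illustrative sketch of a low-power sensor node that samples locally and transmits only periodic summaries. The class, field names and thresholds are hypothetical, not taken from Zigbee, 6LoWPAN or Z-Wave.

```python
import random
import statistics

class SensorNode:
    """Toy model of a low-power node: sample locally, transmit summaries rarely."""

    def __init__(self, node_id: str, batch_size: int = 60):
        self.node_id = node_id
        self.batch_size = batch_size
        self.readings: list[float] = []

    def sample(self) -> None:
        # Stand-in for reading a real temperature sensor.
        self.readings.append(20.0 + random.uniform(-2.0, 2.0))

    def maybe_transmit(self) -> dict | None:
        # Radio use dominates power draw, so send only a compact summary per batch.
        if len(self.readings) < self.batch_size:
            return None
        summary = {
            "node": self.node_id,
            "mean_c": round(statistics.mean(self.readings), 2),
            "max_c": round(max(self.readings), 2),
        }
        self.readings.clear()
        return summary

node = SensorNode("barn-cow-17", batch_size=5)
for _ in range(5):
    node.sample()
print(node.maybe_transmit())
```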
About 5 exabytes of unique information were created in 2008. That's 1 billion DVDs. Fast forward three years and
we are creating 1.2 zettabytes, with one zettabyte equal to 1,024 exabytes. "This is the same as every person on
Earth tweeting for 100 years, or 125 million years of your favorite one-hour TV show," says Evans. Our love of
high-definition video accounts for much of the increase. By Cisco's count, 91% of Internet data in 2015 will be
video.
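The DVD comparison is easy to verify. The sketch below redoes the arithmetic, assuming 4.7 GB per single-layer DVD (the DVD capacity is my assumption; the 5-exabyte and 1.2-zettabyte figures come from the article).

```python
# Checking the "5 exabytes is about 1 billion DVDs" comparison.
EXABYTE = 10**18                  # decimal exabyte, in bytes
ZETTABYTE = 1024 * EXABYTE        # the article's 1 ZB = 1,024 EB convention
DVD_BYTES = 4.7e9                 # assumed single-layer DVD capacity

data_2008 = 5 * EXABYTE
data_2011 = 1.2 * ZETTABYTE

print(f"2008: ~{data_2008 / DVD_BYTES:.2e} DVDs")  # ~1.06e+09, i.e. about 1 billion
print(f"2011: ~{data_2011 / DVD_BYTES:.2e} DVDs")  # ~2.6e+11, about 245x the 2008 figure
```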
Much of Cisco's development focus (not to mention its marketing) preaches that the so-called "zettaflood" will
require vastly improved networks to move more data, and not drop the ball (or the packets) of our beloved video.
Much of the zettaflood of data will be stored in the cloud. Certainly, most of it will be accessed through the cloud rather than only over private networks. By 2020, one-third of all data will live in or pass through the cloud, Cisco
predicts. Global cloud services revenue will jump 20% per year, and IT spending on innovation and cloud
computing could top $1 trillion by 2014. That's enough to create the next Google. "Already, the cloud is powerful
enough to help us communicate through real-time language translation, increase our knowledge from access to
powerful supercomputers such as Wolfram Alpha, and improve our health using computing platforms like IBM's
Watson in new ways," says Evans. "We're able to communicate in much richer ways."
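A 20% annual growth rate compounds quickly. The sketch below projects revenue over a decade at that rate; the $100B starting point is a made-up placeholder for illustration, not a figure from the article.

```python
# Illustrative compound growth at the article's 20% annual rate.
def project(revenue: float, annual_growth: float, years: int) -> list[float]:
    values = [revenue]
    for _ in range(years):
        values.append(values[-1] * (1 + annual_growth))
    return values

for year, value in enumerate(project(100.0, 0.20, 10)):   # hypothetical $100B base
    print(f"year {year}: ${value:,.0f}B")
# At 20% per year, revenue roughly doubles every 4 years (1.2**4 ~= 2.07)
# and grows about 6.2x over a decade (1.2**10 ~= 6.19).
```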
In addition to video, the computing power of the cloud delivered to endpoint devices changes our ability to
communicate with things like real-time translation. Right now, the voice search on an Android phone sends the
query to the Google cloud to decipher and return results. "We'll see more intelligence built into communication.
Things like contextual and location-based information."
With an always-connected device, the network can be more granular with presence information, tapping into a
personal sensor to know that a person's asleep, and route an incoming call to voicemail. Or knowing that person is
traveling at 60 mph in a car, and that this is not the time for a video call. (Of course, by then, we'll probably all be
using driverless Google cars, and be free to chat while our cars drive us around.)
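A presence-aware call router of the kind described here could be as simple as a few rules over sensor-derived state. The sketch below is a hypothetical illustration; the field names and thresholds are assumptions, not anything Cisco has published.

```python
from dataclasses import dataclass

@dataclass
class Presence:
    asleep: bool
    speed_mph: float          # e.g., derived from a phone's GPS
    incoming_is_video: bool

def route_call(p: Presence) -> str:
    """Toy presence-based routing rules, in the spirit of the paragraph above."""
    if p.asleep:
        return "voicemail"            # don't wake the callee
    if p.speed_mph >= 60 and p.incoming_is_video:
        return "offer audio only"     # likely driving; not the time for video
    return "ring device"

print(route_call(Presence(asleep=True, speed_mph=0, incoming_is_video=False)))   # voicemail
print(route_call(Presence(asleep=False, speed_mph=65, incoming_is_video=True)))  # offer audio only
```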
Evans talks about his home as an example of the speed of network improvements. Network performance has
increased by 170,000 times since 1990, when he had just one telnet connection.
Today, Evans has 38 always-on connections and more than 50Mbps of bandwidth, enough for telepresence,
streaming movies and online games at the same time. Over the next 10 years, Evans expects the speed to his home
to increase by 3 million times.
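Taken at face value, those multipliers imply startling compound annual growth rates. The sketch below back-computes them; the 21-year span from 1990 to the article's publication is my assumption.

```python
# Implied compound annual growth from the article's multipliers.
def implied_cagr(multiplier: float, years: int) -> float:
    return multiplier ** (1 / years) - 1

past = implied_cagr(170_000, 21)      # 1990 to ~2011, assumed 21-year span
future = implied_cagr(3_000_000, 10)  # Evans's next-decade projection

print(f"~{past:.0%} per year since 1990")         # roughly +77% per year
print(f"~{future:.0%} per year for the next 10")  # roughly +344% per year
```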
While most of the industry is focused on 40G and 100G, whole new forms of networks are also being created. Vint
Cerf discusses the new protocols needed to build an interplanetary network, which can send data across vast distances despite extreme latency and intermittent connectivity. Evans notes that multiterabit networks using lasers are being explored. And
early work is happening on a concept called "quantum networking," based on quantum physics. This involves
"quantum entanglement" in which two particles are entangled after which they can be separated by any distance,
and when one is changed, the other is also instantly changed. Production quantum networks are likely decades in
the future.
With always-on connectivity, social networking has the power to change cultures, as we saw with the Egyptian Revolution and the broader Arab Spring. Social influences will continue to move rapidly between cultures.
A smaller world also means faster information dissemination. "Tweets from people in Japan during the recent
earthquake were sent to followers even before the U.S. Geological Survey could issue its official tsunami warning
to Alaska, Washington, Oregon and California," says Evans.
The capture, dissemination and consumption of events are going from "near time" to "real time." This in turn will
drive more rapid influence among cultures.
The human population also continues to grow, and Evans estimates that a city with 1 million inhabitants will be
built every month over the next two decades. More efficient methods to power those cities are becoming a
necessity, particularly solar energy.
"Solar alone can meet our energy needs. In fact, to address today's global demand for energy, 25 solar super sites --
each consisting of 36 square miles -- could be erected. Compare this to the 170,000 square kilometers of forest area
destroyed each year," says Evans. Such a solar farm could be completed in just three years.
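The land-area comparison in that quote is easy to work through. The sketch below converts 25 sites of 36 square miles each into square kilometers and compares the total with the 170,000 square kilometers of annual forest loss cited; only the miles-to-kilometers conversion factor is added here.

```python
# Working through the solar "super site" comparison from the quote above.
SQ_MILE_TO_SQ_KM = 2.589988

sites = 25
site_area_sq_mi = 36
total_sq_km = sites * site_area_sq_mi * SQ_MILE_TO_SQ_KM

print(f"Total solar footprint: ~{total_sq_km:,.0f} km^2")            # ~2,331 km^2
print(f"Share of annual forest loss: {total_sq_km / 170_000:.1%}")   # ~1.4%
```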
Technologies to make this more economically pragmatic are on their way. In June, Oregon State University
researchers showed off a novel, relatively affordable, low-impact method to "print" solar cells using an inkjet
printer.
More items will move from physical to virtual. Today, we download e-books and movies, rather than bound books
and DVDs. A technology called 3D printing will allow us to instantly manufacture any physical item, from food to
bicycles, using printer technology. This is strikingly like the replicator concept from "Star Trek."
"3D printing, or additive manufacturing, is the process of joining materials to make objects from 3D model data,
usually layer upon layer," says Evans.
Already, things ranging from toys to cars to living structures are being printed, and because the process adds layers of material on top of one another, objects come out fully assembled and even decorated. One example is an actual working bicycle created entirely by a 3D printer.
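As a toy illustration of the "layer upon layer" idea, the sketch below slices a part of a given height into the discrete layers an additive process would deposit; the part height and layer thickness are made-up numbers, not from the article.

```python
# Toy slicer: how many layers does an additive process deposit for a given part?
def slice_layers(part_height_mm: float, layer_height_mm: float) -> list[float]:
    """Return the z-height of each deposited layer, bottom to top."""
    count = round(part_height_mm / layer_height_mm)
    return [round((i + 1) * layer_height_mm, 3) for i in range(count)]

layers = slice_layers(part_height_mm=50.0, layer_height_mm=0.2)
print(f"{len(layers)} layers, topmost at {layers[-1]} mm")  # 250 layers, topmost at 50.0 mm
```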
In the not-too-distant future, we will be able to print human organs," says Evans. In March, Dr. Anthony Atala from
the Wake Forest Institute for Regenerative Medicine printed a proof-of-concept kidney mold onstage at TED. It
was not living tissue, but the point was well made even so.
Trend 8: Another family tree
Virtual humans, both physical (robots) and online (avatars), will be added to the workforce. "Already, animated characters can recognize speech, convert text to speech, and have knowledge of previous encounters," says Evans.
By 2020, robots will be physically superior to humans. IBM's Blue Brain project, for instance, is a 10-year mission
to create a human brain using hardware and software. "They believe that within a decade they'll start to see
consciousness emerge with this brain," Evans says.
By 2025, the robot population will surpass the number of humans in the developed world. By 2032, robots will be
mentally superior to humans. And by 2035, robots could completely replace humans in the workforce.
Beyond that, we'll see the creation of sophisticated avatars. Evans points to IBM's Watson as a template for the
virtual human. Watson was able to answer a question by returning a single, accurate result. A patient may use a
virtual machine instead of a WebMD search. Or hospitals can augment patient care with virtual machines.
Between now and then, augmented reality and gesture-based computing will enter our classrooms, medical facilities
and communications, and transform them as well. "Already, machine vision enables users to take a picture of a
Sudoku puzzle with their smartphone and have it solved almost immediately," he notes.
"We think nothing of using pacemakers," Evans points out. In the next 10 years, he believes medical technologies
will grow vastly more sophisticated as computing power becomes available in smaller forms. Devices such as
nanobots and the ability to grow replacement organs from our own tissues will be the norm. "The ultimate
integration may be brain-machine interfaces that eventually allow people with spinal cord injuries to live normal
lives," he says.
Today we have mind-controlled video games and wheelchairs, software by Intel that can scan the brain and tell what you are thinking, and tools that can actually predict what you are going to do before you do it.
According to Stephen Hawking, "Humans are entering a stage of self-designed evolution." Taking the medical
technology idea to the next level, healthy humans will be given the tools to augment themselves. Evans offers the
following examples:
October 2009 -- Italian and Swedish scientists develop the first artificial hand with feeling.
June 2011 -- Texas Heart Institute develops a "spinning" heart with no pulse, no clogs and no breakdowns.
While the early use of these technologies will be to repair unhealthy tissue or fix the consequences of brain injury,
eventually designer enhancements will be available to all.
Ultimately, humans will use so much technology to mend, improve or enhance our bodies that we will become the Borg. Futurist Ray Kurzweil is pioneering this idea with a concept he calls the singularity, the point at which man and machine merge and become a new species. (Kurzweil says this will happen by 2045.) Evans is not convinced about the singularity, particularly in Kurzweil's time frame, but he is affiliated with Singularity University in Mountain View, finds the data plausible, and agrees that we are on that trajectory.
1982
IBM PC: Computers as a low-cost assemblage of electronic Lego parts made every neighborhood electronics geek a
computer technician and every small office and home work room a data center.
RELATIONAL DATABASES: The second generation of relational database management systems began to take hold.
1983
GPS/GIS: The Global Positioning System was opened for use by civilian aircraft in 1983, beginning a trend that -- combined with great advances in geographic information systems and mapping tools -- led to agency data visualized in layered maps and cars telling their drivers where to turn.
1984
CD-ROM for computers: Flattened two entire industries, data storage and music dissemination.
Its successor, the DVD (1996), killed off the video tape.
FLASH MEMORY: Invented in 1984 at Toshiba, it found its place in small devices.
Smartphones, digital cameras and other devices (and, soon, laptops) all rely on flash memory.
1985
NETWORK FILE SYSTEM: The file system that brought us to the age of network storage. No longer would your data be hostage to the computer in which it was created -- or to backup tape.
1987
POWERPOINT: The one you love to hate. All the knowledge in the world boiled down to easy, succinct, bullet-
pointed meaninglessness.
PERL: God's own duct tape, at least when working in Unix-based systems.
1989
WORLD WIDE WEB: Invented by Tim Berners-Lee, it would soon change the way governments, business and
people operate.
1990
SLIP/PPP (Serial Line Internet Protocol and Point-to-Point Protocol): We've forgotten about this now, but SLIP/PPP -- mostly PPP -- is what got everyone on the Internet via dial-up modems back when broadband was an obscure industry term.
1991
LINUX: A Unix knockoff that is the world's largest hobby project for coders. A select few are among the world's
best.
HYPERTEXT MARKUP LANGUAGE: You send the instructions to the remote computer and let it figure out how to render the layout, dummy!
PCI SLOTS: Rumors are unconfirmed that the national boost in technology productivity came from the thousands of admins who no longer had to fiddle with the IRQ settings each time they installed a new peripheral.
GRAPHICS COPROCESSORS: They made the fancy stuff possible by pulling graphics data away from the CPU
and eventually gave rise to separate graphics cards.
1992
THE BROWSER: It made the Web work for the rest of us.
1993
E-MAIL: Electronic mail goes back to the 1960s, but it really started taking off with Web use. By 1997, the volume
of business e-mail surpassed that of regular mail.
ADOBE PDF: Lawyers and other control freaks love it! Also, it was perhaps the first truly effective document-
sharing technology.
1994
BEOWULF (LINUX) CLUSTERS: Changed the supercomputing industry with cheap hardware and an open-source
operating system.
1995
WINDOWS 95: 32-bit pre-emptive multitasking made possible everything that has come along for the desktop since -- including the graphical Internet and Mac OS X.
LIGHTWEIGHT DIRECTORY ACCESS PROTOCOL: The universal administrative assistant (mostly in the form of Microsoft Outlook/Exchange) for the cubicled middle rank -- and a nursemaid for their bosses.
WIKIS: They may have taken a while to catch on, but wikis are becoming a dominant collaboration tool.
IPV6: The newest set of protocols makes tomorrow's online dreams possible.
1996
UNIVERSAL SERIAL BUS: Got all the device manufacturers to settle on one device bus. Cats, meet herder.
MP3 AUDIO FORMAT: A file format that pretty much leveled an entire industry -- and movies are next.
FLASH: Scripting your Web page like a movie, or anything else, with almost zero client footprint.
1997
BROADBAND: Cable and Digital Subscriber Lines start to make an appearance in homes, and telecommuting
becomes a real option.
1998
GOOGLE: We'd call it the portal to the Web, except portals aren't this easy to use. The search bar is rapidly becoming the sippy cup of culture -- with more than partial thanks to Wikipedia, Google's query shortstop.
EXTENSIBLE MARKUP LANGUAGE: Data that tells us what our data is. But this data is in brackets, so we
know what it means, more or less.
1999
WI-FI: The network computer libre!
BLACKBERRY: Life support for your government executive, with its push technology making the difference.
VIRTUALIZATION FOR X86 ARCHITECTURES: Making the most of what you have.
2002
2003
SERVICE-ORIENTED ARCHITECTURE: SOA and Web services pave the way for a new generation of online government services.
2004
ADOBE FLEX: Flash development, open-sourced in 2007, for Rich Internet Applications.
2005
MULTICORE PROCESSORS: More performance, less energy use; a wave of the future.
2007
FACEBOOK API/GOOGLE OPEN SOCIAL API: Social network programming goes mainstream.
SPECIAL JUDGE'S AWARD: Evolving technologies for programmer nutrition: foods that can be eaten with one
hand, such as Doritos with salsa; plus remote teleworking at Starbucks with a double-shot latte and raspberry
muffin.