
Sumerian

Sumerian was spoken in Sumer in southern Mesopotamia (part of modern Iraq) from perhaps the 4th millennium BC until about 2000 BC, when it was replaced by Akkadian as a spoken language, though it continued to be used in writing for religious, artistic and scholarly purposes until about the 1st century AD. Sumerian is not related to any other known language, so it is classified as a language isolate.

Sumerian cuneiform

Sumerian cuneiform is the earliest known writing system. Its origins can be traced back to about 8,000 BC, and it developed from the pictographs and other symbols used to represent trade goods and livestock on clay tablets. Originally the Sumerians made small tokens out of clay to represent the items. The tokens were kept together in sealed clay envelopes, and in order to show what was inside the envelopes, they pressed the tokens into the clay on the outside.

Egyptian Hieroglyphs

The word “hieroglyph” comes from the Greek hieros (sacred) and glyphos (words or signs) and was first used by Clement of Alexandria (c. 150 – 230 AD). The ancient Egyptians called them mdju netjer, or “words of the gods.”

The first known examples of hieroglyphic writing in ancient Egypt were discovered on bone and ivory tags, pottery vessels and clay seal impressions found in a pre-dynastic tomb in Abydos. The deposits are dated to between 3400 and 3200 BC. The last known example is from the temple of Philae and is dated to 394 AD. Knowledge of the hieroglyphic script was lost until the early 19th century, when Jean-François Champollion (1790-1832) used the Rosetta Stone to decipher the script.

We know little about the early development of Egyptian hieroglyphs. The early signs found in Abydos already include short phrases, and no earlier signs have been discovered. It is possible there was only a short developmental phase before the emergence of the fully fledged script, but some have used the lack of information to argue that the ancient Egyptians learned writing from a lost civilisation or aliens. More reasonable scholars suggest that the ancient Egyptians borrowed the idea of writing (but not the form) from Mesopotamia. However, inscriptions dating from the reign of King Scorpion (roughly 3400 BC) have led many to suggest that Egyptian hieroglyphs may have preceded the Mesopotamian scripts.

Oracle Bones (also known as Dragon's Bones) were the shoulder blades of oxen or plastrons of turtles (the flat, underside of the
turtle's shell) which were used in the Shang Dynasty of China (c. 1600-1046 BCE) for divination. A fortune-teller would carve (later,
paint) symbols on the bones of the ox or the turtle shell, apply a hot poker or fire until the bone or shell cracked, and then interpret the direction of the crack in relation to the carved symbols to predict the future. Eventually, the symbols became words and a recognizable
Chinese script developed from this practice.

Most of the oracle bones discovered come from the Shang Dynasty, but some date from the early Zhou Dynasty (1046-256 BCE). The
practice of telling the future through oracle bones is known as scapulimancy (telling the future through the scapula, the shoulder
bone, of an animal), plastromancy (using a turtle's plastron) or pyromancy (the use of fire). These methods all declined when the
book known as the I-Ching (a fortune-telling manual which uses hexagrams and yarrow sticks) became more popular in
the Zhou Dynasty.

Oracle bones continued to be used in later dynasties but not as regularly as during the Shang. These bones are important primary
sources on the history of the Shang Dynasty and gave birth to Chinese script. Historian Harold M. Tanner writes, "oracle bones are
the earliest written records of Chinese civilization. The inscriptions give us a highly selective picture of some of the concerns and
events that were relevant to the Shang elite. The earliest of these records date to the reign of King Wu Ding in the late Shang (40)."
Even though everyone was interested in what the future held, questions from the wealthier classes in China make up the majority of
the inscriptions. This is probably because they could afford to consult the psychics more often than the poor.

Use of Oracle Bones

The desire to know the future has been a constant in human history, and the people of China during the Shang Dynasty were no different in this respect from people today. Fortune-telling during the Shang Dynasty was considered an important resource in
making decisions, and these 'psychics' were consulted by everyone from the farmer to the king. These fortune-tellers were thought
to be in touch with the spirit world of the ancestors who lived with the gods and knew the future. These spirits would communicate
with the psychics through the oracle bones. Each fortune-teller had his or her area of expertise (love, money, work, etc) but could
answer questions on any topic.

Fortune-tellers either got the bones and shells themselves (and prepared them) or bought them from a merchant who scraped and
cleaned them. The bones/shells were then kept in the fortune-teller's shop. If someone wanted to know whether they should take
their cattle to market, or go visit a friend on a certain date, they would visit a fortune-teller who could predict how well their plans
would work out.

The person would ask the fortune-teller a question like "Should I bring my oxen to market next month?" and the fortune-teller
would carve the symbols for the person, oxen, the next month, and maybe a later month on the turtle shell or bone. A hole would be
drilled in the object and a hot poker would be applied, or the shell/bone placed near fire until it cracked. If the crack went one way it
would mean the person should go to market with the animals, and if it went another way they should wait.

People relied on these fortune-tellers to help them make decisions about all kinds of choices in their lives from matchmaking to
having children, to travel and financial decisions, and even to making war. People consulted fortune-tellers back then the same way
they check their horoscopes in modern newspapers or the internet today to see what the day holds in store. There were symbols
carved on the bones which meant 'good day' and 'bad day', and a person could consult a fortune teller in the morning to see which
kind of day they had in front of them. The oracle bones found thus far date from c. 1250-1046 BCE and give all kinds of important
information about the Shang Dynasty.

Discovery of Oracle Bones

The Shang Dynasty was replaced by the Zhou who still used oracle bones but mainly relied on the I-Ching and other methods of
telling the future. The capital of the Shang Dynasty, modern Anyang, was renovated by the Zhou and the areas of the bone
workshops and places of divination were neglected.

In 1899 CE the Chancellor of the Imperial Academy, Wang Yirong (1845-1900 CE), became sick with malaria. He asked his doctor
for medicine and was sent to an apothecary for the best-known remedy: dragon's bones. This medicine was supposed to be made
from the ancient bones of dragons and had mystical properties for healing. Taking a dose of dragon's bones at that time would be like taking aspirin or a prescription drug today, and the apothecaries, doctors, and suppliers all made money off
this medicine, which was always given to patients in its ground up, powdered form.

When Wang Yirong got his dragon bones, though, they were not ground up. On the night this happened, he had a friend visiting
named Liu E (1857-1909 CE) who examined the dragon's bones with him. They were both interested in palaeography (the study of
ancient writing) and noticed that these bones seemed to be covered in ancient Chinese script.

Wang Yirong and Liu E went to the apothecary to find out where he got these bones but the man would not tell them. He agreed to
sell them whatever unground bones he had in his shop, though, and Wang and Liu told other people about their discovery. Scholars
quickly became interested in finding out where these bones were coming from but the apothecaries and bone dealers were not
about to tell them; they were making too much money off the medicine.

Papyrus was used as a food source, to make rope, for sandals, for boxes and baskets and mats, as window shades, material for
toys such as dolls, as amulets to ward off throat diseases, and even to make small fishing boats.
Small Seal Script (Chinese: 小篆, xiǎozhuàn), formerly romanized as Hsiao-chuan and also known as Seal Script, Lesser Seal
Script and Qin Script (秦篆, Qínzhuàn), is an archaic form of Chinese calligraphy. It was standardized and promulgated as a
national standard by Li Si, prime minister under Shi Huangdi, the First Emperor of Qin.
Before the Qin conquest of the six other major warring states of Zhou China, local styles of characters had evolved independently of
one another for centuries, producing what are called the "Scripts of the Six States" (六國文字), all of which are included under the
general term "Great Seal Script". Under one unified government, however, the diversity was deemed undesirable as it hindered
timely communication, trade, taxation, and transportation, and as independent scripts might be used to represent dissenting political
ideas.
Hence, Emperor Qin Shi Huang mandated the systematic unification of weights, measures, currencies, etc., and the use of a
standard writing script. Characters which were different from those found in Qin were discarded, and the Qin's small seal characters
became the standard for all regions within the empire. This policy came in about 220 BC, the year after Qin's unification of the
Chinese states.[1]
The standardized use of small seal characters was promulgated via the Cangjiepian, a primer compiled by Li Si and two other
ministers. This compilation, stated to contain 3,300 characters, is no longer extant, and is known only through Chinese
commentaries through the centuries. Several hundred characters from fragmented commentaries were collected during the Qing
period, and recent archeological excavations in Anhui, China, have uncovered several hundred more on bamboo strips, showing the
order of the characters; however, the script found is not the small seal script, as the discovery dates from Han times.

A codex is essentially an ancient book, consisting of one or more quires of sheets of papyrus or parchment folded together to form a group of leaves, or pages.

A codex (from the Latin caudex, meaning "trunk of a tree", “block of wood” or “book”), plural codices (/ˈkɒdɪsiːz/), is a book
constructed of a number of sheets of paper, vellum, papyrus, or similar materials. The term is now usually only used
of manuscript books, with hand-written contents,[1] but describes the format that is now near-universal for printed books in the
Western world. The book is usually bound by stacking the pages and fixing one edge to a spine, which may just be thicker paper
(paperback or softback), or with stiff boards, called a hardback, or in elaborate historical examples a treasure binding.
At least in the Western world, the main alternative to the paged codex format for a long document is the continuous scroll, which
was the dominant form of document in the Ancient World. Some codices are continuously folded like a concertina, in particular
the Maya codices and Aztec codices, which are actually long sheets of paper or animal skin folded into pages. These do not really
meet most current definitions of the "codex" form, but are so called by convention.
The Romans developed the form from wax tablets. The gradual replacement of the scroll by the codex has been called the most
important advance in book making before the invention of the printing press.[2] The codex transformed the shape of the book itself,
and offered a form that lasted until the present day (and continues to be used alongside e-paper).[3] The spread of the codex is often
associated with the rise of Christianity, which adopted the format for use with the Bible early on.[4] First described by the 1st-century
AD Roman poet Martial, who praised its convenient use, the codex achieved numerical parity with the scroll around 300 AD,[5] and
had completely replaced it throughout what was by then a Christianized Greco-Roman world by the 6th century.[6]
Paper was first invented in China during the Han dynasty around 105 AD by a government worker called Cai Lun. He developed a
way to make paper using the bark of trees and rags of cloth. Paper was made by creating a mix of bark and rags that would float on
water.
1455-JOHANNES GUTENBERG

The invention of mass printing practices changed our world and the print invention is regarded by many as the
invention of the millennium.

Before Gutenberg, books were either copied out by hand on scrolls and paper, and even a small book could take months to complete, or printed from hand-carved wooden blocks, each block printing a whole page, a part of a page or even individual letters. But the woodwork was extremely time-consuming, the carved letters or blocks were very fragile, and the susceptibility of wood to ink gave such blocks a limited lifespan. Moreover, the hand-carved letters varied from copy to copy.

Gutenberg is generally credited with the invention of practical movable type. He made metal moulds, by the use of dies, into which he could pour hot liquid metal in order to produce separate letters of the same shape as those written by hand. These letters were more uniform, more readable, and more durable than wooden blocks. Such letters could be arranged and rearranged as many times as the printer wished to create different pages from the same letters.

Gutenberg also introduced the use of a printing press to press the type against paper. For this he used a hand press of the kind used in his time by the wine industry. Ink was rolled over the raised surfaces of the hand-set letters held within a wooden frame, and the frame was then pressed against the paper. The press enabled sharp impressions on both sides of a sheet of paper and many repetitions. After a page was printed, the type could be reused for printing other pages.

Voted the Man of the Millennium for his invention, Gutenberg and his famous printing press revolutionized the world. Thousands of copies of books could be made and sold all across the world, and the immense value of that spread of information cannot be overstated. Born in 1398 in Germany (Lehmann-Haupt), Gutenberg was an inventor by trade and had amassed a large array of investors to begin work on his new project: the movable type printer. Though movable type had already been invented by the Chinese, it had not yet reached Europe, and a new machine using metal blocks of letters all held together had yet to be created (“Gutenberg’s Millenium”). Gutenberg’s complex machine consisted of a frame which could hold the letters (made of a durable alloy Gutenberg had created himself) arranged in a specific way, to which ink was applied to print the passage onto paper. Gutenberg’s 42-line Bible had been produced by 1455 and it, along with his invention, was in circulation all across Europe.

The printing press revolutionized attitudes toward knowledge and learning, and “the questioning of received wisdom, greatly increased. In many ways, the communal medieval world gave way rapidly to the modern world and its focus on the individual” (“Gutenberg’s Millenium”). The Renaissance, which is defined as a rebirth of cultural attitudes and a thirst for artistry and knowledge, was greatly impacted by Gutenberg’s invention. Without it, the Renaissance would have touched only the nobility, who could afford to read and own books written by famous Renaissance artists. However, the printing press enabled everyone across Europe to have the same access to great works of Renaissance literature, spreading and magnifying the overall effects of the Renaissance even further. The printing press allowed works such as Machiavelli’s The Prince and even Martin Luther’s 95 Theses to spread across Europe, and it was this spread of ideas and culture that encapsulates the Renaissance.

1755---Samuel Johnson's 'Dictionary of the English Language' is one of the most famous dictionaries in history. First published
in 1755, the dictionary took just over eight years to compile, required six helpers, and listed 40,000 words.

The Library of Congress was subsequently established April 24, 1800, when President John Adams signed an act of Congress providing for the transfer of the seat of government from Philadelphia to the new capital city of Washington.

The concept of carbon-arc lighting was first demonstrated by Humphry Davy in the early 19th century, but sources disagree about the year he first demonstrated it; 1802, 1805, 1807 and 1809 are all mentioned. Hertha Ayrton later studied the electric arc in detail; her paper was "The Hissing of the Electric Arc".

In 1824 Peter Roget introduced the concept of how the brain sees individual images as a sequence of motion. Persistence of vision is the phenomenon of the eye by which an afterimage is thought to persist for approximately one twenty-fifth of a second on the retina.

The English inventor Charles Babbage, however, is generally credited with having conceived the first automatic digital computer. During the 1830s Babbage devised his so-called Analytical Engine, a mechanical device designed to combine basic arithmetic operations with decisions based on its own computations.

People love legends, even in the world of computers. And Ada Lovelace is the perfect person for such a legend.

Augusta Ada Byron (young Augusta's name was changed to Ada in order to avoid association with her aunt, for whom she had been named) was born on 10 December 1815, the only legitimate child of the British Romantic poet George Gordon, Lord Byron. Her mother was Anne Isabella Milbanke, who took the one-month-old baby away from her father's home. Ada never saw her father again; he died when she was eight.

Ada received an education that was rather uncommon for a woman of those times, as her mother, who had studied mathematics herself, decided that her daughter would be spared the father's eccentricities by studying more logical subjects like math and science, rather than literature or poetry. Young Ada showed a genius for math from an early age. Her tutors included William Frend and Mary Somerville. She also learned music, drawing and languages, and became fluent in French.

In 1835 Ada Byron married William King. In 1838 her husband became the first Earl of Lovelace, and Ada became Countess of Lovelace. They had three children.
Ada Lovelace unknowingly developed an addiction to prescribed drugs including laudanum, opium and morphine, and displayed
classic mood swings and withdrawal symptoms. She took up gambling and lost most of her fortune. She was suspected of an affair
with a gambling comrade.

Ada died young (in 1852) of uterine cancer and bloodletting by her physicians and was buried next to her famous father.

It is beyond question that she was an outstanding woman. But what is her place in computer science? Some people used to call her the World's First Computer Programmer, and even the Founder of Scientific Computing. In 1980, the U.S. Department of Defense settled on the name Ada (in honor of Lady Lovelace) for a new standardized computer language.

The truth is somewhat different—Ada Lovelace has a rather modest place in the world of computers. She was in the right place at the
right time and reflected some of the light of one of the greatest persons in the world of computers—Charles Babbage.

Anticipating the modern general-purpose programmable computer by almost a century, Babbage designed his Analytical Engine in the 1830s. He was not a diligent writer, however, especially when it came to documenting his ideas (he was in fact a rather careless and impractical man, which led to some friction between him and Ada Lovelace during their later joint work), so we know little of Babbage's programming ideas.

In August 1840, Babbage visited Turin in Italy and gave a series of seminars on the Analytical Engine at the Academy of Sciences. One of the listeners—the Italian engineer and professor Federico Luigi Menabrea (1809-1896), who would later become Prime Minister of Italy—wrote up the lectures, modified with ideas from the discussions, in the paper Notions sur la Machine Analytique de M. Charles Babbage, which was published in French in the Bibliothèque Universelle de Genève in October 1842. This paper was the first extensive publication in the world on computers and programming.

Ada Byron met Charles Babbage in 1833 and initially became interested in the model he had constructed of his famous Difference Engine. Soon Babbage became something like a mentor to the young lady (and vice versa, since Ada was an extraordinary celebrity, and later, as the wife of a prominent aristocrat, she was in a position to act as a patron to Babbage and his engines, though she never in fact did so), and he helped Ada to begin mathematical studies with the great mathematician Augustus de Morgan in 1840 at the University of London. Babbage was impressed by Lovelace's intellect and writing skills. He called her The Enchantress of Numbers, and in 1843 he wrote:
Forget this world and all its troubles and if
possible its multitudinous Charlatans—every thing
in short but the Enchantress of Numbers.

After the publication of Menabrea's paper in 1842, Babbage was asked to write a paper for two British scientific journals (The Ladies' Diary and Taylor's Scientific Memoirs). He therefore decided to ask Ada Lovelace to translate Menabrea's article into English and to append extensive notes to the translation, prepared under Babbage's close guidance. These notes deal with the familiar modern ideas of flow of control in programs, particularly the formulation of simple loops and nested loops controlled by counters. However, the paper (see Sketch of The Analytical Engine) and notes carefully and deliberately skirt any discussion of the details of how these were to be implemented, as the piece was written for a journal audience.
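As a purely illustrative sketch (in present-day Python, not anything Lovelace or Babbage actually wrote), a counter-controlled nested loop of the kind the notes describe might look like this:

# Illustrative only: modern Python, not Lovelace's notation.
# An outer loop counter selects which partial sum to produce;
# an inner loop counter accumulates the terms for that sum,
# mirroring the "nested loops controlled by counters" the notes discuss.
for n in range(1, 5):            # outer counter
    total = 0
    for k in range(1, n + 1):    # inner counter, nested within the outer
        total += k
    print(n, total)              # prints 1 1, 2 3, 3 6, 4 10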

Babbage wrote the following on the subject, in his book Passages from the Life of a Philosopher:
I then suggested that she add some notes to Menabrea's memoir, an idea which was immediately adopted. We discussed together
the various illustrations that might be introduced: I suggested several, but the selection was entirely her own. So also was the
algebraic working out of the different problems, except, indeed, that relating to the numbers of Bernoulli, which I had offered to do
to save Lady Lovelace the trouble. This she sent back to me for an amendment, having detected a grave mistake which I had made
in the process.

Babbage considered this paper a complete summary of the mathematical aspects of the machine, proving that the whole of the
development and operations of Analysis are now capable of being executed by machinery.

Ada also expanded upon Babbage's general views of the Analytical Engine as a symbol-manipulating device rather than a mere
processor of numbers. She brought to the project a fine sense of style that resulted in the frequently quoted analogy, "We may say
most aptly that the Analytical Engine weaves algebraic patterns just as the Jacquard-loom weaves flowers and leaves." She suggested
that it "might act upon other things besides number, were objects found whose mutual fundamental relations could be expressed by
those of the abstract science of operations... Supposing, for instance, that the fundamental relations of pitched sounds in the science
of harmony and of musical composition were susceptible of such expression and adaptations, the engine might compose elaborate
and scientific pieces of music of any degree of complexity or extent... Many persons who are not conversant with mathematical
studies, imagine that because the business of the engine is to give its results in numerical notation, the nature of its processes must
consequently be arithmetical and numerical, rather than algebraic and analytical. This is an error. The engine can arrange and
combine its numerical quantities exactly as if they were letters or any other general symbols; and in fact it might bring out its results
in algebraic notation, were provisions made accordingly".

So, what about Ada's title as the world's first programmer? Well, this is nonsense! Babbage was the first, if programmer is the right term at all. After Babbage came a mathematical assistant of his, then Babbage's eldest son Benjamin Herschel, then Menabrea and possibly Babbage's two younger sons. Ada was probably the fifth or sixth person to write programs. Moreover, all she did was rework some calculations Babbage had carried out years earlier. It is beyond question, however, that Ada was Babbage's fairy lady—interpreter, adviser, collaborator, and confidante, supporting his work financially, intellectually, and emotionally. As such, her achievement was certainly remarkable.
Developed in the 1830s and 1840s by Samuel Morse (1791-1872) and other inventors, the telegraph revolutionized long-distance communication. It worked by transmitting electrical signals over a wire laid between stations.

A motion picture, also called a film or movie, is a series of still photographs on film, projected in rapid succession onto a screen by means of light. Because of the optical phenomenon known as persistence of vision, this gives the illusion of actual, smooth, and continuous movement.

Melvil Dewey (1851–1931) was an American librarian and self-declared reformer. He developed the ideas for his library classification system in 1873 while working at the Amherst College library, and applied the classification to the books in that library until, in 1876, he had a first version of the classification.

Today, Muybridge is known for his pioneering work on animal locomotion in 1877 and 1878, which used multiple cameras to capture motion in stop-motion photographs, and his zoopraxiscope, a device for projecting motion pictures that pre-dated the flexible perforated film strip used in cinematography.

Marvin Camras

Born: January 1, 1916, Chicago
Died: June 23, 1995 (aged 79), Evanston, Illinois
Education: Illinois Institute of Technology
Occupation: Engineer
Spouse: Isabelle Pollak Camras
Children: Robert A. Camras, Carl B. Camras, Louis E. Camras, Michael D. Camras, and Ruth Camras Pikler
Discipline: Electrical engineer
Significant designs: Wire recorder, multi-track tape recording
Awards: National Medal of Technology, 1990

Marvin Camras (January 1, 1916 – June 23, 1995) was an electrical engineer and inventor who was widely influential in the field
of magnetic recording.
Camras built his first recording device, a wire recorder, in the 1930s for a cousin who was an aspiring singer. Shortly afterwards he
discovered that using magnetic tape made the process of splicing and storing recordings easier.
Camras's work attracted the notice of his professors at what is now the Illinois Institute of Technology (IIT), and he was offered a position at the Armour Research Foundation (which merged with the Lewis Institute in 1940 to become IIT) to develop his work.
Before and during World War II Camras' early wire recorders were used by the armed forces to train pilots. They were also used for
disinformation purposes: battle sounds were recorded and amplified and the recordings placed where the D-Day invasion was not
going to take place. This work was kept secret until after the war.
In June 1944 he was awarded U.S. Patent 2,351,004,[1] titled "Method and Means of Magnetic Recording". In all, Camras received
more than 500 patents, largely in the field of electronic communications.
Camras received a bachelor's degree in 1940 and a master's degree in 1942, both in electrical engineering, from IIT. In 1968, the
institution awarded him an honorary doctorate.
In May 1962 Camras wrote a predictive paper titled "Magnetic recording and reproduction - 2012 A.D.".[2] In his paper Camras predicted the existence of mass-produced portable media players he described as memory packs the size of a package of playing cards holding up to 10^20 bits of information. Such devices would not have any mechanically moving parts and would store both sound and movies. He also predicted music and movie downloads, online shopping, access to online encyclopedias and newspapers, and the widespread use of online banking transactions.
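To put that predicted capacity of 10^20 bits into perspective, a quick conversion (an illustrative calculation only, not part of the source) gives the figure in more familiar units:

# Rough scale of the predicted "memory pack" capacity (illustrative only).
bits = 10 ** 20
bytes_total = bits / 8            # 1.25e19 bytes
exabytes = bytes_total / 1e18     # decimal exabytes
print(f"{exabytes:.1f} EB")       # prints 12.5 EB, i.e. about 12.5 billion gigabytes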
In recognition of his achievements, he received the National Medal of Technology award in 1990.
Marvin Camras died of kidney failure at the age of 79 in Evanston, Illinois.

The Audion was an electronic detecting or amplifying vacuum tube invented by American electrical engineer Lee de Forest in 1906. It was the first triode, consisting of an evacuated glass tube containing three electrodes: a heated filament, a grid, and a plate.


Iconoscope – 1923


American inventor Vladimir Zworykin, the “father of television,” conceived two components key to that invention: the iconoscope and the kinescope.

The iconoscope was an early electronic camera tube used to scan an image for the transmission of television. No other practical
television scanning device prior to it was completely electronic, although some, such as the Nipkow disc, combined electronic
elements with mechanical ones. Within a glass housing, the iconoscope contained a photosensitive plate or “mosaic,” which divided
the image to be televised into tiny sections called pixels. An electron gun, also placed in the housing, projected a scanning beam of
electrons toward the plate. Deflecting coils directed the electron beam, which charged the plate’s pixels. The charge of individual
pixels was proportional to the brightness of light initially focused on them, so that the electrical signal produced derived from the
original image. From the output of the camera tube, the signal traveled to an amplifier before being transmitted to a receiver.

A Russian-born American, Vladimir Zworykin, invented the iconoscope in 1923. Now commonly referred to as the “father of
television,” Zworykin worked at the Westinghouse Electronic Company at the time he filed a patent for the iconoscope. According to
the patent, he planned for the device to be part of a completely electronic television system. It would take Zworykin six more years,
however, before he could actually construct an effective electronic receiver, which he dubbed the kinescope. The Radio
Corporation of America (RCA), the parent company of Westinghouse, funded Zworykin’s television research. In 1939, RCA finally
reaped the benefits from their investment when they used Zworykin’s system to broadcast TV to the public for the first time.

In the decades following the iconoscope’s invention, improved camera tubes appeared and gradually replaced Zworykin’s version.
Many of them, however, were based on the same basic principles as the iconoscope and featured somewhat similar designs. As TV
broadcasting was refined and the technology involved became more affordable, more and more people became familiar with
television.

Television eventually became fully integrated into the daily lives of Americans. Today in the United States, people watch more than
four hours of TV each day on average, and a typical American household contains at least two television sets. Despite his role in its
development, Zworykin, who lived into the early 1980s, became concerned with the direction television had taken and its effect on
society. He had hoped TV would serve to educate the public and to broadcast cultural events. Dismayed at the trivial and
counterproductive materials often featured on television, Zworykin lamented in his later years, "I hate what they've done to my child
... I would never let my own children watch it."

In New York, Warner Brothers debuted Don Juan, the first Vitaphone sound film (developed by Bell Telephone Laboratories in 1926) and the first publicly shown 'talkie' with synchronized sound effects and orchestral music (but no dialogue), starring John Barrymore.

RCA began regular U.S. television broadcasting on April 30, 1939, with a telecast of President Franklin D. Roosevelt opening the New York World's Fair.

Early beginnings
Information science, in studying the collection, classification, manipulation, storage, retrieval and dissemination of information, has origins in the common stock of human knowledge. Institutionally, information science emerged in the 19th century along with many other social science disciplines.

Douglas Engelbart
Shortly after "As We May Think" was originally published, Douglas Engelbart read it, and with Bush's visions in mind, commenced
work that would later lead to the invention of the mouse. Ted Nelson, who coined the terms "hypertext" and "hypermedia", was
also greatly influenced by Bush's essay.

In 1946, Mauchly and Eckert developed the Electronic Numerical Integrator and Computer (ENIAC). Mauchly had previously created several calculating machines, and in 1942 began designing a better calculating machine based on the work of John Atanasoff, an inventor who used vacuum tubes to speed up calculations.

Shannon's most important paper, 'A Mathematical Theory of Communication,' was published in 1948. Shannon went on to develop many other important ideas whose impact expanded well beyond the field of “information theory” spawned by his 1948 paper. Shannon approached research with a sense of curiosity, humor, and fun.
Twelve years ago, Robert McEliece, a mathematician and engineer at Caltech, won the Claude E. Shannon Award, the highest honor in the field of information theory. During his acceptance lecture, at an international symposium in Chicago, he discussed the prize's namesake, who died in 2001. Someday, McEliece imagined, many millennia in the future, the hundred-and-sixty-sixth edition of the Encyclopedia Galactica—a fictional compendium first conceived by Isaac Asimov—would contain the following biographical note:

Jean Hoerni was a silicon transistor pioneer who invented the planar process that his colleague Robert Noyce would use to create the modern integrated circuit. ... Only a year later, in 1957, Hoerni and seven others left Shockley to found Fairchild Semiconductor.

JACK KILBY AND ROBERT NOYCE - CREATED THE FIRST INTEGRATED CIRCUIT

MARC. The Library of Congress developed MARC in the 1960s. Their intent was to create a computer-readable format that could be used for bibliographic records, enabling libraries to download cataloging, share information, and search all parts of a cataloging record.

Unix is a multiuser, multitasking operating system that was developed by Bell Laboratories in 1969. In a multiuser system, many users can use the system simultaneously. Each user interacts with their own shell instance in this type of operating system and can start applications as required.

Intel's first microprocessor, the 4004, was conceived by Ted Hoff and Stanley Mazor. Assisted by Masatoshi Shima, Federico
Faggin used his experience in silicon-gate MOS technology (1968 Milestone) to squeeze the 2300 transistors of the 4-bit MPU into a
16-pin package in 1971.
Optical video recording technology, using a transparent disc, was invented by David Paul Gregg and James Russell in 1958 (and
patented in 1970 and 1990). The Gregg patents were purchased by MCA in 1968. By 1969, Philips had developed a videodisc in
reflective mode, which has advantages over the transparent mode.

Laserdisc (1971)
MCA and Philips decided to join their efforts on the reflective-mode videodisc. They first publicly demonstrated the videodisc in 1972. Philips produced the players and MCA the discs.

It was the first microcomputer to sell in large numbers. In January 1975, a photograph of the Altair appeared on the cover of the magazine Popular Electronics. The caption read "World's First Minicomputer Kit to Rival Commercial Models."

Introduced in August 1977, the TRS-80 was the first complete, pre-assembled small computer system on the market. When the TRS-80 — a personal computer from Tandy that would be sold via their RadioShack stores, hence TRS — went on sale on Aug. 3, 1977, computers weren't exactly new.

The first Macintosh was introduced on January 24, 1984, by Steve Jobs, and it was the first commercially successful personal computer to feature the mouse and the graphical user interface, both already known but still unpopular at the time, rather than the command-line interface of its predecessors.

The history of Artificial Intelligence (AI) began in antiquity, with myths, stories and rumors of artificial beings endowed with
intelligence or consciousness by master craftsmen. The seeds of modern AI were planted by classical philosophers who attempted
to describe the process of human thinking as the mechanical manipulation of symbols. This work culminated in the
invention of the programmable digital computer in the 1940s, a machine based on the abstract essence of mathematical reasoning.
This device and the ideas behind it inspired a handful of scientists to begin seriously discussing the possibility of building an
electronic brain.
The field of AI research was founded at a workshop held on the campus of Dartmouth College during the summer of 1956.[1] Those
who attended would become the leaders of AI research for decades. Many of them predicted that a machine as intelligent as a
human being would exist in no more than a generation and they were given millions of dollars to make this vision come true.
Eventually, it became obvious that they had grossly underestimated the difficulty of the project. In 1973, in response to the criticism
from James Lighthill and ongoing pressure from Congress, the U.S. and British Governments stopped funding undirected research
into artificial intelligence, and the difficult years that followed would later be known as an "AI winter". Seven years later, a visionary
initiative by the Japanese Government inspired governments and industry to provide AI with billions of dollars, but by the late 80s
the investors became disillusioned by the absence of the needed computer power (hardware) and withdrew funding again.
Investment and interest in AI boomed in the first decades of the 21st century, when machine learning was successfully applied to
many problems in academia and industry due to the presence of powerful computer hardware.

Since the late 1960s and the groundbreaking movie “2001: A Space Odyssey”, the idea of “learning machines” has slowly crept into our consciousness, and almost always in an ominous context: machines are getting too smart for our own good. Fictional smart machines, from HAL in the 1960s to The Gunslinger in the 1970s, The Terminator in the 1980s, and into the new century with the Matrix, all play on the same theme: one day we humans will invent artificial intelligence which will replace us as the dominant life on the planet. These stories always include one very crucial element, machine learning, or more to the point, machine self-learning. The idea that machines can self-learn has been fascinating and terrifying audiences for more than 50 years, and for almost the same amount of time has been the Holy Grail of business intelligence and data science (and every other science). Since I took my first course in AI, and thinking machines became my own passion, I've noticed that the dreaded “Machines are getting too smart!” article appears in the popular press like clockwork every time AI makes the news. Consider the following opinion piece on Fox News, complete with the obligatory picture of the grinning Terminator.

HyperCard was developed by Bill Atkinson and gifted to Apple on the basis that Apple would release it for free use on all
Macintoshes. It was initially released in August 1987. It immediately became a huge success and was used in many ways by many
people, many of whom began programming for the first time.

A CD-ROM is a pre-pressed optical compact disc that contains data which computers can read but not write or erase. The technical specifications for all CD formats are contained in a set of colour-bound books; an extension bridging CD-ROM and CD-i (the Green Book format) was published by Sony and Philips in 1991. The first 12× drive was released in late 1996.

As part of a contest to prove the frailty of low-level encryption, a European team based in Switzerland cracked a 48-bit encryption code in less than two weeks. The codebreaking was part of a contest sponsored by RSA Data Security, which holds a dominant share of the encryption toolkit market. Part mathematical challenge and part marketing campaign, the competition urges contestants to crack various strengths of RSA's algorithms, which scramble plain text into unreadable ciphers.

The point of the contest is to show that current government regulations restrict users to levels of encryption that are insecure. The lowest level of code, 40 bits, was cracked in less than 4 hours by a graduate student at the University of California at Berkeley.

The next level, 48 bits, took 13 days to crack by a team of researchers using 3,500 computers spread across Europe, according to Scott Schnell, RSA vice president of marketing. For each bit, the number of possible key combinations increases by a factor of two. Therefore, the 48-bit code was roughly 256 times more difficult to break than the 40-bit code.
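A quick check of that arithmetic (an illustrative sketch only, not part of the original article) can be computed directly from the key lengths:

# Each extra key bit doubles the number of possible keys,
# so an n-bit key space contains 2**n candidate keys.
def keyspace(bits):
    return 2 ** bits

ratio = keyspace(48) // keyspace(40)
print(ratio)  # 256: the 48-bit key space is 2**8 times larger than the 40-bit one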

Germano Caronni of the Swiss Federal Institute of Technology in Zurich wrote the codebreaking software and posted it on the Internet for anyone who wanted to join his efforts.

Strong encryption is considered crucial to protect electronic privacy in the digital age. Current U.S. government regulations allow vendors to export up to 56-bit encryption if the vendor agrees to build in a key recovery system that would help law enforcement officials decrypt messages implicated in criminal cases. Under current regulations, domestic use of encryption is unregulated.
