IT INTERSCHOOL

Polymorphism, as related to genomics, refers to the presence of two or more variant forms of a specific DNA sequence that can occur among different individuals or populations. The most common type of polymorphism involves variation at a single nucleotide (also called a single-nucleotide polymorphism, or SNP).

1. Ω. Big Omega is used to represent the best-case scenario of an algorithm: a lower bound on its running time, reached when the algorithm is given the most favourable input possible.
2. O. Big O is used to represent the worst-case scenario of an algorithm: an upper bound on its running time, reached when the algorithm is given the least favourable input possible.
3. Θ. Big Theta is used only when the time complexity of an algorithm grows at the same rate in both the worst-case and best-case scenarios, giving a tight bound. (A short linear-search sketch follows this list.)
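
As a minimal illustration (a sketch of standard linear search in Python, not code from the original notes), the same function can have different best-case and worst-case bounds:

    def linear_search(items, target):
        """Return the index of target in items, or -1 if it is absent."""
        for i, value in enumerate(items):
            if value == target:   # best case: target is items[0] -> 1 comparison, Omega(1)
                return i
        return -1                 # worst case: target absent -> n comparisons, O(n)

    data = [7, 3, 9, 4, 1]
    print(linear_search(data, 7))   # found immediately: the Omega(1) best case
    print(linear_search(data, 42))  # scans all n elements: the O(n) worst case

Because the best and worst cases differ (constant versus linear), linear search has no single Θ bound covering all inputs, which is exactly the situation point 3 above describes.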
Development of Computers

 Early 20th Century
o Alan Turing: Turing Machine, the theoretical foundation of computation and artificial intelligence.
o Konrad Zuse: Developed the Z3 (1941), the first programmable computer.
 World War II Era
o Colossus: First programmable electronic digital computer, used for codebreaking.
o ENIAC: First general-purpose electronic digital computer.

Evolution of Computer Hardware

 Transistors (1947): Replaced vacuum tubes, leading to smaller and more reliable
computers.
 Integrated Circuits (1958): Allowed for the development of microprocessors,
revolutionizing computer design.
 Microprocessors (1971): Intel 4004, the first commercially available microprocessor.

Development of Programming Languages

 Assembly Language: Low-level programming close to machine code.
 FORTRAN (1957): First high-level programming language, used for scientific computing.
 COBOL (1959): Developed for business applications.
 LISP (1958): One of the earliest programming languages for artificial intelligence.
 C (1972): Influential language created alongside Unix (which was rewritten in C) and the ancestor of many modern languages.

Operating Systems

 Early Systems: Batch processing systems.
 Unix (1969): Developed at AT&T Bell Labs; influential in the development of many modern operating systems.
 MS-DOS (1981): Early operating system for IBM PCs.
 Windows (1985): Microsoft's operating system, which became widely popular.

Networking and the Internet

 ARPANET (1969): Precursor to the internet, developed by the U.S. Department of Defense.
 Development of Protocols: TCP/IP (1970s), the foundational protocols for the internet.
 World Wide Web (1989): Developed by Tim Berners-Lee; revolutionized information sharing and access.

Advances in Personal Computing

 Apple II (1977): One of the first successful personal computers.
 IBM PC (1981): Standardized personal computing and led to widespread adoption.
 Graphical User Interface (GUI): Popularized by the Apple Macintosh (1984) and Microsoft Windows.

Mobile Computing

 Early Devices: Personal Digital Assistants (PDAs).
 Smartphones: Evolved from early mobile phones, with the launch of the iPhone (2007) and Android (2008) marking the modern smartphone era.

Artificial Intelligence and Machine Learning

 Early AI: The Turing Test and early AI programs such as ELIZA.
 Machine Learning: Development of algorithms and models, neural networks, and deep learning.

Modern Developments

 Cloud Computing: On-demand availability of computing resources and data storage.
 Big Data: Processing and analysis of large datasets.
 Quantum Computing: Developing computers based on the principles of quantum mechanics.
 Blockchain: Technology behind cryptocurrencies like Bitcoin.
An algorithm is a step-by-step procedure or formula for solving a problem or performing a task. In
computer science and mathematics, algorithms are essential for creating efficient and effective
solutions to various problems.

Characteristics of Algorithms

 Finite: Must terminate after a finite number of steps.
 Definite: Each step must be precisely defined and unambiguous.
 Input: Accepts zero or more inputs.
 Output: Produces at least one output.
 Effective: Each step must be simple enough to be carried out, in principle, by a person using only pencil and paper. (Euclid's algorithm, sketched after this list, shows all five properties.)
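
As an illustrative sketch (Euclid's greatest-common-divisor algorithm, a classic textbook example rather than one drawn from these notes), each of the five characteristics can be pointed at directly in code:

    def gcd(a, b):
        """Euclid's algorithm for the greatest common divisor of two non-negative integers."""
        # Input: two integers, a and b.
        while b != 0:          # Definite: every step is precisely specified.
            a, b = b, a % b    # Effective: a single remainder operation, doable by hand.
        return a               # Output: exactly one result is produced.
        # Finite: b strictly decreases toward 0, so the loop always terminates.

    print(gcd(48, 18))  # 6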

Types of Algorithms

 Sorting Algorithms
o Bubble Sort: Simple comparison-based sorting.
o Merge Sort: Divide and conquer algorithm for efficient sorting.
o Quick Sort: Another efficient divide and conquer sorting algorithm.
 Searching Algorithms
o Linear Search: Sequentially checks each element until the target is found.
o Binary Search: Efficient search on sorted arrays by repeatedly dividing the search interval in half (see the sketch after this list).
 Graph Algorithms
o Breadth-First Search (BFS): Explores neighbors level by level (also sketched below).
o Depth-First Search (DFS): Explores as far as possible along each branch before backtracking.
o Dijkstra's Algorithm: Finds the shortest paths between nodes in a graph with non-negative edge weights.
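
Below is a minimal sketch of two of the algorithms named above, binary search and breadth-first search (the function names and the sample data are this sketch's own choices, not taken from the original notes):

    from collections import deque

    def binary_search(sorted_items, target):
        """Return the index of target in a sorted list, or -1 if absent. O(log n)."""
        lo, hi = 0, len(sorted_items) - 1
        while lo <= hi:
            mid = (lo + hi) // 2          # halve the search interval each step
            if sorted_items[mid] == target:
                return mid
            elif sorted_items[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return -1

    def bfs(graph, start):
        """Return vertices reachable from start, visited level by level."""
        visited, queue = {start}, deque([start])
        order = []
        while queue:
            node = queue.popleft()        # FIFO queue gives level-by-level order
            order.append(node)
            for neighbor in graph[node]:
                if neighbor not in visited:
                    visited.add(neighbor)
                    queue.append(neighbor)
        return order

    print(binary_search([1, 3, 4, 7, 9], 7))                          # 3
    print(bfs({'A': ['B', 'C'], 'B': ['D'], 'C': [], 'D': []}, 'A'))  # ['A', 'B', 'C', 'D']

Binary search halves the interval on each iteration, which is where the O(log n) bound comes from; in BFS, the FIFO queue is exactly what produces the level-by-level order, whereas DFS would use a stack instead.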

The first modern robot was created by George Devol in the early 1950s. Devol, an American inventor, developed a reprogrammable manipulator called "Unimate" (Universal Automation).

The first Unimate robot was installed in a General Motors (GM) factory in 1961 for tasks like die casting and welding.
