
Lecture - 2

Cognitive Computing, Cognition, AI, NLP, ML, Big Data
Cognitive Computing - Introduction
• Cognitive Computing is a subset of artificial intelligence (AI) that aims to create systems capable of mimicking human-like cognitive functions.
• Artificial Intelligence borrows characteristics from human cognition:
  1. Thinking
  2. Perception
  3. Action
• It provides the capability to find solutions to complex problems.
Cognition
Definition

• Cognition refers to the mental processes of acquiring knowledge and understanding through thought, experience, and the senses.
• It involves activities like perception, learning, memory, reasoning, and problem-solving.
• It is a biological and psychological process that happens in the human brain.

Examples of Processes
• Understanding spoken or written language
• Solving problems and making decisions
• Recognizing patterns and interpreting emotions
Cognition
Goal

• To describe and understand how humans think and process information

Key Features

• Involves subjective, emotional, and intuitive elements
• Relates to consciousness and awareness
Cognition vs Cognitive Computing

• Definition
  Cognition: Refers to the mental processes of acquiring knowledge and understanding through thought, experience, and the senses.
  Cognitive Computing: Refers to technology systems that simulate human thought processes. These systems use AI, Machine Learning (ML), Natural Language Processing (NLP), and other tools to mimic human cognition in analyzing data, reasoning, and decision-making.

• Focus
  Cognition: How humans think and understand the world.
  Cognitive Computing: Simulating human-like thinking in machines.

• Goal
  Cognition: To study and understand human thought.
  Cognitive Computing: To enhance decision-making and problem-solving through machines.

• Examples
  Cognition: Human reasoning, perception, memory.
  Cognitive Computing: AI-driven systems like IBM Watson answering questions in Jeopardy or assisting in medical diagnosis; chatbots.
Cognitive Computing - Goal
Goal

• Develop systems that can understand, interpret, and respond to complex information the way humans do.
• Focuses on replicating human-like cognitive processes and emphasizes applications where human-machine collaboration is essential.
• AI, on the other hand, encompasses a broader set of technologies and techniques that aim to create intelligent systems capable of performing diverse tasks across various domains.
Human Intelligence vs AI

[Figure: side-by-side comparison of human intelligence and AI]
AI vs Cognitive Computing

• Definition
  AI: Focuses on building systems that replace human intelligence for automating tasks and decision-making.
  Cognitive Computing: Focuses on building systems that augment/assist human intelligence, helping humans make better decisions.

• Purpose
  AI: Automate tasks and make decisions independently, often replacing human involvement.
  Cognitive Computing: Augment human decision-making by providing tools that mimic human thought processes.

• Approach
  AI: Works autonomously to solve problems.
  Cognitive Computing: Collaborates with humans to assist in problem-solving.

• Technology
  AI: Machine Learning, Natural Language Processing, Neural Networks, Deep Learning.
  Cognitive Computing: ML, Natural Language Processing, Big Data Analytics, Neural Networks, Deep Learning, Sentiment Analysis.

• Interaction with humans
  AI: Minimal or none.
  Cognitive Computing: High.
AI vs Cognitive Computing

• Applications
  AI: Autonomous systems such as robotics, self-driving cars (Tesla), and predictive analytics; AI-powered chatbots like Siri and Alexa.
  Cognitive Computing: Decision-support systems in industries like healthcare, finance, and education, e.g., IBM Watson assisting doctors with diagnoses, financial decision support systems, and customer service improvement through NLP.

KEY DIFFERENCE: REPLACEMENT vs AUGMENTATION

SIMILARITY: The ability of machines to act, adapt, and reason based on the experience they have learned from.
Foundation of Cognitive Computing
• Natural Language Processing (NLP)
• Machine Learning (ML)
• Big Data Analytics

Natural Language Processing (NLP)
Definition

• A field of AI that enables computers to understand and respond to human language.

Key Techniques

• Tokenization
  Breaking down text into smaller, manageable units, called tokens.
  Word Tokenization: splitting text into individual words.
  Sentence Tokenization: splitting text into individual sentences.

• Parsing
  Analyzing the grammatical structure of a sentence to understand its meaning.
  For the sentence "The cat sits on the mat.", parsing identifies:
  Subject: "The cat"
  Verb: "sits"
  Prepositional phrase: "on the mat"

• Sentiment analysis
  Also known as opinion mining.
  Used to determine the sentiment or emotion expressed in a piece of text.
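To make these techniques concrete, here is a minimal Python sketch using the NLTK library (a library choice assumed for illustration; the lecture does not prescribe one). It tokenizes the slide's example sentence, parses it with a toy grammar, and scores sentiment:

# Minimal NLP sketch with NLTK (library choice is an assumption).
# Run once beforehand: pip install nltk
import nltk

nltk.download("punkt")          # tokenizer models
nltk.download("punkt_tab")      # needed by newer NLTK versions
nltk.download("vader_lexicon")  # lexicon for the VADER sentiment analyzer

from nltk.tokenize import word_tokenize, sent_tokenize
from nltk.sentiment import SentimentIntensityAnalyzer

text = "The cat sits on the mat. I really love this cozy scene!"

# Tokenization: break text into sentences and words
print(sent_tokenize(text))  # ['The cat sits on the mat.', 'I really love this cozy scene!']
print(word_tokenize(text))  # ['The', 'cat', 'sits', 'on', 'the', 'mat', '.', ...]

# Parsing: a toy context-free grammar for "The cat sits on the mat."
grammar = nltk.CFG.fromstring("""
  S -> NP VP
  NP -> Det N
  VP -> V PP
  PP -> P NP
  Det -> 'The' | 'the'
  N -> 'cat' | 'mat'
  V -> 'sits'
  P -> 'on'
""")
for tree in nltk.ChartParser(grammar).parse(["The", "cat", "sits", "on", "the", "mat"]):
    tree.pretty_print()  # shows the subject NP, the verb, and the prepositional phrase

# Sentiment analysis (opinion mining): score the emotion in the text
print(SentimentIntensityAnalyzer().polarity_scores(text))
# e.g. {'neg': 0.0, 'neu': ..., 'pos': ..., 'compound': ...}; positive compound = positive tone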
Natural Language Processing (NLP)
Applications

• Chatbots, language translation, smart assistants: voice assistants like Siri, Alexa.

Uses in Cognitive Computing

• It allows cognitive systems to interpret human language in all its complexity, enabling machines to understand spoken or written text much like a human listener or reader would.
Machine Learning (ML)
Definition

• A subset of AI focused on building systems that learn from data.

Types

• Supervised,
• Unsupervised, and
• Reinforcement Learning
Machine Learning (ML)
Key Algorithms

• Decision trees, SVM, k-nearest neighbors, etc.

Applications:
• Image recognition, Email classification

Uses in Cognitive Computing:

• Equips systems with the ability to learn from data and experiences, much like humans learn from observation and interaction with the world.
• The more data these systems are exposed to, the more accurately they can make predictions or decisions.
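As a brief illustration of supervised learning with one of these algorithms, here is a minimal decision-tree sketch for the email-classification application; scikit-learn, the features, and the toy labels are all assumptions for illustration:

# Supervised learning sketch: a decision tree classifying emails as
# spam or ham from two toy features (scikit-learn is an assumed choice).
# Run once beforehand: pip install scikit-learn
from sklearn.tree import DecisionTreeClassifier

# Toy training data: [number of links, count of the word "free"] per email
X_train = [[0, 0], [1, 0], [8, 3], [6, 5], [0, 1], [7, 4]]
y_train = ["ham", "ham", "spam", "spam", "ham", "spam"]

model = DecisionTreeClassifier(max_depth=2)
model.fit(X_train, y_train)  # learn decision rules from the labeled examples

# The more (and better) data the system is exposed to, the better it predicts
print(model.predict([[5, 4], [0, 0]]))  # e.g. ['spam' 'ham']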
Big Data Analytics
Definition

• A field that involves examining large and complex data sets to uncover hidden patterns, correlations, and insights, enabling data-driven decision-making.

Key Techniques
• Data mining, data visualization, predictive analytics, clustering, and real-time processing.
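To illustrate one of these techniques, here is a minimal clustering sketch with k-means; scikit-learn and the toy customer numbers are assumptions, not part of the lecture:

# Clustering sketch: k-means grouping customers by annual spend and
# visit frequency (toy data; scikit-learn is an assumed library choice).
from sklearn.cluster import KMeans

# Toy data: [annual spend in $1000s, store visits per month]
customers = [[2, 1], [3, 2], [2.5, 1.5], [40, 12], [42, 10], [38, 11]]

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(customers)
print(kmeans.labels_)           # e.g. [0 0 0 1 1 1]: two customer segments found
print(kmeans.cluster_centers_)  # the "typical" customer of each segment

The algorithm discovers the two spending segments without any labels, which is what makes clustering useful for finding hidden patterns in large data sets.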
Big Data Analytics
Applications

• Fraud detection, recommendation systems, customer segmentation, healthcare diagnostics, financial risk analysis.

Uses in Cognitive Computing

• It enables cognitive systems to process and analyze massive volumes of structured, unstructured, and semi-structured data, extracting meaningful insights to support reasoning, learning, and decision-making processes.
Big Data Analytics

Structured data
• Data that can be processed, stored, and retrieved in a fixed format.
• It refers to highly organized information that can be readily and seamlessly stored in and accessed from a database by simple search engine algorithms.
• Example: The employee table in a company database is structured, as the employee details, their job positions, their salaries, etc., are all present in an organized manner.
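As a small illustration, here is how such a fixed-format employee table might look in a relational database, sketched with Python's built-in sqlite3 module; the column names and values are illustrative assumptions:

# Structured data sketch: a fixed-format employee table in a relational
# database (Python standard library only; table contents are illustrative).
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE employee (id INTEGER, name TEXT, position TEXT, salary REAL)")
con.executemany(
    "INSERT INTO employee VALUES (?, ?, ?, ?)",
    [(1, "Asha", "Engineer", 85000.0), (2, "Ravi", "Analyst", 65000.0)],
)

# Because the format is fixed, a simple query retrieves exactly what we need
for row in con.execute("SELECT name, salary FROM employee WHERE salary > 70000"):
    print(row)  # ('Asha', 85000.0)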
Big Data Analytics

Unstructured data
• Refers to data that lacks any specific form or structure whatsoever.
• This makes it very difficult and time-consuming to analyze unstructured data.
• Stored in data lakes, NoSQL (Not Only Structured Query Language) databases, or cloud-based systems.
• Free-form or undefined format (e.g., text, images, videos, audio).
Big Data Analytics

Semi-structured data
• Pertains to data containing both of the formats mentioned above, that is, structured and unstructured data.
• To be precise, it refers to data that, although not classified under a particular repository (database), contains vital information or tags that segregate individual elements within the data.
• Example: Email. The written content of an email is unstructured, whereas there is some inherent structure to the information in each email, such as the sender's name, recipient address, recipient name, and date sent.


More examples of semi-structured data:
• Bank Transactions: fields such as Transaction ID, Account Number, Date, Amount, Merchant Name.
• Social Media Posts: Facebook posts, tweets, Instagram captions, and hashtags.
• Emails with Metadata: the body is unstructured, but metadata like sender, receiver, subject, and timestamp is structured.
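A short sketch of the email example follows, using Python's standard email module to separate the structured headers from the unstructured body; the sample message is an illustrative assumption:

# Semi-structured data sketch: an email's headers are structured (tagged
# fields), while its body is free-form text. Python standard library only.
from email import message_from_string

raw = """From: alice@example.com
To: bob@example.com
Subject: Lunch?
Date: Mon, 06 Jan 2025 10:15:00 +0000

Hey Bob, are you free for lunch on Friday? There's a new place downtown.
"""

msg = message_from_string(raw)
print(msg["From"], "->", msg["To"])  # structured metadata: sender, recipient
print(msg["Subject"], "|", msg["Date"])
print(msg.get_payload().strip())     # the unstructured free-text body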
Big Data
Definition

• Refers to the dynamic, large, and disparate volumes of data being created by people, tools, and machines.
• "It requires new, innovative, and scalable technology to collect, host, and analytically process the vast amount of data gathered in order to derive real-time business insights that relate to consumers, risk, profit, performance, productivity management, and enhanced shareholder value." - Ernst & Young
Characteristics of Big Data – 5Vs

All definitions of Big Data share five common characteristics:
• Velocity is the speed at which data is being created in real time.
• Data is being generated extremely fast, in a process that never stops.
• Near or real-time streaming, local, and cloud-based technologies can process information very quickly.
• Volume is the scale of the data, or the increase in the amount of data stored.
• Drivers of volume are the increase in data sources, higher resolution sensors, and scalable infrastructure.
• A large amount of data is stored in data warehouses.
• Variety is the diversity of the data.
• Structured data fits neatly into rows and columns in relational databases, while unstructured data is not organized in a pre-defined way, like Tweets, blog posts, pictures, numbers, and video.
• Variety also reflects that data comes from different sources (machines, people, and processes), both internal and external to organizations.
• Drivers are mobile technologies, social media, wearable technologies, geo technologies, video, and many, many more.
• Veracity is the quality and origin of data, and its conformity to facts and accuracy.
• Attributes include consistency, completeness, integrity, and ambiguity.
• Drivers include cost and the need for traceability.
• With the large amount of data available, the debate rages on about the accuracy of data in the digital age.
• Is the information real, or is it false?
• Value is our ability and need to turn data into value.
• Value isn't just profit.
• It may have medical or social benefits, as well as customer, employee, or personal satisfaction.
• The main reason that people invest time to understand Big Data is to derive value from it.
Fundamental Principles of Cognitive Computing

• Learn: gathering and understanding data
• Model: organizing data into a useful structure
• Generate Hypotheses: proposing and testing possible solutions
Fundamental Principles of Cognitive Computing - Learn
• A cognitive system learns from data.
• The system leverages data to make inferences about a domain, a topic, a person, or an issue based on training and observations from all varieties, volumes, and velocities of data.
Fundamental Principles of Cognitive Computing - Model
• To learn, the system needs to create a model or representation of a domain (which includes internal and potentially external data) and assumptions that dictate what learning algorithms are used.
Fundamental Principles of Cognitive Computing - Generate Hypotheses
• A hypothesis is defined as a supposition or proposed explanation based on insufficient evidence or assumptions.
• It is just a guess based on some known facts but has not yet been proven.
• A good hypothesis is testable, resulting in either true or false.
• Example:
  • A scientist claims that ultraviolet (UV) light can damage the eyes, so it may also cause blindness.
  • In this example, the scientist claims only that UV rays are harmful to the eyes; we assume they may also cause blindness.
  • However, this may or may not be true. Hence, these types of assumptions are called hypotheses.
Fundamental Principles of Cognitive Computing - Generate Hypotheses
• A cognitive system assumes that there is not a single correct answer.
• The most appropriate answer is based on the data itself.
• Therefore, a cognitive system is probabilistic.
• A hypothesis is a candidate explanation for some of the data already understood.
• A cognitive system uses the data to train, test, or score a hypothesis.
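As a toy illustration of this probabilistic view, the sketch below scores several candidate hypotheses against observed evidence with Bayes' rule and reports the most likely answer rather than a single definitive one; all priors and likelihoods are made-up illustrative numbers:

# Probabilistic hypothesis scoring sketch (illustrative numbers only):
# each candidate hypothesis gets a prior and a likelihood of the observed
# evidence, and Bayes' rule turns these into posterior confidence scores.

# P(hypothesis) and P(evidence | hypothesis) for one observed symptom set
hypotheses = {
    "flu":     {"prior": 0.30, "likelihood": 0.80},
    "cold":    {"prior": 0.50, "likelihood": 0.40},
    "allergy": {"prior": 0.20, "likelihood": 0.10},
}

# Posterior is proportional to prior * likelihood; normalize to sum to 1
unnormalized = {h: v["prior"] * v["likelihood"] for h, v in hypotheses.items()}
total = sum(unnormalized.values())
posteriors = {h: p / total for h, p in unnormalized.items()}

for h, p in sorted(posteriors.items(), key=lambda kv: -kv[1]):
    print(f"{h}: {p:.2f}")  # flu: 0.52, cold: 0.43, allergy: 0.04 (approx.)

No hypothesis is declared "the" answer; each keeps a score, and new evidence would simply re-rank the candidates, which is the sense in which a cognitive system is probabilistic.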
