Unit 3

UNIVERSITY INSTITUTE OF ENGINEERING

DEPARTMENT OF AIT - CSE


Bachelor of Engineering (CSE)
Natural Language Processing (21CSH-399)
By: Prabhjot Kaur (E16646)

SEMANTICS (REPRESENTING MEANING)
DISCOVER . LEARN . EMPOWER

1
Content
• Computational desiderata for representations
• Meaning structure of language
• First-order predicate calculus
• Linguistically relevant concepts
• Related representational approaches
• Alternative approaches to meaning

2
The Levels of Language Analysis
Language processing involves analyzing text at different levels:

• Morphology: Deals with the structure of words (e.g., prefixes, suffixes)
• Syntax: Focuses on the grammatical structure of sentences
(e.g., word order)
• Semantics: Uncovers the meaning of words and sentences
Example:
o Just like peeling an onion, language analysis involves layers of complexity. Before computers
can grasp the meaning of a sentence, they need to understand the individual words
(morphology) and how they are arranged (syntax). Semantic analysis builds upon these
foundations to unlock the true essence of what is being communicated.

3
Semantic Analysis
• Semantic analysis delves into the meaning of text, uncovering
the relationships between words and concepts: giving the exact
meaning of the text.
• The role of the semantic analyzer is to check the text for
meaningfulness.
• It aims to understand:
• The meaning of individual words (lexical semantics)
• How words combine to create meaning (sentence semantics)
• The context in which language is used
• Example: Semantic analysis goes beyond the literal meaning of words. It
considers the context, relationships between words, and the broader world
knowledge to extract the intended meaning. Imagine a brain trying to decipher a
message - semantic analysis equips computers with similar capabilities to
understand the true intent behind human language.

4
Semantic Processing

• Representations that
– Permit us to reason about their truth (i.e., their relationship to
some world)
– Permit us to answer questions based on their content
– Permit us to perform inference (answer questions and determine
the truth of things we don’t already know to be true)

5
Semantic Analysis
• Meaning of individual words (lexical semantics)
• Combination of words into sentences, which depends on:
– The relationships that exist between the words
– Word order and context
– Semantic structure
– Real-world knowledge

6
Compositional Semantics
• It involves how words combine to form a larger meaning.
• The meaning of each word matters, but the syntax and the way
the sentence is constructed play an important role as well.
Example: “I like you” and “You like me” contain the same words,
but the meanings are different:
I – subject | You – subject
You – object | me – object
Lexical semantics + compositional semantics = semantic analysis
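In predicate-argument terms (introduced later in this unit), the two readings come out roughly as Like(I, You) versus Like(You, Me): the same predicate, but with the argument roles swapped.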

7
Semantic Analysis
• Compositional Analysis
– Create a logical representation that accounts for all the entities,
roles and relations present in a sentence.

8
Semantic Processing
• Several ways to attack this problem
– Limited, shallow, practical approaches that have some hope of
actually being useful
• Information extraction
– Principled, theoretically motivated approach…
• Computational/Compositional Semantics
– Chapters 17 and 18
– Something midway that can plausibly serve both purposes
• Semantic role labeling

9
Application of Semantic Analysis
• Semantic analysis empowers various NLP applications:
– Machine translation: To translate text accurately, capturing the true meaning,
not just word-for-word conversion.
– Text summarization: To generate summaries that capture the essence of the
text.
– Sentiment analysis: To understand the emotional tone of text (positive,
negative, neutral).
– Question answering: To provide accurate answers to user queries by
understanding the context and intent.
– Chatbots: To create chatbots that can engage in meaningful conversations by
understanding the user's intent.
• Example: Semantic analysis is the driving force behind many of the
NLP applications we encounter daily. From chatbots that seem to
understand our requests to machine translation that captures the
nuances of language, semantic analysis plays a critical role in
making these applications more helpful and accurate.

10
Challenges in Semantic Analysis
• Semantic analysis faces several challenges:
– Word ambiguity: Words can have multiple meanings depending
on the context (e.g., “bat” as an animal vs. a piece of baseball equipment)
– Idioms and sarcasm: Figurative language and expressions can
be difficult for computers to interpret literally.
– Context dependence: Meaning can shift significantly based on
the surrounding text and situation.
• Example: Just like humans can sometimes
misunderstand each other, semantic analysis also faces
hurdles.

11
Meaning Representations

• Semantic analysis creates a representation of the meaning of
the sentence, taking the same basic approach to meaning that we
took to syntax and morphology:
• To create representations of linguistic inputs that capture the
meanings of those inputs.
• But unlike parse trees, these representations aren’t primarily
descriptions of the structure of the inputs…
• In natural language processing (NLP), meaning representation
refers to the process of capturing the semantics or meaning of
natural language expressions in a structured and
computationally tractable form. The goal is to represent the
meaning of text in a way that can be understood and
processed by computers.
12
Basic Building Blocks
• Before getting to the concepts and approaches for meaning
representation, we should understand the building blocks of
semantic analysis.
• The following building blocks play an important role in
meaning representation:
 Entities: represent individuals, such as a particular person or place.
 Concepts: represent general categories of individuals, such as person or city.
 Relations: represent relationships between entities and concepts, e.g.,
Ram is a person.
 Predicates: represent the linguistic expressions that denote properties,
relations, or actions within a sentence. They typically consist of a verb or a
verb phrase that describes an action or a state. For example, in
“The cat is sleeping”, the predicate is “is sleeping”.
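A minimal sketch of how these building blocks might be stored in code; the data structures and names below are illustrative assumptions for this example, not part of any standard library:

entities = {"Ram", "Chinky"}                      # individuals
concepts = {"Person", "Cat"}                      # general categories
relations = [("Ram", "is_a", "Person"),           # "Ram is a person"
             ("Chinky", "is_a", "Cat")]
predicates = [("sleeping", ("Chinky",))]          # "The cat is sleeping"

def holds(predicate, *terms):
    # Check whether a predicate applied to the given terms has been asserted.
    return (predicate, tuple(terms)) in predicates

print(holds("sleeping", "Chinky"))                # True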
13
Representational Schemes
(Approaches for meaning representation)
• We’re going to make use of First Order Logic (FOL) as
our representational framework
1. First order Predicate Logic (FOPL)
2. Semantic Nets
3. Frames
4. Conceptual Dependency (CD)
5. Rule based Architecture
6. Case grammar
7. Conceptual graphs

14
FOPL
• First-order Predicate Logic (FOPL), also known as first-order
predicate calculus, is a formal system used in mathematics,
philosophy, linguistics, and computer science to represent
statements about objects, relationships between objects, and
properties of objects. In first-order logic, statements are built
using quantifiers, variables, predicates, functions, and logical
connectives.
• Allows for…
– The analysis of truth conditions
• Allows us to answer yes/no questions
– Supports the use of variables
• Allows us to answer questions through the use of variable binding
– Supports inference
• Allows us to answer questions that go beyond what we know explicitly

15
First-Order logic:
• First-order logic is another way of knowledge representation in artificial
intelligence. It is an extension to propositional logic.

• FOL is sufficiently expressive to represent natural language
statements in a concise way.

• First-order logic is also known as predicate logic or first-order
predicate logic. It is a powerful language that represents
information about objects in a straightforward way and can also
express the relationships between those objects.

16
First-order logic (like natural language) does not only assume that the
world contains facts, as propositional logic does, but also assumes the
following things exist in the world:
• Objects: A, B, people, numbers, colors, wars, theories, squares,
pits, ...
• Relations: unary relations such as red, round, is adjacent, or
n-ary relations such as the sister of, brother of, has color,
comes between
• Functions: father of, best friend, third inning of, end of, ...
• Like a natural language, first-order logic also has two main parts:

a. Syntax
b. Semantics

17
Cont..
• Real-world facts can be represented as logical propositions
using logical symbols, covering:
1. Objects
2. Properties
3. Relations
• Symbols are formed from the following:
-- the set of uppercase English letters
-- the set of digits 0 to 9
-- the underscore (_) special symbol

18
Predicate-Argument Structure
• Predicates
– Primarily Verbs, VPs, Sentences
– Sometimes Nouns and NPs
• Arguments
– Primarily Nouns, Nominals, NPs, PPs

19
Syntax of First-Order logic:

• The syntax of FOL determines which collections of symbols are
legal logical expressions in first-order logic. The basic syntactic
elements of first-order logic are symbols. We write statements in
shorthand notation in FOL.

20
Basic Elements of First-order logic:
• Following are the basic elements of FOL syntax:

Constant: 1, 2, A, John, Mumbai, cat, ...

Variables: x, y, z, a, b, ...

Predicates: Brother, Father, >, ...

Function: sqrt, LeftLegOf, ...

Connectives: ∧, ∨, ¬, ⇒, ⇔

Equality: ==

Quantifier: ∀, ∃

21
Atomic Sentences
• Atomic sentences are the most basic sentences of first-
order logic. These sentences are formed from a
predicate symbol followed by a parenthesis with a
sequence of terms.
• We can represent atomic sentences as
• Predicate(term1, term2, ..., termN).
• Example:
Ravi and Ajay are brothers: => Brothers(Ravi,Ajay).

Chinky is a cat: => cat (Chinky).
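A small illustrative sketch of how atomic sentences like these could be represented and checked programmatically; the helper and fact names are made up for the example:

# Atomic sentences: Predicate(term1, ..., termN) stored as tuples.
facts = {
    ("Brothers", ("Ravi", "Ajay")),
    ("cat", ("Chinky",)),
}

def atom(predicate, *terms):
    # Build an atomic sentence as a (predicate, terms) pair.
    return (predicate, terms)

print(atom("cat", "Chinky") in facts)             # True
print(atom("Brothers", "Ajay", "Ravi") in facts)  # False: argument order matters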

22
Complex Sentences:
• Complex sentences are made by combining atomic
sentences using connectives.
• First-order logic statements can be divided into two parts:
Subject: Subject is the main part of the statement.
Predicate: A predicate can be defined as a relation,
which binds two atoms together in a statement.

Consider the statement "x is an integer." It consists of two parts:
the first part, x, is the subject of the statement, and the second
part, "is an integer", is known as the predicate.

23
Components of FOL
1. Quantifiers specify the scope of variables in a statement. The two most
common quantifiers are the existential quantifier (∃), which means "there
exists," and the universal quantifier (∀), which means "for all.“
2. Variables are symbols that represent elements of a domain. They can stand
for objects, numbers, or other entities.
3. Predicates are symbols that represent properties or relations between
objects. They are used to make statements about objects in the domain. For
example, "P(x)" might represent the property "x is red."
4. Functions are symbols that represent operations that take one or more
inputs and produce an output. They are used to represent relationships
between objects. For example, "f(x)" might represent the function "double x."
5. Logical connectives, such as conjunction (AND), disjunction (OR), negation
(NOT), implication (IF...THEN), and biconditional (IF AND ONLY IF), are used
to combine simpler statements into more complex ones.
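For instance, the statement ∀x (cat(x) → mammal(x)), "every cat is a mammal", combines a quantifier (∀), a variable (x), two predicates (cat and mammal), and the implication connective (→).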

24
Defining a Sentence
• Every atomic sentence is a sentence; larger sentences are built by the following rules:
1. If S is a sentence, then ¬S is a sentence.
2. If S1 and S2 are sentences, then S1 ∧ S2, S1 ∨ S2, S1 ⇒ S2, and S1 ⇔ S2 are sentences.

25
Meaning Structure of Language

• Natural languages convey meaning through the use of


– Predicate-argument structures
– Variables
– Quantifiers
– Compositional semantics

26
Predicate-Argument Structure
• Events, actions and relationships can be captured with
representations that consist of predicates and arguments
to those predicates.
• Languages display a division of labor where some words
and constituents (typically) function as predicates and
some as arguments.

27
Quantifiers in First-order logic:
• A quantifier is a language element which generates quantification,
and quantification specifies the quantity of specimens in the universe
of discourse. Quantifiers are the symbols that permit us to determine
or identify the range and scope of a variable in a logical
expression.
• There are two types of quantifier:

a. Universal Quantifier, (for all, everyone, everything)


b. Existential quantifier, (for some, at least one).
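For example, "Every student likes NLP" uses the universal quantifier, ∀x (student(x) → likes(x, NLP)), while "Some student likes NLP" uses the existential quantifier, ∃x (student(x) ∧ likes(x, NLP)).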

28
Universal Quantifier:
• Universal quantifier is a symbol of logical representation, which
specifies that the statement within its range is true for everything or
every instance of a particular thing.
• The Universal quantifier is represented by a symbol ∀, which
resembles an inverted A.
Note: In universal quantifier we use implication "→". If x is a variable,
then ∀x is read as:
• For all x
• For each x
• For every x

29
Example
• Mary gave a list to John.
• Giving(Mary, John, List)
• More precisely
– Gave conveys a three-argument predicate
– The first argument is the subject
– The second is the recipient, which is conveyed by the NP inside
the PP
– The third argument is the thing given, conveyed by the direct
object

30
Note
• Giving(Mary, John, List) is pretty much the same as
– Subj(Giving, Mary), Obj(Giving, List), IndObj(Giving, John)
– Which should look an awful lot like.... what?

31
Better
• Turns out this representation isn’t quite as useful as
it could be.
• Better would be
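– Roughly, an event-based representation along these lines:
∃e Giving(e) ∧ Giver(e, Mary) ∧ Givee(e, John) ∧ Given(e, List),
in which the giving event itself becomes an object that the participant roles attach to.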

Speech and Language Processing - Jurafsky and Martin 32


Predicates
• The notion of a predicate just got more complicated…
• In this example, think of the verb/VP providing a
template like the following

• The semantics of the NPs and the PPs in the sentence plug into
the slots provided in the template.

Speech and Language Processing - Jurafsky and Martin 33


Two Issues
• How can we create this kind of representation in a principled
and efficient way?
• What makes that representation a “meaning” representation, as
opposed, say, to a parse tree?

Speech and Language Processing - Jurafsky and Martin 34


Semantic Analysis
• Semantic analysis is the process of taking in some
linguistic input and assigning a meaning representation
to it.
– There are a lot of different ways to do this that make more or less
(or no) use of syntax
– We’re going to start with the idea that syntax does matter
• The compositional rule-to-rule approach

Speech and Language Processing - Jurafsky and Martin 35


Compositional Analysis
• Principle of Compositionality
– The meaning of a whole is derived from the meanings of the
parts
• What parts?
– The constituents of the syntactic parse of the input
• What could it mean for a part to have a meaning?

Speech and Language Processing - Jurafsky and Martin 36


Example
• Franco likes Frasca.
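– The target meaning representation here is, roughly, Likes(Franco, Frasca), built up from the meanings of the subject NP, the verb, and the object NP.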

Speech and Language Processing - Jurafsky and Martin 37


Compositional Analysis

Speech and Language Processing - Jurafsky and Martin 38


Augmented Rules
• We’ll accomplish this by attaching semantic formation
rules to our syntactic CFG rules
• Abstractly
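– Roughly: A → α1 … αn   {f(α1.sem, …, αn.sem)}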

• This should be read as: the semantics we attach to A can be
computed from some function applied to the semantics of A’s parts.

Speech and Language Processing - Jurafsky and Martin 39


Example

• Easy parts first: the attachments
– NP -> PropNoun {PropNoun.sem}
– PropNoun -> Frasca {Frasca}
– PropNoun -> Franco {Franco}

Speech and Language Processing - Jurafsky and Martin 40


Lambda Forms
• A simple addition to FOL:
– Take a FOL sentence with variables in it that are to be bound.
– Allow those variables to be bound by treating the lambda form
as a function with formal arguments.
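A minimal sketch using ordinary Python lambdas as a stand-in for FOL lambda forms; the S rule at the end is an assumption added for illustration, mirroring the VP rule shown later:

# Lambda form for the verb "likes":  \y. \x. Likes(x, y)
likes = lambda y: (lambda x: f"Likes({x},{y})")

# Semantic attachments: PropNoun -> Franco {Franco}, PropNoun -> Frasca {Frasca}
franco, frasca = "Franco", "Frasca"

# VP -> Verb NP  {Verb.sem(NP.sem)} : apply the verb's lambda to the object NP
vp_sem = likes(frasca)        # corresponds to \x. Likes(x, Frasca)

# S -> NP VP  {VP.sem(NP.sem)} : apply the VP's lambda form to the subject NP
print(vp_sem(franco))         # Likes(Franco,Frasca)

Each application step mirrors a lambda reduction: binding the formal argument replaces the variable throughout the body.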

Speech and Language Processing - Jurafsky and Martin 41


Compositional Semantics by
Lambda Application

Speech and Language Processing - Jurafsky and Martin 42


Lambda Applications and
Reductions

Speech and Language Processing - Jurafsky and Martin 43


Lambda Applications and
Reductions

VP → Verb NP   {Verb.sem(NP.sem)}   (applied here to the NP Frasca)

Speech and Language Processing - Jurafsky and Martin 44


Complex NPs
• Things get quite a bit more complicated when we start
looking at more complicated NPs
– Such as...
• A menu
• Every restaurant
• Not every waiter
• Most restaurants
• All the morning non-stop flights to Houston

Speech and Language Processing - Jurafsky and Martin 45


Quantifiers
• Contrast...
– Frasca closed

• With
– Every restaurant closed
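• In FOL these come out quite differently: roughly Closed(Frasca) for the first, but ∀x (Restaurant(x) → Closed(x)) for the second, which is why quantified NPs need more machinery than simple constants.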

Speech and Language Processing - Jurafsky and Martin 46
