
Introduction To NLP

• Natural Language Processing:

• Natural Language Processing (NLP) is a sub-field of Computer Science, specifically Artificial
Intelligence (AI), that is concerned with enabling computers to understand and process
human language.
• The ultimate goal of NLP is to help computers understand human language as well as we do.
• NLP focuses on communication between computers and humans in natural language: it is
all about making computers understand and generate human language.
• Natural Language Understanding (NLU):

• Definition: Natural Language Understanding (NLU) is a branch of artificial
intelligence (AI) that focuses on the ability of a computer system to
understand, interpret, and respond to human language in a meaningful way.
• Explanation: NLU allows computers to comprehend the context and meaning
behind human language. Unlike simple text processing, which might only
recognize words and sentences, NLU involves deeper comprehension,
including understanding the intent behind the words, identifying entities (like
names of people, places, or organizations), and extracting relevant
information.
• NLU involves several complex tasks:

• Tokenization: Breaking down a text into individual words or phrases.


• Named Entity Recognition (NER): Identifying and classifying key elements in
the text, such as names of people, organizations, dates, etc.
• Sentiment Analysis: Determining the sentiment or emotional tone of the
text, whether it’s positive, negative, or neutral.
• Intent Recognition: Understanding what the user intends to achieve with
their query or statement.
• Context Understanding: Grasping the context in which the language is used
to provide appropriate responses.
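The first of these tasks, tokenization, can be sketched with a few lines of Python. This is a minimal regex-based tokenizer for illustration only; real NLP toolkits use far more careful rules for abbreviations, contractions, and so on.

```python
import re

def tokenize(text):
    """Split text into word and punctuation tokens (toy regex tokenizer)."""
    return re.findall(r"\w+|[^\w\s]", text)

tokens = tokenize("NLU breaks text into units, then interprets them.")
print(tokens)
```

Note how the comma and full stop come out as separate tokens rather than sticking to the preceding word, which is what the later analysis steps need.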
• Vocabulary: A vocabulary is a collection of words.
• Text: A text is made up of a sequence of words from a vocabulary.
• Language: A language is made up of all possible texts. It's a method of
communication that allows us to speak, read, and write.

• NATURAL LANGUAGE PROCESSING:


• Natural language processing studies interactions between humans and
computers to find ways for computers to process written and spoken words
similar to how humans do. The field blends computer science, linguistics and
machine learning.
• Speech recognition — the translation of spoken language into text.
• Natural language understanding — a computer’s ability to understand language.
• Natural language generation — the generation of natural language by a
computer.
• Why NLP is Difficult:

• Complexity: Human language is very complex, making it difficult to


understand.
• Infinite Arrangements: There are countless ways to arrange words in a
sentence.
• Multiple Meanings: Words can have different meanings, so context is needed
to understand sentences correctly.
• Uniqueness and Ambiguity: Every language has its own unique and often
ambiguous aspects.
• Syntactic and Semantic Analysis:
• Syntactic analysis (syntax) and semantic analysis (semantics) are the main
techniques for understanding natural language. Syntax looks at the
grammatical structure of the text, while semantics focuses on the meaning of
the words and sentences.

• Syntactic analysis:
• Syntactic analysis, also known as syntax analysis or parsing, is the process of
examining natural language using formal grammar rules. It assigns a structure
to the text. For instance, in a sentence, the subject is a noun phrase, and the
predicate is a verb phrase.
• Example: In the sentence "The dog went away," "The dog" is the noun phrase
(subject) and "went away" is the verb phrase (predicate).
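The structure that syntactic analysis assigns can be written down as a tree. Below is one hand-built sketch of that tree for "The dog went away", using nested Python tuples; the part-of-speech labels (Det, N, V, Adv) are an assumption for illustration, not the output of a real parser.

```python
# Toy constituency structure for "The dog went away":
# S -> NP VP, where NP is the subject and VP is the predicate.
tree = ("S",
        ("NP", ("Det", "The"), ("N", "dog")),
        ("VP", ("V", "went"), ("Adv", "away")))

def leaves(node):
    """Collect the words at the leaves of the tree, left to right."""
    if isinstance(node, str):
        return [node]
    return [word for child in node[1:] for word in leaves(child)]

# Reading the leaves back recovers the original sentence.
print(" ".join(leaves(tree)))
```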
• SEMANTIC ANALYSIS

• We understand what people say without thinking much about it, using our
knowledge of language. To really understand language, we need to know the
meaning of words and the context they are used in. Semantic analysis is about
figuring out the meaning of words and sentences. This is very difficult and still
not fully solved in natural language processing.
• Natural Language Processing Techniques for Understanding Text:
• Parsing
• Definition: Parsing means breaking down a sentence to see what each part does.
• Explanation: It's like taking a sentence apart to understand how it's built.
Computers do this to figure out the grammar and meaning of the sentence.
• Example: In the sentence "The dog chased the cat," parsing shows that "The dog" is
the one doing something, "chased" is what it's doing, and "the cat" is the one being
chased.

• Stemming
• Definition: Stemming is about finding the main part of a word.
• Explanation: It's like looking at a word and figuring out its basic form, ignoring any
extra bits added to the beginning or end.
• Example: For words like "touching," "touched," and "touches," stemming tells us
they're all about the action of "touch."
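A crude stemmer can be sketched as suffix stripping. This is a toy rule, not a real algorithm such as Porter's; the suffix list and the minimum-stem-length check are assumptions for illustration.

```python
def stem(word):
    """Strip common English suffixes to approximate a stem (toy rule-based stemmer)."""
    for suffix in ("ing", "ed", "es", "s"):
        # Only strip if a reasonably long stem remains.
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[: -len(suffix)]
    return word

for w in ("touching", "touched", "touches"):
    print(w, "->", stem(w))
```

All three forms reduce to the same stem, "touch", which is exactly what the example above describes.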
• Text Segmentation
• Definition: Text segmentation is breaking up text into smaller parts that make
sense.
• Explanation: It's like chopping up a long piece of text into smaller,
understandable pieces so a computer can work with it better.
• Example: Most of the time, words are separated by spaces, but sometimes
words like "icebox" need special treatment because they're stuck together.
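Simple segmentation into sentences and then words can be sketched with regular expressions. This handles only the easy whitespace-and-punctuation case; compounds like "icebox", as noted above, would need extra treatment.

```python
import re

def segment(text):
    """Split text into sentences, then each sentence into words (toy segmenter)."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    return [re.findall(r"\w+", s) for s in sentences]

print(segment("Text is segmented. Each piece is processed separately."))
```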

• Named Entity Recognition (NER)

• Definition: NER is about spotting important things in text, like names of people
or places.
• Explanation: It's like a detective searching through a text for names,
organizations, or other special items, and then putting them into groups.
• Example: In "The OIC meeting was held in Islamabad," NER finds "OIC" as an
organization name and "Islamabad" as a place name.
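The simplest possible NER can be sketched with lookup lists (gazetteers). The tiny lists below are an assumption for illustration; real NER systems learn to recognize entities from annotated data rather than from fixed lists.

```python
# Toy gazetteers (assumed lists, for illustration only).
ORGS = {"OIC", "ABC Corp"}
PLACES = {"Islamabad", "Paris"}

def toy_ner(text):
    """Tag known organization and place names found in the text."""
    entities = []
    for org in ORGS:
        if org in text:
            entities.append((org, "ORG"))
    for place in PLACES:
        if place in text:
            entities.append((place, "PLACE"))
    return sorted(entities)

print(toy_ner("The OIC meeting was held in Islamabad"))
```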
• Relationship Extraction
• Definition: Relationship extraction is figuring out how things mentioned in text
are connected to each other.
• Explanation: It's like finding out who's connected to who or what, based on the
things mentioned in a text.
• Example: If a text says "Sarah works for ABC Corp," relationship extraction tells
us Sarah is linked to ABC Corp as an employee.
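A pattern-based version of this idea can be sketched with one regular expression. The "X works for Y" pattern is an assumption chosen to match the example above; real relationship extraction uses many patterns or a learned model.

```python
import re

def extract_employment(text):
    """Find (person, employer) pairs matching 'X works for Y' (toy extractor)."""
    return re.findall(r"(\w+) works for ([\w ]+?)(?:\.|,|$)", text)

print(extract_employment("Sarah works for ABC Corp. Ali works for XYZ Ltd."))
```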

• Sentiment Analysis
• Definition: Sentiment analysis is about understanding if the writer or speaker
feels positive, negative, or neutral about something.
• Explanation: It's like reading a text and figuring out if the person is happy, sad,
or just giving information.
• Example: If a review says "I love this phone," sentiment analysis knows it's
positive, but if it says "I hate this phone," it's negative. This helps businesses
know what customers think about their products.
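A lexicon-based sketch of sentiment analysis counts positive and negative cue words. The word lists below are assumptions for illustration; practical systems use large lexicons or trained classifiers and also handle negation ("not good"), which this toy version ignores.

```python
POSITIVE = {"love", "great", "excellent", "good"}
NEGATIVE = {"hate", "terrible", "bad", "awful"}

def sentiment(text):
    """Classify text by counting positive vs negative cue words (toy lexicon approach)."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(sentiment("I love this phone"))
print(sentiment("I hate this phone"))
```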
• Applications Of NLP:

• Spell and grammar checking
• Translation between languages
• Converting handwritten text to computer-readable text
• Recognizing spoken language and transforming it into text
Parsing:
• Parsing is the process of examining the grammatical structure and
relationships inside a given sentence or text in natural language
processing (NLP).
• It involves analyzing the text to determine the roles of specific words,
such as nouns, verbs, and adjectives, as well as their interrelationships:
how words in a phrase connect to one another.
• Parsers expose the structure of a sentence by constructing parse trees
or dependency trees that illustrate the hierarchical and syntactic
relationships between words.
• Types: (1) Syntactic, (2) Semantic
• Syntactic Parsing :
• Syntactic parsing deals with a sentence’s grammatical structure. It involves
looking at the sentence to determine parts of speech, sentence boundaries, and
word relationships.
Constituency Parsing:
• Constituency Parsing builds parse trees that break down a sentence into its
constituents, such as noun phrases and verb phrases.
• It displays a sentence’s hierarchical structure, demonstrating how words are
arranged into bigger grammatical units.
Dependency Parsing:
• Dependency parsing depicts grammatical links between words by constructing a
tree structure in which each word in the sentence is dependent on another.
• It is frequently used in tasks such as information extraction and machine
translation because it focuses on word relationships such as subject-verb-object
relations.
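A dependency parse can be represented as a list of (dependent, relation, head) triples. The hand-built parse below for "The dog chased the cat" is an assumption for illustration (the relation names follow common dependency-grammar conventions); it shows why this representation is handy for extracting subject-verb-object relations.

```python
# Word positions; index 0 is a virtual ROOT node.
sentence = ["ROOT", "The", "dog", "chased", "the", "cat"]
deps = [
    (1, "det",   2),  # "The"    depends on "dog"
    (2, "nsubj", 3),  # "dog"    is the subject of "chased"
    (3, "root",  0),  # "chased" attaches to ROOT
    (4, "det",   5),  # "the"    depends on "cat"
    (5, "obj",   3),  # "cat"    is the object of "chased"
]

def subject_verb_object(deps, words):
    """Read the (subject, verb, object) triple off the dependency list."""
    verb = next(d for d, r, h in deps if r == "root")
    subj = next(d for d, r, h in deps if r == "nsubj" and h == verb)
    obj = next(d for d, r, h in deps if r == "obj" and h == verb)
    return words[subj], words[verb], words[obj]

print(subject_verb_object(deps, sentence))
```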
• Semantic Parsing :
• Semantic parsing goes beyond syntactic structure to extract a sentence’s meaning or
semantics.
• It attempts to understand the roles of words in the context of a certain task and how
they interact with one another.
• Semantic parsing is utilized in a variety of NLP applications, such as question
answering, knowledge base populating, and text understanding.
• It is essential for activities requiring the extraction of actionable information from text.
• Parsing Techniques In NLP:
• The fundamental link between a sentence and its grammar is derived from a parse
tree.
• A parse tree is a tree that defines how the grammar was utilized to construct the
sentence.
• There are mainly two parsing techniques, commonly known as top-down and bottom-
up.
• S → NP VP (a sentence is a Noun Phrase followed by a Verb Phrase)

• VP → V NP (the Verb Phrase is further defined as a Verb and a Noun Phrase)

• NP → Det N | N (a Noun Phrase is a Determiner plus a Noun, or a bare Noun)
• Parsers and Its Types in NLP:

• Recursive Descent Parser: This parser starts from the top of a grammar rule
and works its way down by breaking it into smaller parts. It's like taking apart
a big rule into smaller rules until it's easy to understand. People often use this
method to write parsers for simple programming languages or specific areas.
• Shift-Reduce Parser: This parser starts from the input and builds a tree by
moving data around and applying grammar rules. It's like solving a puzzle by
shifting pieces around and fitting them together. This method is commonly
used for programming languages and uses techniques like LR or LALR.
• Chart Parser: This parser efficiently processes words using a dynamic
programming approach. It's like solving a complex problem by storing and
reusing solutions to smaller parts. An example is the Earley parser, which is
good for context-free grammars.
• Regexp Parser: This parser matches patterns and extracts text from larger
documents using regular expressions. It's like finding specific words or
phrases in a book by looking for patterns. This method is useful for tasks like
text processing and information retrieval.
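The first of these, a recursive descent parser, can be sketched for a toy grammar like the one above (S → NP VP, VP → V NP, NP → Det N | N). Each grammar rule becomes a function that consumes tokens from left to right. The tiny lexicon is an assumption for illustration.

```python
# Assumed toy lexicon mapping words to parts of speech.
LEXICON = {"the": "Det", "dog": "N", "cat": "N", "chased": "V"}

def parse(tokens):
    """Recursive-descent parse of a 'NP V NP' sentence into a nested-tuple tree."""
    tags = [LEXICON[t.lower()] for t in tokens]
    pos = 0

    def expect(tag):
        nonlocal pos
        if pos < len(tags) and tags[pos] == tag:
            pos += 1
            return (tag, tokens[pos - 1])
        raise SyntaxError(f"expected {tag} at position {pos}")

    def np():
        # NP -> Det N | N
        if pos < len(tags) and tags[pos] == "Det":
            return ("NP", expect("Det"), expect("N"))
        return ("NP", expect("N"))

    def vp():
        # VP -> V NP
        return ("VP", expect("V"), np())

    tree = ("S", np(), vp())  # S -> NP VP
    if pos != len(tags):
        raise SyntaxError("trailing tokens")
    return tree

print(parse("The dog chased the cat".split()))
```

Each nonterminal (S, NP, VP) is handled by its own function, which is exactly the "break the big rule into smaller rules" behaviour described above.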

How Does a Parser Work?


• Think of a parser like a detective for sentences. It starts by looking at a
sentence and figuring out who or what it's about (the subject) and what's
happening (the action). Then, it looks at each word and decides what kind of
word it is, like a noun, verb, or adjective. After that, it starts putting
everything together like pieces of a puzzle, following the rules of grammar to
make sure everything fits just right. Finally, it checks its work to make sure
the sentence makes sense and doesn't have any mistakes. So, a parser is like
a smart detective that breaks down sentences, figures out what each word
does, and puts it all together to make sure everything is in order.
• Representing Meanings in NLP :
• When we talk about representing meanings in Natural Language Processing
(NLP), we're basically trying to teach computers how to understand the
meaning of words and sentences, just like humans do.
• Word Representations: To start, we need to show computers what each word
means. We can do this by assigning each word a special code or vector that
represents its meaning. For example, we might use numbers to represent
words, where each number corresponds to a different word.
• Semantic Relationships: Next, we want computers to understand how words
are related to each other. For instance, we want them to know that "cat" and
"dog" are similar because they're both animals, while "car" is different
because it's a vehicle. This helps computers understand the context of a
sentence.
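Both ideas, word vectors and semantic relatedness, can be sketched with cosine similarity. The 3-dimensional vectors below are hand-made assumptions (real systems learn hundreds of dimensions from text); they are chosen so that "cat" and "dog" point in similar directions while "car" does not.

```python
import math

# Assumed toy "meaning" vectors: [animal-ness, pet-ness, vehicle-ness].
VECTORS = {
    "cat": [0.9, 0.8, 0.0],
    "dog": [0.9, 0.9, 0.1],
    "car": [0.0, 0.1, 0.95],
}

def cosine(a, b):
    """Cosine similarity: close to 1.0 means similar direction (similar meaning)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

print(round(cosine(VECTORS["cat"], VECTORS["dog"]), 3))  # high: both animals
print(round(cosine(VECTORS["cat"], VECTORS["car"]), 3))  # low: different categories
```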
• Sentence Understanding: Once computers know the meaning of
individual words and how they relate to each other, they can start
understanding entire sentences. They can look at the meanings of the
words in a sentence and piece together what the sentence is trying to
say.
• Applications: Finally, once computers can understand the meanings of
words and sentences, they can do all sorts of useful things, like answering
questions, translating languages, or even summarizing long texts.
• So, representing meanings in NLP is all about teaching computers to
understand the meaning of language, so they can do useful tasks just like
humans do.
• Semantic Roles :
• Semantic roles refer to the different jobs or functions that words play within a
sentence to convey its meaning. In simpler terms, it's like assigning roles to
words in a sentence to understand who is doing what to whom.
• Agent: The agent is the "doer" or "performer" of an action. For example, in
the sentence "The cat chased the mouse," "the cat" is the agent because it's
doing the chasing.
• Patient: The patient is the entity that undergoes the action. In the same
sentence, "the mouse" is the patient because it's the one being chased.
• Theme: The theme is what the action is directed towards or what is affected
by the action. In "John ate the apple," "the apple" is the theme because it's
what is being eaten.
• Experiencer: The experiencer is the one who perceives or experiences
something. For instance, in "Mary likes chocolate," "Mary" is the experiencer
because she is the one experiencing the liking.

• Instrument: The instrument is the means by which an action is performed.


In "She cut the paper with scissors," "scissors" is the instrument because it's
what she used to cut the paper.
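Assigning these roles by position can be sketched for very simple sentences. The single hand-written pattern below covers only "agent verb patient (with instrument)" word order and a fixed set of verbs; it is an assumption for illustration, not a real semantic role labeller.

```python
import re

def label_roles(sentence):
    """Assign agent/patient/instrument roles by position (toy rule, fixed verbs)."""
    m = re.match(r"(?i)(.+?) (cut|chased|ate) (.+?)(?: with (.+))?\.?$", sentence)
    if not m:
        return {}
    roles = {"agent": m.group(1), "action": m.group(2), "patient": m.group(3)}
    if m.group(4):
        roles["instrument"] = m.group(4)
    return roles

print(label_roles("She cut the paper with scissors"))
print(label_roles("The cat chased the mouse"))
```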

• Understanding these semantic roles helps computers grasp the meaning of


sentences more accurately, making it easier for them to process and analyze
natural language text.
