Engineering Intelligent NLP Applications Using Deep Learning – Part 1
Saurabh Kaushik
Agenda
• Part 1:
• Why NLP?
• What is NLP?
• What is Word & Sentence Modelling in NLP?
• What is Word Representation in NLP?
• What is Language Modelling in NLP?
• Part 2:
• Why DL for NLP?
• What is DL?
• What is DL for NLP?
• How does RNN work for NLP?
• How does CNN work for NLP?
WHY NLP?
What are the Generally Known NLP Applications?
• Search
• Customer Support
• Q & A
• Summarization
Are there Deeper Applications of NLP?
Group 1
Cleanup, tokenization
Stemming
Lemmatization
Part-of-speech tagging
Query expansion
Parsing
Topic segmentation and recognition
Morphological segmentation (words/sentences)
Group 2
Information retrieval (IR) and extraction
Relationship Extraction
Named entity recognition (NER)
Sentiment analysis / Sentence boundary disambiguation
Word sense disambiguation
Text similarity
Coreference resolution
Discourse analysis
Group 3
Machine translation
Automatic summarization /
Paraphrasing
Natural language generation
Reasoning over Knowledge base
Question answering System
Dialog System
Image Captioning & other multimodal
tasks
WHAT IS NLP?
• According to Wikipedia:
• Natural language processing (NLP) is a field of Computer Science and Linguistics concerned with the
• interactions between computers and human (natural) languages.
What is NLP?
So far, computing devices and their interaction with humans have been two separate things. But in a truly digital world, this gap needs to be bridged by integrating human conversational understanding into intelligent Apps/Systems/Things, in order to achieve their true potential.
Ref: https://en.wikipedia.org/wiki/Natural_language_processing
Why is Language so Challenging for Computers?
• Language is ambiguous: every sentence has many possible interpretations.
• Language is productive: we will always encounter new words or new constructions.
• Language is culturally specific: the same word can have different meanings.
• Lexical Analysis − It involves identifying and analyzing the structure of words. The lexicon of a language is the collection of words and phrases in that language. Lexical analysis divides the whole chunk of text into paragraphs, sentences, and words.
• Syntactic Analysis (Parsing) − It involves analyzing the words in a sentence for grammar and arranging them in a manner that shows the relationships among the words. A sentence such as “The school goes to boy” is rejected by an English syntactic analyzer.
• Semantic Analysis − It draws the exact meaning, or the dictionary meaning, from the text. The text is checked for meaningfulness. This is done by mapping syntactic structures to objects in the task domain. The semantic analyzer disregards sentences such as “hot ice-cream”. Also called compositional semantics.
• Discourse Integration − The meaning of any sentence depends upon the meaning of the sentence just before it. In addition, it also informs the meaning of the immediately succeeding sentence.
• Pragmatic Analysis − During this stage, what was said is re-interpreted in terms of what was actually meant. It involves deriving those aspects of language which require real-world knowledge.
What is NLP Processing?
• Grammar Parsing:
• Articles (DET) − a | an | the
• Nouns − bird | birds | grain | grains
• Noun Phrase (NP) − Article + Noun | Article + Adjective
+ Noun = DET N | DET ADJ N
• Verbs − pecks | pecking | pecked
• Verb Phrase (VP) − NP V | V NP
• Adjectives (ADJ) − beautiful | small | chirping
• POS Tagging:
• Parsing:
• S → NP VP
• NP → DET N | DET ADJ N
• VP → V NP
• Lexicon:
• DET → a | the
• ADJ → beautiful | perching
• N → bird | birds | grain | grains
• V → peck | pecks | pecking
What are the Basic Components of NLP?
“The bird pecks the grains”
Parse Tree:
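The parse-tree image from the original slide is not reproduced here, but the grammar and lexicon above are enough to derive it programmatically. A minimal sketch using NLTK (an assumed tool, not one the deck names):

```python
import nltk

# Grammar and lexicon exactly as listed on the slide.
grammar = nltk.CFG.fromstring("""
    S   -> NP VP
    NP  -> DET N | DET ADJ N
    VP  -> V NP
    DET -> 'a' | 'the'
    ADJ -> 'beautiful' | 'perching'
    N   -> 'bird' | 'birds' | 'grain' | 'grains'
    V   -> 'peck' | 'pecks' | 'pecking'
""")

parser = nltk.ChartParser(grammar)
for tree in parser.parse("the bird pecks the grains".split()):
    print(tree)
# (S (NP (DET the) (N bird)) (VP (V pecks) (NP (DET the) (N grains))))
```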
How does NLP understand Syntax?
Part of Speech – Tagging
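As a concrete illustration (not from the original slides), NLTK's default Penn Treebank tagger can tag the example sentence; the tags in the comment are what it typically produces:

```python
import nltk
# One-time downloads: nltk.download('punkt'); nltk.download('averaged_perceptron_tagger')

tokens = nltk.word_tokenize("The bird pecks the grains")
print(nltk.pos_tag(tokens))
# [('The', 'DT'), ('bird', 'NN'), ('pecks', 'VBZ'), ('the', 'DT'), ('grains', 'NNS')]
```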
WHAT IS WORD & SENTENCE MODELLING IN NLP?
• What is the meaning of words?
• Most words have many different senses:
• E.g. dog = animal or sausage?
How does NLP get Word Meanings?
Word Meaning:
• Polysemy:
• A lexeme is polysemous if it has different related senses
• E.g. bank = financial institution or building
• Homonyms:
• Two lexemes are homonyms if their senses are
unrelated, but they happen to have the same spelling
and pronunciation
• E.g. bank = (financial) bank or (river) bank
• How are the meanings of different words related?
• Specific relations between senses:
• E.g. Animal is more general than dog.
• Semantic fields:
• E.g. money is related to bank
How does NLP get Word Relationships?
Word Relationships:
 Symmetric Relations:
– Synonyms: couch/sofa
 Two lemmas with the same sense
– Antonyms: cold/hot, rise/fall, in/out
 Two lemmas with the opposite sense
 Hierarchical relations:
 Hypernyms and Hyponyms: pet/dog
– The hyponym (dog) is more specific than the
hypernym (pet)
 Holonyms and Meronyms: car/wheel
– The meronym (wheel) is a part of the holonym (car)
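These sense relations are exactly what WordNet encodes. A hedged sketch using NLTK's WordNet interface (an assumed tool, not one the deck names); the exact synsets printed depend on the installed WordNet version:

```python
from nltk.corpus import wordnet as wn
# One-time download: nltk.download('wordnet')

dog = wn.synsets('dog')[0]       # first sense: the animal
print(dog.hypernyms())           # more general senses, e.g. canine, domestic_animal
print(dog.hyponyms()[:3])        # more specific senses, e.g. puppy, hunting_dog

wheel = wn.synsets('wheel')[0]
print(wheel.part_holonyms())     # wholes a wheel is part of, e.g. wheeled_vehicle
```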
• Principle of compositionality:
• “The meaning (vector) of a complex expression (sentence) is determined by:
• the meanings of its constituent expressions (words) and
• the rules (grammar) used to combine them.”
How does NLP get Sentence Composability?
• Scene Parsing:
• The meaning of a scene image is also a function of its smaller regions,
• how they combine to form larger objects,
• and how those objects interact.
• Sentence Parsing:
• The meaning of a sentence is a function of its words,
• how they combine to form larger sentences,
• and how words interact in a given sentence.
WHAT IS WORD REPRESENTATION IN NLP?
What is basic Linear Representation of Words?
Definition
• Documents are treated as a “bag” of words or
terms.
• Any document can be represented as a vector: a
list of terms and their associated weights
Pros
• Simple Model to start with
Cons
• Disregarding grammar (term.baseform?)
• Disregarding word order (term.position)
• Keeping only multiplicity (term.frequency)
• Less Accurate
Technique: TF-IDF
• Term frequency – inverse document frequency
• TF is the term frequency in a document, i.e. a measure of how much information the term carries in one document
• IDF is the inverse document frequency of the term, i.e. an inverse measure of how much information the term carries across all documents (the corpus)
• Formula: tfidf(t, d, D) = tf(t, d) × idf(t, D), where idf(t, D) = log( |D| / |{d ∈ D : t ∈ d}| )
• t - term, d - one document, D - all documents
Bag of Words
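A minimal sketch of the formula above in plain Python (the toy corpus is invented for illustration; a production system would more likely use something like scikit-learn's TfidfVectorizer):

```python
import math

def tf(t, d):
    """Term frequency of term t in document d (a list of tokens)."""
    return d.count(t) / len(d)

def idf(t, D):
    """Inverse document frequency of t across the corpus D."""
    return math.log(len(D) / sum(1 for d in D if t in d))

def tfidf(t, d, D):
    return tf(t, d) * idf(t, D)

D = [doc.split() for doc in [
    "the bird pecks the grains",
    "the birds fly south",
    "grains of rice",
]]
print(tfidf("grains", D[0], D))  # appears in 2 of 3 docs -> modest weight
print(tfidf("pecks", D[0], D))   # appears in 1 of 3 docs -> higher weight
```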
• Statistical Modeling
• Word ordering information lost
• Data sparsity
• Words as atomic symbols
• Very hard to find higher level features
• Features other than BOW
What is Distributed Representation?
Neural Network Modeling
• Trained in a completely unsupervised way
• Reduce data sparsity
• Semantic Hashing
• Appear to carry semantic information about the words
• Freely available for Out of Box usage
Linguistic items with similar distributions have similar meanings. Distributed representations are generally based on co-occurrence/context, following the distributional hypothesis: distributional meaning as a co-occurrence vector.
What is One Hot Encoding?
Definition:
• The vast majority of rule-based and statistical NLP work regards words as atomic symbols.
• Form a vocabulary of words that maps lemmatized words to a unique ID (the position of the word in the vocabulary).
• Typical vocabulary sizes vary between 10,000 and 250,000.
• The one-hot vector of an ID is a vector filled with 0s, except
for a 1 at the position associated with the ID.
• ex.: for vocabulary size D=10, the one-hot vector of word
ID w=4 is e(w) = [ 0 0 0 1 0 0 0 0 0 0 ]
• A one-hot encoding makes no assumption about word
similarity. All words are equally different from each other.
Pros
• Simplicity
Cons
• Notion of word similarity is undefined with one-hot encoding
social [0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0]
public [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0]
• Impossible to generalize to unseen words
• One-hot encoding can be memory inefficient
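A tiny sketch of one-hot encoding and the similarity problem it causes (the toy vocabulary is invented for illustration):

```python
import numpy as np

vocab = {"a": 0, "bird": 1, "grain": 2, "pecks": 3, "public": 4, "social": 5, "the": 6}

def one_hot(word):
    e = np.zeros(len(vocab))
    e[vocab[word]] = 1.0
    return e

print(one_hot("pecks"))                       # [0. 0. 0. 1. 0. 0. 0.]
# Any two distinct words are orthogonal, so similarity is undefined:
print(one_hot("social") @ one_hot("public"))  # 0.0
```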
• One of the most successful ideas of modern statistical NLP!
What is Word Embedding?
“You shall know a word by the company it keeps”
(J. R. Firth 1957)
(Illustration: the words that appear in the context of “banking” come to represent it.)
Definition:
• Helps to find syntactic as well as semantic similarity
Pros
• Simplicity
• Possible to generalize to unseen words
Cons
• All words are equal, but some words are more equal than
others.
What is Word Embedding?
Cosine similarity
Vector Representation
• Maps each document in a corpus to an n-dimensional vector, where n is the size of the vocabulary.
• Represents each unique word as a dimension, where the magnitude along this dimension is the count of that word in the document.
• Given such vectors a, b, …, we can compute the vector dot product and the cosine of the angle between them: cos(θ) = (a · b) / (‖a‖ ‖b‖).
• The angle is a measure of alignment between two vectors, and hence of similarity.
• An example of its use in information retrieval: vectorize both the query string q and the documents, and find similarity(q, di) for all i from 1 to n.
Word2Vec Vector for “Sweden”
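A short sketch of the cosine computation described above (the count vectors are invented for illustration):

```python
import numpy as np

def cosine(a, b):
    return (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Toy count vectors over a 5-word vocabulary.
query = np.array([1, 0, 1, 0, 0])
doc1  = np.array([2, 0, 1, 0, 1])
doc2  = np.array([0, 3, 0, 1, 0])
print(cosine(query, doc1))  # ~0.87: shared dimensions, well aligned
print(cosine(query, doc2))  # 0.0: no shared dimensions
```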
What is Word Embedding?
A classic example showing how vectors can help a computer understand semantic relationships between the words of a language (e.g. king − man + woman ≈ queen).
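A sketch of that analogy using gensim's pretrained vectors (gensim and this particular model name are assumptions of this write-up, not something the deck specifies):

```python
import gensim.downloader as api

# Downloads pretrained GloVe vectors (~66 MB) on first use.
wv = api.load("glove-wiki-gigaword-50")
print(wv.most_similar(positive=["king", "woman"], negative=["man"], topn=1))
# Typically [('queen', ...)]
```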
WHAT IS LANGUAGE MODELLING IN NLP?
• A language model is a probabilistic model that assigns probabilities to any sequence of words p(w1, ..., wT)
• Language modelling is the task of learning a language model that assigns high probabilities to well-formed sentences
• Plays a crucial role in speech recognition and machine translation systems
• There are four types of language modelling:
• Linear Language Modelling – addressed by finding the probability of a word appearing in a corpus
• Statistical Language Modelling – addressed by finding the probability of a word in a sequence / in the presence of other words
• Neural Language Modelling – addressed by understanding the context of a word from its neighbours
• Recursive Language Modelling – addressed by understanding the sequence of words appearing one after another
What is Language Modeling?
• An n-gram is a sequence of n words
• unigrams (n=1): “is”, “a”, “sequence”, etc.
• bigrams (n=2): [“is”, “a”], [“a”, “sequence”], etc.
• trigrams (n=3): [“is”, “a”, “sequence”], [“a”, “sequence”, “of”], etc.
• n-gram models estimate the conditional probability of a word from n-gram counts
What is Linear Language Modelling? (N-Gram)
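A minimal sketch of a bigram model estimated from counts (the toy corpus is invented for illustration):

```python
from collections import Counter

tokens = "this is a sequence of words and this is a sequence".split()
unigrams = Counter(tokens)
bigrams = Counter(zip(tokens, tokens[1:]))

def p(w, prev):
    """MLE estimate: P(w | prev) = count(prev, w) / count(prev)."""
    return bigrams[(prev, w)] / unigrams[prev]

print(p("sequence", "a"))  # 2/2 = 1.0
print(p("is", "this"))     # 2/2 = 1.0
```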
What is Statistical Language Modelling?
• Problem:
• How can we handle the co-occurrence of words in our models?
• Solution:
• Using probabilistic modelling, any co-occurrence of words can be modelled.
• A language model is a probabilistic model that assigns
probabilities to any sequence of words p(w1, ... ,wT)
• Language modeling is the task of learning a language
model that assigns high probabilities to well formed
sentences
• Plays a crucial role in speech recognition and machine
translation systems
• Language models define probability distributions over
(natural language) strings or sentences
• Joint and Conditional Probability
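Concretely, the joint probability factorizes by the chain rule, p(w1, ..., wT) = Π_t p(wt | w1, ..., wt−1), and a bigram model truncates each conditional to the previous word. A short sketch reusing the p(w, prev) estimator from the n-gram example above (the <s>/</s> boundary markers are an assumption, and unseen bigrams would need smoothing before taking the log):

```python
import math

def sentence_logprob(sentence, p):
    """Log P(sentence) under a bigram model with boundary markers."""
    toks = ["<s>"] + sentence.split() + ["</s>"]
    return sum(math.log(p(w, prev)) for prev, w in zip(toks, toks[1:]))
```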
• Problem:
• How can we handle the context of language in our models?
• Solution:
• Neural networks can theoretically (given enough units) approximate “any” function and fit “any” kind of data.
• Efficient for NLP: hidden layers can be used as word lookup tables
• Dense distributed word vectors + efficient NN training algorithms: can scale to billions of words!
Neural Language Modelling
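A minimal sketch of a feed-forward (Bengio-style) neural language model in PyTorch (an assumed framework; all sizes are arbitrary). The embedding layer is exactly the “word lookup table” mentioned above:

```python
import torch
import torch.nn as nn

class FeedForwardLM(nn.Module):
    """Predict the next word from the previous `context` words."""
    def __init__(self, vocab_size, emb_dim=32, context=2, hidden=64):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)   # the word lookup table
        self.net = nn.Sequential(
            nn.Linear(context * emb_dim, hidden),
            nn.Tanh(),
            nn.Linear(hidden, vocab_size),             # logits over the next word
        )

    def forward(self, ctx):                 # ctx: (batch, context) word IDs
        e = self.emb(ctx).flatten(1)        # concatenate the context embeddings
        return self.net(e)

model = FeedForwardLM(vocab_size=10_000)
ctx = torch.randint(0, 10_000, (4, 2))      # a batch of 4 two-word contexts
print(model(ctx).shape)                     # torch.Size([4, 10000])
```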
• Problem:
• How do we handle the compositionality of language in our models?
• Solution:
• Recursion: the same operator (the same parameters) is applied repeatedly to different components. Applied over word sequences this gives Recurrent Neural Networks, and over trees Recursive Neural Networks (both abbreviated RNN).
What is Recursive Language Modelling?
Recursive Neural Networks (RNN)
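A bare-bones sketch of one recurrent step in NumPy (dimensions, weights, and inputs are invented for illustration); the point is that the same parameters Wxh, Whh, bh are reused at every position in the sequence:

```python
import numpy as np

def rnn_step(x_t, h_prev, Wxh, Whh, bh):
    """One recurrent step: the same parameters at every position."""
    return np.tanh(Wxh @ x_t + Whh @ h_prev + bh)

D, H = 8, 16                                  # word-vector and hidden sizes (arbitrary)
rng = np.random.default_rng(0)
Wxh = rng.normal(scale=0.1, size=(H, D))
Whh = rng.normal(scale=0.1, size=(H, H))
bh = np.zeros(H)

h = np.zeros(H)
for x_t in rng.normal(size=(5, D)):           # a 5-word sentence of D-dim word vectors
    h = rnn_step(x_t, h, Wxh, Whh, bh)        # h composes the sequence read so far
print(h.shape)                                # (16,)
```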
Thank You
Saurabh Kaushik