Endsem NLP IMPORTANT QUESTIONS

### **UNIT I: Introduction to NLP**

1) What are the main challenges in Natural Language Processing?
2) How does NLP differ from other fields of artificial intelligence?
3) Explain the difference between grammar-based and statistical language models.
4) What are the advantages and limitations of using a statistical language model?
5) Describe how N-grams are used in language modeling.
6) Explain the role of regular expressions in NLP and provide examples of how
they are used in language processing tasks.
7) Explain the working of finite-state automata in the context of NLP.
8) What are the key components of English morphology?
9) Describe how transducers are used in morphology processing.
10) Explain the process of tokenization and why it is important for NLP tasks.
11) How is the minimum edit distance algorithm used for spelling correction? (A worked sketch follows this list.)
12) What is the significance of tokenization in tasks such as POS tagging and
parsing?
13) Explain the concept of a grammar-based language model and how it functions
in NLP.
14) Explain the challenges associated with detecting and correcting spelling
errors in NLP.
15) Explain with an example how bigram probability is calculated.
16) Explain how perplexity is calculated when evaluating N-gram models. (A
worked sketch covering questions 15 and 16 follows this list.)
17) Explain English morphology in NLP.
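A minimal sketch for question 11: the standard dynamic-programming formulation of minimum edit distance (Levenshtein distance with unit costs), applied to rank candidate corrections. The word list and the misspelling below are made up purely for illustration.

```python
def min_edit_distance(source: str, target: str) -> int:
    """Minimum number of insertions, deletions and substitutions (unit costs)
    needed to turn `source` into `target`."""
    n, m = len(source), len(target)
    # dp[i][j] = cost of transforming source[:i] into target[:j]
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        dp[i][0] = i                      # delete all i characters
    for j in range(m + 1):
        dp[0][j] = j                      # insert all j characters
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            sub = 0 if source[i - 1] == target[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,          # deletion
                           dp[i][j - 1] + 1,          # insertion
                           dp[i - 1][j - 1] + sub)    # substitution / copy
    return dp[n][m]

# A simple speller ranks dictionary words by their distance to the typo:
dictionary = ["giraffe", "graft", "grail", "garage"]
typo = "graffe"
print(sorted(dictionary, key=lambda w: min_edit_distance(typo, w)))
```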

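A minimal sketch for questions 15 and 16: maximum-likelihood bigram probabilities estimated from a three-sentence toy corpus (with `<s>`/`</s>` boundary markers), followed by the perplexity of a test sentence. The corpus and the test sentence are assumptions made only for illustration.

```python
import math
from collections import Counter

# Toy corpus with sentence-boundary markers (assumed for this illustration)
corpus = [
    "<s> i like nlp </s>",
    "<s> i like deep learning </s>",
    "<s> nlp is fun </s>",
]

unigram_counts = Counter()
bigram_counts = Counter()
for sentence in corpus:
    tokens = sentence.split()
    unigram_counts.update(tokens)
    bigram_counts.update(zip(tokens, tokens[1:]))

def bigram_prob(prev, word):
    """MLE estimate: P(word | prev) = count(prev, word) / count(prev)."""
    return bigram_counts[(prev, word)] / unigram_counts[prev]

# e.g. P(like | i) = count(i, like) / count(i) = 2 / 2 = 1.0
print(bigram_prob("i", "like"))

def perplexity(sentence):
    """PP(W) = P(w_1 .. w_N) ** (-1/N), computed in log space for stability.
    N here is the number of bigram transitions in the sentence."""
    tokens = sentence.split()
    pairs = list(zip(tokens, tokens[1:]))
    log_prob = sum(math.log(bigram_prob(a, b)) for a, b in pairs)
    return math.exp(-log_prob / len(pairs))

# An unseen bigram would get probability 0 here and break the computation,
# which is exactly the problem that smoothing (Unit II) addresses.
print(perplexity("<s> i like nlp </s>"))
```
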
### **UNIT II: Word Level Analysis**


1) What is an unsmoothed N-gram model, and how does it differ from smoothed
models?
2) How would you evaluate the performance of an N-gram model?
3) Describe the techniques of smoothing, interpolation, and backoff in the
context of N-grams.
4) How does backoff improve the performance of a language model?
5) Explain the different approaches to Part-of-Speech (PoS) tagging: Rule-based,
Stochastic, and Transformation-based tagging.
6) What are the main challenges in PoS tagging?
7) Compare Hidden Markov Models (HMM) and Maximum Entropy models for PoS
tagging.
8) Explain how the Viterbi algorithm works in the context of HMMs. (A worked
sketch follows this list.)
9) How is the Expectation-Maximization (EM) algorithm used for training in NLP?
10) What is smoothing in the context of N-gram models, and why is it necessary?
Provide an example to show how smoothing works. (A worked sketch follows this list.)
11) Discuss the Hidden Markov Model in detail.
12) What is ambiguity in a parse tree? Explain with an example.
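A minimal sketch for question 8: the Viterbi algorithm decoding the most probable tag sequence for a two-tag HMM PoS tagger. The tag set and every probability below are toy assumptions, and the 1e-6 floor for unseen emissions is a simplification.

```python
def viterbi(words, tags, start_p, trans_p, emit_p):
    # v[t][tag] = probability of the best tag sequence ending in `tag` at step t
    v = [{}]
    backptr = [{}]
    for tag in tags:
        v[0][tag] = start_p[tag] * emit_p[tag].get(words[0], 1e-6)
        backptr[0][tag] = None
    for t in range(1, len(words)):
        v.append({})
        backptr.append({})
        for tag in tags:
            # Best previous tag to transition from
            best_prev = max(tags, key=lambda p: v[t - 1][p] * trans_p[p][tag])
            v[t][tag] = (v[t - 1][best_prev] * trans_p[best_prev][tag]
                         * emit_p[tag].get(words[t], 1e-6))
            backptr[t][tag] = best_prev
    # Backtrace from the most probable final tag
    last = max(tags, key=lambda tag: v[-1][tag])
    path = [last]
    for t in range(len(words) - 1, 0, -1):
        path.append(backptr[t][path[-1]])
    return list(reversed(path))

# Toy model: noun (N) vs verb (V), all numbers assumed for illustration
tags = ["N", "V"]
start_p = {"N": 0.7, "V": 0.3}
trans_p = {"N": {"N": 0.3, "V": 0.7}, "V": {"N": 0.6, "V": 0.4}}
emit_p = {"N": {"dogs": 0.6, "bark": 0.1}, "V": {"dogs": 0.05, "bark": 0.7}}
print(viterbi(["dogs", "bark"], tags, start_p, trans_p, emit_p))  # ['N', 'V']
```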

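A minimal sketch for question 10: add-one (Laplace) smoothing of bigram probabilities, showing how an unseen bigram receives a small non-zero probability. All counts and the vocabulary are toy assumptions.

```python
from collections import Counter

# Toy counts (assumed). V is the vocabulary size used by add-one smoothing.
bigram_counts = Counter({("i", "like"): 2, ("like", "nlp"): 1})
unigram_counts = Counter({"i": 2, "like": 2, "nlp": 2})
V = len(unigram_counts)

def laplace_bigram_prob(prev, word):
    """P_Laplace(word | prev) = (count(prev, word) + 1) / (count(prev) + V)."""
    return (bigram_counts[(prev, word)] + 1) / (unigram_counts[prev] + V)

print(laplace_bigram_prob("i", "like"))  # seen:   (2 + 1) / (2 + 3) = 0.6
print(laplace_bigram_prob("i", "nlp"))   # unseen: (0 + 1) / (2 + 3) = 0.2
```
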
### **UNIT III: Syntactic Analysis**


1) Define Context-Free Grammar (CFG) and explain its role in syntactic analysis.
2) How are grammar rules constructed for English in CFG?
3) What is Dependency Grammar, and how does it differ from CFG?
4) Explain how syntactic parsing works and the challenges involved.
5) What is syntactic ambiguity, and how is it handled in NLP?
6) Describe the role of Dynamic Programming (DP) in parsing algorithms.
7) How does a probabilistic CFG differ from a standard CFG?
8) Explain the CYK (Cocke-Younger-Kasami) parsing algorithm and its significance
in syntactic analysis. (A worked sketch follows this list.)
9) What are feature structures in syntax, and how are they unified during
parsing?
10) How do CFGs capture the syntactic structure of sentences, and what are the
key components of a CFG?
11) Explain the process of syntactic
12) What is the process involved in constructing treebanks, and how do they play
a role in both the development and assessment of syntactic parsers?
13) Discuss Probabilistic CYK and CFG in detail.
14) Explain quantifiers in detail with an example.
15) What do you mean by unification of feature structures?
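A minimal sketch for question 8: a CYK recognizer over a tiny grammar in Chomsky Normal Form; the grammar and the sentence are assumptions for illustration. The probabilistic variant asked about in question 13 stores, instead of a set of non-terminals per cell, the best probability (and a backpointer) for each non-terminal.

```python
# Binary rules A -> B C and lexical rules A -> w (toy grammar, assumed)
binary_rules = [
    ("S", "NP", "VP"),
    ("VP", "V", "NP"),
    ("NP", "Det", "N"),
]
lexical_rules = [
    ("Det", "the"), ("N", "dog"), ("N", "cat"), ("V", "chased"),
]

def cyk_recognize(words):
    n = len(words)
    # table[i][j] = set of non-terminals that derive words[i:j]
    table = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):                      # fill length-1 spans
        for lhs, word in lexical_rules:
            if word == w:
                table[i][i + 1].add(lhs)
    for span in range(2, n + 1):                       # longer spans
        for i in range(0, n - span + 1):
            j = i + span
            for k in range(i + 1, j):                  # split point
                for lhs, b, c in binary_rules:
                    if b in table[i][k] and c in table[k][j]:
                        table[i][j].add(lhs)
    return "S" in table[0][n]

print(cyk_recognize("the dog chased the cat".split()))  # True
```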

### **UNIT IV: Semantics and Pragmatics**


1) What are the requirements for semantic representation in NLP?
2) Explain the role of First-Order Logic and Description Logics in semantic
representation.
3) What is Word Sense Disambiguation, and why is it important in NLP? Explain
these methods: a) Supervised b) Dictionary-based c) Thesaurus-based.
4) Describe different methods for WSD: supervised, dictionary-based, and
bootstrapping.
5) How does the distributional hypothesis relate to word similarity? (A worked
sketch follows this list.)
6) Explain the concept of compositional semantics and how it contributes to
understanding the meaning of complex expressions.
7) How do thematic roles and selectional restrictions influence semantic
interpretation?
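A minimal sketch for question 5: under the distributional hypothesis, words occurring in similar contexts get similar co-occurrence vectors, so cosine similarity between those vectors serves as a proxy for word similarity. The context words and counts below are toy assumptions.

```python
import math

# Toy co-occurrence vectors over the context words ["drink", "eat", "code"]
vectors = {
    "tea":    [8, 1, 0],
    "coffee": [7, 2, 1],
    "python": [0, 1, 9],
}

def cosine(u, v):
    """Cosine similarity between two count vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

print(cosine(vectors["tea"], vectors["coffee"]))   # high: similar contexts
print(cosine(vectors["tea"], vectors["python"]))   # low: different contexts
```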

### **UNIT V: Applications of NLP**


1) Explain the different approaches to machine translation: rule-based,
statistical, and neural.
2) What are the challenges faced by machine translation systems?
3) What is speech recognition, and how is it related to NLP?
4) Describe the main components of a speech recognition system and their NLP
integration.
5) How do natural language querying systems work?
6) Explain how NLP is applied in intelligent user interfaces and man-machine
interfaces.
7) How is NLP used in commercial applications such as virtual assistants,
content recommendation, and sentiment analysis?
8) Write short notes on the following: a) Dictionary and Thesaurus b) First-order
logic c) Tokenization d) Finite-State Automata e) Laplace Smoothing f)
Probabilistic context-free grammar g) Word sense
9) Describe the transfer model of machine translation and list its three phases.
10) Tutoring and authoring systems.

# THIS LIST DOES NOT YET CONTAIN ANY ASSIGNMENT QUESTIONS OR ANY PYQs OTHER
THAN THOSE FROM NOV 23 AND NOV 22
