Deep Learning - IIT Ropar - Unit 15 - Week 12
Assessment submitted.
Thank you for taking the Week 12: Assignment 12.
Week 12: Assignment 12
Your last recorded submission was on 2025-04-16, 22:36 IST. Due date: 2025-04-16, 23:59 IST.
1) What is the primary purpose of the attention mechanism in neural networks? (1 point)

To reduce the size of the input data
To increase the complexity of the model
To eliminate the need for recurrent connections
To focus on specific parts of the input sequence
2) Which of the following are the benefits of using attention mechanisms in neural networks? (1 point)

Improved handling of long-range dependencies
Enhanced interpretability of model predictions
Ability to handle variable-length input sequences
Reduction in model complexity
3) If we make the vocabulary for an encoder-decoder model using the given sentence, what will be the size of our vocabulary? (1 point)
Sentence: "Attention mechanisms dynamically identify critical input components, enhancing contextual understanding and boosting performance"

13
14
15
16
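A quick way to reason about this question is to count the unique tokens in the sentence. A minimal sketch follows; the tokenization choices (lowercasing, stripping punctuation, whitespace splitting) and the optional special tokens are assumptions, not the course's official scheme:

# Count unique tokens in the given sentence (sketch; tokenization is an assumption).
sentence = ("Attention mechanisms dynamically identify critical input "
            "components, enhancing contextual understanding and boosting "
            "performance")

# Assumption: lowercase, strip punctuation, split on whitespace.
vocab = {w.strip(",.").lower() for w in sentence.split()}
print(len(vocab))  # unique word count under these assumptions

# Assumption: encoder-decoder vocabularies often also reserve special
# tokens such as <sos> and <eos>, which would increase the count.
print(len(vocab | {"<sos>", "<eos>"}))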
5) Which of the following attention mechanisms is most commonly used in the Transformer model architecture? (1 point)

Additive attention
Dot product attention
Multiplicative attention
None of the above
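For reference, a minimal NumPy sketch of scaled dot-product attention as formulated in "Attention Is All You Need", i.e. softmax(QK^T / sqrt(d_k)) V; the toy shapes in the usage example are illustrative assumptions:

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Q: (n_q, d_k), K: (n_k, d_k), V: (n_k, d_v)
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # query-key similarities
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)              # row-wise softmax
    return w @ V                                    # weighted sum of values

rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(2, 4)), rng.normal(size=(3, 4)), rng.normal(size=(3, 8))
print(scaled_dot_product_attention(Q, K, V).shape)  # (2, 8)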
6) Which of the following is NOT a component of the attention mechanism? (1 point)

Decoder
Key
Value
Query
Encoder
7) In a hierarchical attention network, what are the two primary levels of attention? (1 point)

Character-level and word-level
Word-level and sentence-level
Sentence-level and document-level
Paragraph-level and document-level
8) Which of the following are the advantages of using attention mechanisms in encoder-decoder models? (1 point)

Reduced computational complexity
Ability to handle variable-length input sequences
Improved gradient flow during training
Automatic feature selection
Reduced memory requirements
9) In the encoder-decoder architecture with attention, where is the context vector typically computed? (1 point)

In the encoder
In the decoder
Between the encoder and decoder
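As background for this question, a minimal sketch of how Bahdanau-style (additive) attention forms the context vector as an attention-weighted sum of encoder hidden states, conditioned on the current decoder state. The additive score e_t = v^T tanh(W1 h_t + W2 s) is one common choice of scoring function, and the shapes below are illustrative assumptions:

import numpy as np

def context_vector(s, H, W1, W2, v):
    # s: current decoder state (d,); H: encoder hidden states (T, d)
    scores = np.tanh(H @ W1.T + s @ W2.T) @ v   # additive alignment scores (T,)
    w = np.exp(scores - scores.max())
    w /= w.sum()                                # softmax over source positions
    return w @ H                                # weighted sum of encoder states

rng = np.random.default_rng(0)
d, T, a = 6, 5, 4
W1, W2, v = rng.normal(size=(a, d)), rng.normal(size=(a, d)), rng.normal(size=a)
print(context_vector(rng.normal(size=d), rng.normal(size=(T, d)), W1, W2, v).shape)  # (6,)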