Deep Learning - IIT Ropar - Unit 15 - Week 12

The document details the Week 12 assignment for the Deep Learning course at IIT Ropar, focusing on the attention mechanism in neural networks. It includes questions on the encoder-decoder model, the benefits of attention mechanisms, and the components of the attention mechanism. The assignment was submitted on April 16, 2025, the same day it was due.

Week 12: Assignment 12
Your last recorded submission was on 2025-04-16, 22:36. Due date: 2025-04-16, 23:59 IST.
1) What is the primary purpose of the attention mechanism in neural networks? (1 point)

To reduce the size of the input data
To increase the complexity of the model
To eliminate the need for recurrent connections
To focus on specific parts of the input sequence
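As a rough illustration of the last option above, here is a minimal NumPy sketch (scores and sizes are hypothetical, not from the course): attention turns relevance scores over the input sequence into normalized weights, so the model concentrates on specific positions instead of treating all of them equally.

    import numpy as np

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    # Hypothetical relevance scores for a 4-token input sequence.
    scores = np.array([0.1, 2.5, 0.3, 0.2])
    weights = softmax(scores)         # sums to 1, peaks at token 2
    values = np.random.randn(4, 8)    # one 8-dim vector per input token
    context = weights @ values        # output dominated by the relevant token
    print(weights.round(2))           # roughly [0.07 0.77 0.09 0.08]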

2) Which of the following are the benefits of using attention mechanisms in neural networks? (1 point)

Improved handling of long-range dependencies
Enhanced interpretability of model predictions
Ability to handle variable-length input sequences
Reduction in model complexity
3) If we make the vocabulary for an encoder-decoder model using the given sentence, what will be the size of our vocabulary? (1 point)
Sentence: Attention mechanisms dynamically identify critical input components, enhancing contextual understanding and boosting performance

13
14
15
16
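A quick way to check the arithmetic in question 3 is to count the distinct word types in the sentence after stripping punctuation; this minimal sketch assumes a plain word-level vocabulary (whether special tokens such as <sos>/<eos> are also counted depends on the convention the question assumes).

    import string

    sentence = ("Attention mechanisms dynamically identify critical input "
                "components, enhancing contextual understanding and boosting "
                "performance")
    tokens = [w.strip(string.punctuation).lower() for w in sentence.split()]
    print(len(set(tokens)))   # 13 distinct word types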


4) We are performing the task of Machine Translation using an encoder-decoder model. Choose the equation representing the Encoder model. (1 point)

h_t = CNN(x_{it})
s_t = RNN(s_{t-1}, e(ŷ_{t-1}))
h_t = RNN(x_{it})
h_t = RNN(h_{t-1}, x_{it})
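For intuition on the recurrent update in the last option, a minimal sketch of an Elman-style encoder step (all weights and sizes here are hypothetical): the encoder reads one input token at a time and folds it into a running hidden state, h_t = RNN(h_{t-1}, x_{it}).

    import numpy as np

    d_in, d_h = 8, 16
    rng = np.random.default_rng(0)
    W = rng.normal(size=(d_h, d_in))   # input-to-hidden weights
    U = rng.normal(size=(d_h, d_h))    # hidden-to-hidden weights
    b = np.zeros(d_h)

    def rnn_step(h_prev, x_t):
        # h_t = tanh(W x_t + U h_{t-1} + b): mix current input with history
        return np.tanh(W @ x_t + U @ h_prev + b)

    h = np.zeros(d_h)                        # h_0
    for x_t in rng.normal(size=(5, d_in)):   # a 5-token input sequence
        h = rnn_step(h, x_t)                 # h_t = RNN(h_{t-1}, x_{it})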

5) Which of the following attention mechanisms is most commonly used in the Transformer model architecture? (1 point)

Additive attention
Dot product attention
Multiplicative attention
None of the above
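As context for question 5, a minimal sketch of (scaled) dot-product attention, the variant used in the Transformer; shapes here are hypothetical. Scores are dot products between queries and keys, scaled by sqrt(d_k), softmax-normalized, and used to mix the values.

    import numpy as np

    def dot_product_attention(Q, K, V):
        d_k = K.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)        # query-key dot products
        w = np.exp(scores - scores.max(axis=-1, keepdims=True))
        w /= w.sum(axis=-1, keepdims=True)     # row-wise softmax
        return w @ V                           # weighted sum of values

    rng = np.random.default_rng(1)
    Q, K, V = (rng.normal(size=(4, 8)) for _ in range(3))
    out = dot_product_attention(Q, K, V)       # shape (4, 8)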
6) Which of the following is NOT a component of the attention mechanism? (1 point)

Decoder
Key
Value
Query
Encoder
7) In a hierarchical attention network, what are the two primary levels of attention? (1 point)

Character-level and word-level
Word-level and sentence-level
Sentence-level and document-level
Paragraph-level and document-level
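A rough sketch of the two levels asked about in question 7 (word-level and sentence-level), with hypothetical shapes and learned context vectors: word-level attention pools word vectors into one vector per sentence, and sentence-level attention pools those into a document vector.

    import numpy as np

    def attend(H, u):
        # H: (n, d) vectors; u: (d,) learned context vector (hypothetical)
        scores = H @ u
        w = np.exp(scores - scores.max()); w /= w.sum()
        return w @ H                       # attention-weighted summary

    rng = np.random.default_rng(2)
    d = 8
    u_word, u_sent = rng.normal(size=d), rng.normal(size=d)
    doc = [rng.normal(size=(n, d)) for n in (5, 7, 4)]      # 3 sentences

    sent_vecs = np.stack([attend(s, u_word) for s in doc])  # word level
    doc_vec = attend(sent_vecs, u_sent)                     # sentence level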
8) Which of the following are the advantages of using attention mechanisms in encoder-decoder models? (1 point)

Reduced computational complexity
Ability to handle variable-length input sequences
Improved gradient flow during training
Automatic feature selection
Reduced memory requirements
9) In the encoder-decoder architecture with attention, where is the context vector typically computed? (1 point)

In the encoder
In the decoder
Between the encoder and decoder
After the decoder
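For question 9, a sketch of where the context vector sits: it is computed from the encoder's hidden states together with the decoder's previous state, i.e. at the interface between encoder and decoder. Additive (Bahdanau-style) scoring is used here as one common choice; all weights are hypothetical.

    import numpy as np

    def context_vector(H_enc, s_prev, W_a, U_a, v_a):
        # e_j = v_a . tanh(W_a s_{t-1} + U_a h_j): score each encoder state
        e = np.tanh(s_prev @ W_a.T + H_enc @ U_a.T) @ v_a
        alpha = np.exp(e - e.max()); alpha /= alpha.sum()
        return alpha @ H_enc               # c_t, consumed by the decoder step

    rng = np.random.default_rng(3)
    d = 8
    H_enc = rng.normal(size=(6, d))        # encoder states h_1..h_6
    s_prev = rng.normal(size=d)            # decoder state s_{t-1}
    W_a, U_a = rng.normal(size=(d, d)), rng.normal(size=(d, d))
    v_a = rng.normal(size=d)
    c_t = context_vector(H_enc, s_prev, W_a, U_a, v_a)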


10) Which of the following output functions is most commonly used in the decoder of an encoder-decoder model for translation tasks? (1 point)

Softmax
Sigmoid
ReLU
Tanh
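And for question 10, a minimal sketch of the decoder's output layer in a translation model: project the decoder state onto the vocabulary and apply softmax to get a probability distribution over next words (sizes hypothetical).

    import numpy as np

    vocab_size, d = 10000, 8
    rng = np.random.default_rng(4)
    W_out = rng.normal(size=(vocab_size, d))
    b_out = np.zeros(vocab_size)

    s_t = rng.normal(size=d)                      # current decoder state
    logits = W_out @ s_t + b_out
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()                          # softmax over the vocabulary
    next_word = int(probs.argmax())               # greedy choice of next token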
You may submit any number of times before the due date. The final submission will be considered for grading.

Submit Answers
