Attention Mechanism

The document is a quiz on the attention mechanism in machine learning, where the user scored 85%, surpassing the passing score of 75%. It covers key concepts such as the steps of the attention mechanism, its advantages over traditional models, and the purpose of attention weights. The quiz includes multiple-choice questions with feedback on the correctness of the answers.

Attention Mechanism: Quiz

Your score: 85% | Passing score: 75%
Congratulations! You passed this assessment.
1. What are the two main steps of the attention mechanism?
Calculating the context vector and generating the attention weights
Calculating the context vector and generating the output word
✓ Calculating the attention weights and generating the context vector (your answer)
Calculating the attention weights and generating the output word

That's correct!
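To make those two steps concrete, here is a minimal sketch in NumPy, assuming simple dot-product scoring; the function name and toy shapes are illustrative and not part of the quiz material.

```python
import numpy as np

def attention(decoder_state, encoder_states):
    # Step 1: calculate the attention weights.
    # Score each encoder state against the current decoder state
    # (dot-product scoring assumed here), then softmax the scores into
    # weights that are non-negative and sum to 1.
    scores = encoder_states @ decoder_state        # shape: (seq_len,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()

    # Step 2: generate the context vector as the weighted sum
    # of the encoder states.
    context = weights @ encoder_states             # shape: (hidden_dim,)
    return weights, context

# Toy example: 4 input positions, hidden size 3.
encoder_states = np.random.rand(4, 3)
decoder_state = np.random.rand(3)
weights, context = attention(decoder_state, encoder_states)
print(weights.round(3), context.round(3))
```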

2. What is the name of the machine learning technique that allows a neural network to focus on specific parts of an input sequence?
Convolutional neural network (CNN)
✗ Encoder-decoder (your answer)
Attention mechanism
Long Short-Term Memory (LSTM)

That's incorrect, please revisit the content.

3. What is the advantage of using the attention mechanism over a traditional sequence-to-sequence model?
The attention mechanism lets the model formulate parallel outputs.
The attention mechanism lets the model learn only short-term dependencies.
✓ The attention mechanism lets the model focus on specific parts of the input sequence. (your answer)
The attention mechanism reduces the computation time of prediction.

That's correct!

4. How does an attention model differ from a traditional model?
The traditional model uses the input embedding directly in the decoder to get more context.
The decoder does not use any additional information.
✓ Attention models pass a lot more information to the decoder. (your answer)
The decoder only uses the final hidden state from the encoder.

That's correct!
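As a rough illustration of that difference (the array shapes below are made up), compare what each decoder is given to work with:

```python
import numpy as np

encoder_states = np.random.rand(6, 8)   # 6 input tokens, hidden size 8 (toy values)

# Traditional encoder-decoder: the decoder starts from the final encoder
# hidden state only -- a single fixed-size summary of the whole input.
traditional_decoder_input = encoder_states[-1]   # shape: (8,)

# Attention model: at every decoding step the decoder can look at all of
# the encoder hidden states and re-weight them, so much more information
# reaches the decoder than a single vector.
attention_decoder_input = encoder_states         # shape: (6, 8)
```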

5. What is the name of the machine learning architecture that can be used to translate text from one language to another?
Convolutional neural network (CNN)
Neural network
✓ Encoder-decoder (your answer)
Long Short-Term Memory (LSTM)

That's correct!
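The encoder-decoder data flow can be sketched roughly as below; the encode_step and decode_step functions are made-up stand-ins for real RNN cells, so treat this as a shape-level illustration rather than a working translator.

```python
import numpy as np

HIDDEN = 8

def encode_step(token_vec, state):
    # Stand-in for an RNN cell: folds the next source token into the state.
    return np.tanh(token_vec + state)

def decode_step(state, summary):
    # Stand-in for the decoder cell: updates its state from the summary.
    return np.tanh(state + summary)

source = np.random.rand(5, HIDDEN)      # 5 embedded source-language tokens (toy)

# Encoder: read the source sequence, producing one hidden state per token.
state = np.zeros(HIDDEN)
encoder_states = []
for token in source:
    state = encode_step(token, state)
    encoder_states.append(state)

# Decoder: start from the encoder's final state and unroll a few steps;
# a real translator would map each decoder state to a target-language word.
decoder_state = encoder_states[-1]
for _ in range(4):
    decoder_state = decode_step(decoder_state, encoder_states[-1])
```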

6. What is the purpose of the attention weights?
To incrementally apply noise to the input data.
To calculate the context vector by averaging word embeddings in the context.
✓ To assign weights to different parts of the input sequence, with the most important parts receiving the highest weights. (your answer)
To generate the output word based on the input data alone.

That's correct!
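A tiny numeric example (the scores are invented) shows how the softmax turns alignment scores into weights, with the most relevant input word receiving the largest weight:

```python
import numpy as np

scores = np.array([0.1, 2.0, 0.3])               # alignment scores for 3 input words
weights = np.exp(scores) / np.exp(scores).sum()  # softmax
print(weights.round(2))  # ~[0.11 0.75 0.14] -- the second word dominates the context vector
```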

7. What is the advantage of using the attention mechanism over a traditional recurrent neural network (RNN) encoder-decoder?
The attention mechanism requires fewer CPU threads than a traditional RNN encoder-decoder.
The attention mechanism is faster than a traditional RNN encoder-decoder.
✓ The attention mechanism lets the decoder focus on specific parts of the input sequence, which can improve the accuracy of the translation. (your answer)
The attention mechanism is more cost-effective than a traditional RNN encoder-decoder.

That's correct!
