50MCQ_Lecture1[1]
11. What is the main advantage of word embeddings over one-hot vectors?
a) They are easier to compute
b) They capture semantic relationships between words
c) They require less memory
d) They are faster to train
Answer: b) They capture semantic relationships between words
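A minimal sketch of why embeddings beat one-hot vectors: any two distinct one-hot vectors are orthogonal, so their cosine similarity is always zero, while trained embeddings place related words close together. The 2-d vectors below are hand-picked toy values for illustration, not the output of any trained model.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# One-hot vectors: every pair of distinct words is orthogonal,
# so similarity carries no semantic information.
king_oh, queen_oh = [1, 0, 0], [0, 1, 0]
print(cosine(king_oh, queen_oh))  # 0.0

# Toy 2-d "embeddings" (hand-picked, hypothetical values):
emb = {
    "king":  [0.9, 0.8],
    "queen": [0.85, 0.75],
    "apple": [0.1, -0.6],
}
print(cosine(emb["king"], emb["queen"]))  # high: related words end up nearby
print(cosine(emb["king"], emb["apple"]))  # low: unrelated words are far apart
```

The same orthogonality argument explains answer b): one-hot similarity is degenerate, so all semantic structure must come from the learned embedding space.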
29. What is the main purpose of the cross-entropy loss function in CBOW?
a) To measure the difference between predicted and actual center words
b) To normalize the input data
c) To reduce the dimensionality of the word embeddings
d) To calculate the accuracy of the model
Answer: a) To measure the difference between predicted and actual center words
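A small worked sketch of the cross-entropy idea in CBOW: the model's output scores (logits) over the vocabulary are turned into a probability distribution with softmax, and the loss is the negative log-probability assigned to the actual center word. The logit values below are illustrative, not from a trained model.

```python
import math

def softmax(logits):
    """Convert raw vocabulary scores into a probability distribution."""
    m = max(logits)                       # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def cross_entropy(logits, target):
    """Negative log-probability assigned to the actual center word."""
    return -math.log(softmax(logits)[target])

# Vocabulary of 4 words; the actual center word has index 2.
confident = [0.1, 0.2, 5.0, 0.3]   # model puts most mass on index 2
uncertain = [1.0, 1.0, 1.0, 1.0]   # uniform prediction

print(cross_entropy(confident, 2))  # small loss: prediction matches target
print(cross_entropy(uncertain, 2))  # larger loss: -log(1/4) = log 4
```

This shows why answer a) is correct: the loss is small exactly when the predicted distribution concentrates on the actual center word, and large otherwise.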
31. Which of the following is a key challenge in using word embeddings for machine translation?
a) Handling out-of-vocabulary words
b) Capturing rare words
c) Aligning words in different languages
d) All of the above
Answer: d) All of the above
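One common mitigation for the out-of-vocabulary challenge from option a): reserve an "<unk>" entry in the embedding table and fall back to it when a word has no trained vector. The table below is a hypothetical toy example; real systems often use subword units instead.

```python
# Hypothetical toy embedding table with a reserved "<unk>" entry.
embeddings = {
    "<unk>": [0.0, 0.0, 0.0],
    "cat":   [0.3, 0.1, -0.2],
    "dog":   [0.25, 0.15, -0.1],
}

def lookup(word, table):
    """Return the word's vector, falling back to the <unk> vector for OOV words."""
    return table.get(word, table["<unk>"])

print(lookup("cat", embeddings))     # trained vector
print(lookup("quokka", embeddings))  # OOV word -> <unk> vector
```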
32. What is the main purpose of the ReLU activation function in CBOW?
a) To introduce non-linearity into the model
b) To calculate the loss function
c) To normalize the input data
d) To reduce the dimensionality of the word embeddings
Answer: a) To introduce non-linearity into the model
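A minimal sketch of ReLU as used in the lecture's CBOW variant: it passes positive activations through unchanged and zeroes out negatives, which is what makes the hidden layer non-linear (a stack of purely linear layers would collapse into one linear map).

```python
def relu(x):
    """Rectified Linear Unit: max(0, x)."""
    return max(0.0, x)

# Applied elementwise to a hypothetical hidden-layer activation vector:
hidden = [-1.5, 0.0, 2.3, -0.2]
print([relu(h) for h in hidden])  # [0.0, 0.0, 2.3, 0.0]
```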