OBJECTIVE SET - 2 With Answers
III B.TECH II SEMESTER (R18) CSE & CSD I MID EXAMINATIONS, MAY - 2023
MACHINE LEARNING
OBJECTIVE EXAM
NAME_____________________________ HALL TICKET NO A
Answer all the questions. All questions carry equal marks. Time: 20 min. Marks: 10.
I. Choose the correct alternative:
1. Feature can be used as a ________________ (CO2,k5) [ ]
A. predictor B. binary split C. All of the above D. None of the above
2. The effectiveness of an SVM depends upon ________________ (CO2,k6) [ ]
A. kernel parameters B. selection of kernel C. soft margin parameter D. All of the above
3. Which of the following evaluation metrics cannot be applied to logistic regression output to compare with the target? (CO3,k1) [ ]
A. accuracy B. auc-roc C. logloss D. mean-squared-error
4. A measurable property or parameter of the data-set is ______________ (CO1,k4) [ ]
A. training data B. test data C. feature D. validation data
5. Which of the following can only be used when training data are linearly separable? (CO3,k5) [ ]
A. linear logistic regression B. linear soft margin SVM C. linear hard-margin SVM D. the centroid method
6. Impact of high variance on the training set? (CO3,k5) [ ]
A. under fitting B. over fitting C. both under fitting & over fitting D. depends upon the dataset
9. Which of the following is a recommended way to choose the number of clusters k? [ ]
A. Choose k to be the smallest value so that at least 99% of the variance is retained
B. Use the elbow method
C. Choose k to be 99% of m (k = 0.99*m, rounded to the nearest integer)
D. Choose k to be the largest value so that 99% of the variance is retained
II. Fill in the blanks:
11. Missing data items are __________________ with Bayes classifier. (CO3,k3)
14. _________ algorithm finds the most specific hypothesis that fits all the positive examples. (CO1,k2)
15. ______________ learning is a logical approach to machine learning. (CO1,k4)
16. _____________ is basically the learning task of the machine. (CO1,k6)
17. ____________ is defined as the supposition or proposed explanation based on insufficient evidence. (CO2,k4)
18. __________ theory predictions are used to predict generative learning algorithms. (CO2,k5)
19. _________________ is considered a latent variable model to find the local maximum likelihood parameters of a statistical model. (CO3,k3)
20. Gibbs algorithm is at most _______________ error of the Bayes optimal classifier. (CO3,k4)
--ooOoo--
Multiple Choice:
6. B. over fitting (High variance leads to models that fit the training data too closely but generalize poorly; see the polynomial-fit sketch after this key.)
7. D. None of the above (Machine learning has a long history with many contributors.)
9. B. Use the elbow method (The elbow method visually helps choose the number of clusters; see the elbow sketch after this key.)
Fill in the Blanks:
11. imputed (Missing data items in Bayes classifiers are typically imputed, i.e., filled in with values estimated from the available data.)
13. Support Vector Machine (SVM) (SVM is a supervised learning algorithm used for classification and regression tasks.)
15. Symbolic learning (Symbolic learning represents knowledge using symbols and rules.)
16. Concept learning (Concept learning is the core task of inferring a general concept from specific training examples.)
20. twice the (The expected error of the Gibbs algorithm is at most twice the error of the Bayes optimal classifier; see the bound written out after this key.)
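
For illustration, a minimal sketch of the high-variance behaviour described in answer 6, assuming NumPy is available; the sine-curve data and the polynomial degrees are illustrative choices, not part of the exam material:

```python
# Minimal sketch of high variance (overfitting): a high-degree polynomial
# interpolates a small noisy training set almost exactly (train MSE near 0)
# but generalizes poorly to unseen points (test MSE grows).
import numpy as np

rng = np.random.default_rng(0)
x_train = np.linspace(0, 1, 10)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.2, x_train.size)
x_test = np.linspace(0, 1, 100)
y_test = np.sin(2 * np.pi * x_test)

for degree in (1, 3, 9):
    coeffs = np.polyfit(x_train, y_train, degree)  # fit on the training data only
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree={degree}  train MSE={train_mse:.4f}  test MSE={test_mse:.4f}")
```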
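Similarly, a minimal sketch of the elbow method from answer 9, assuming scikit-learn is available; the synthetic blob data and the range of k tried are illustrative choices:

```python
# Minimal sketch of the elbow method: compute the within-cluster sum of
# squares (KMeans inertia) for several k and look for the "elbow" where
# the curve flattens out.
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

# Synthetic 2-D data with 3 underlying clusters (illustrative choice).
X, _ = make_blobs(n_samples=300, centers=3, random_state=0)

for k in range(1, 8):
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)
    # Inertia drops sharply up to the elbow, then flattens.
    print(f"k={k}  inertia={km.inertia_:.1f}")
```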
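Finally, the bound behind answer 20 (Mitchell's result for the Gibbs algorithm) written out; the expectation is taken over target concepts drawn from the prior the learner assumes, and the notation here is a sketch:

```latex
% Expected error of the Gibbs algorithm is at most twice that of the
% Bayes optimal classifier, when target concepts are drawn from the
% same prior distribution the learner assumes.
\[
  \mathbb{E}\big[\operatorname{error}_{\mathcal{D}}(\text{Gibbs})\big]
  \;\le\;
  2\,\mathbb{E}\big[\operatorname{error}_{\mathcal{D}}(\text{Bayes optimal})\big]
\]
```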