

Mid-exam - Midterm exam of Applied machine learning

Applied Machine Learning (Stevens Institute of Technology)


CPE 695 A/WS: Applied Machine Learning Midterm Exam Spring 2021

Name: ________________________ Stevens ID#: _________________________

Question 1 (40 points):


Answer the following questions:
1) What is the bias-variance trade-off? How can bias and variance each be addressed?

2) What are the pros and cons of batch gradient descent and stochastic gradient descent, respectively?

3) What is the learning rate? Explain why the learning rate cannot be too large or too small.

4) What is the sigmoid function? Please sketch its shape. Explain why the initial parameters should be small when training artificial neural networks with the sigmoid function.
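For reference on parts 3) and 4), here is a minimal Python sketch (assuming NumPy is available). The toy objective f(w) = w^2 is a hypothetical example used only to show how the step size affects convergence, and the printed sigmoid values show how the function saturates for large |z|, which is why small initial weights keep sigmoid units out of the flat, vanishing-gradient regions.

    import numpy as np

    def sigmoid(z):
        # Squashes any real input into (0, 1); saturates for large |z|,
        # where its derivative sigmoid(z) * (1 - sigmoid(z)) is nearly zero.
        return 1.0 / (1.0 + np.exp(-z))

    print(sigmoid(np.array([-10.0, -1.0, 0.0, 1.0, 10.0])))

    # Effect of the learning rate on gradient descent for f(w) = w^2 (gradient 2w):
    for lr in (0.01, 0.1, 1.1):              # too small, reasonable, too large
        w = 5.0
        for _ in range(50):
            w -= lr * 2.0 * w                # gradient descent update
        print(f"learning rate {lr}: w after 50 steps = {w:.4f}")

With the small rate the weight barely moves, the moderate rate converges toward the minimum at w = 0, and the large rate makes the updates overshoot and diverge.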


Question 2 (15 points):


Assume a classifier produces the following confusion matrix. Please compute its
1) precision,
2) recall, and
3) F1-score.
                           Predicted results
                           Class 1    Class 2
Actual values   Class 1       70         10
                Class 2       30         70
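As a reference for how these metrics are defined, here is a minimal Python sketch; it assumes Class 1 is treated as the positive class, which the question leaves implicit.

    # Confusion matrix from the question: rows are actual classes, columns are
    # predicted classes. Assumption: Class 1 is the positive class.
    tp, fn = 70, 10      # actual Class 1 predicted as Class 1 / as Class 2
    fp, tn = 30, 70      # actual Class 2 predicted as Class 1 / as Class 2

    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)

    print(f"precision = {precision:.3f}, recall = {recall:.3f}, F1 = {f1:.3f}")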


Question 3 (35 points):


Consider the following set of training examples:

Sky       Temperature   Wind     EnjoySport
Cloudy    High          Strong   Yes
Cloudy    Low           Mild     No
Sunny     Low           Mild     Yes
Sunny     Low           Strong   No

1) Draw a decision tree for the training examples, using information gain to select attributes.
Please include detailed steps in your solution.

2) Can the tree in 1) be further pruned without reducing its accuracy? If yes, please draw the
pruned tree. If no, explain why.
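For reference, here is a small Python sketch of the root-level information-gain calculation over these four examples (a sketch of the standard entropy-based computation, not a full tree-building implementation).

    from math import log2
    from collections import Counter

    # Training examples from the question: (Sky, Temperature, Wind, EnjoySport)
    data = [
        ("Cloudy", "High", "Strong", "Yes"),
        ("Cloudy", "Low",  "Mild",   "No"),
        ("Sunny",  "Low",  "Mild",   "Yes"),
        ("Sunny",  "Low",  "Strong", "No"),
    ]

    def entropy(labels):
        # Shannon entropy of the class-label distribution
        counts = Counter(labels)
        total = len(labels)
        return -sum((c / total) * log2(c / total) for c in counts.values())

    def information_gain(rows, attr_index):
        # Gain(S, A) = Entropy(S) - sum over values v of |S_v|/|S| * Entropy(S_v)
        base = entropy([r[-1] for r in rows])
        remainder = 0.0
        for value in set(r[attr_index] for r in rows):
            subset = [r[-1] for r in rows if r[attr_index] == value]
            remainder += len(subset) / len(rows) * entropy(subset)
        return base - remainder

    for i, name in enumerate(["Sky", "Temperature", "Wind"]):
        print(name, round(information_gain(data, i), 3))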


Question 4 (15 points): Design a two-input perceptron that implements the Boolean function
¬𝐴 ∧ 𝐵.
[Hint: Lecture 7 pages 12-13 give examples for AND (∧) and OR (∨) gates; you can use a similar
method here, i.e., first draw the truth table, then list the equations and solve them.]
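As a sanity check, here is a short Python sketch that verifies one candidate solution; the weights w_A = -1, w_B = 1 and threshold 0.5 are only one assumed assignment that satisfies the truth table, and other choices work as well.

    # One possible perceptron for NOT(A) AND B: w_A = -1, w_B = 1, threshold = 0.5
    # (equivalently, a bias of -0.5). These values are one assumption among many.
    w_a, w_b, threshold = -1.0, 1.0, 0.5

    def perceptron(a, b):
        # Output 1 when the weighted sum exceeds the threshold, else 0.
        return 1 if w_a * a + w_b * b > threshold else 0

    for a in (0, 1):
        for b in (0, 1):
            print(f"A={a} B={b} -> {perceptron(a, b)}")   # matches NOT(A) AND B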

Question 5 (30 points):


Consider the training error and test error observed as we train a neural network using full
(i.e., batch) gradient descent.
1) Is there overfitting with the trained model? If yes, list several possible causes of
overfitting.
2) List several possible techniques that can help reduce overfitting.
3) Choose one of the possible techniques in 2). Draw the possible new curves for training
error and test error, respectively, after applying this technique. Briefly explain why we may
observe the new curves.
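As an illustration of one technique that can be named in 2), here is a minimal, self-contained Python sketch of early stopping; the validation-error values are made-up numbers standing in for errors that would normally be measured on a held-out set after each epoch.

    # Early stopping: halt training once the validation error stops improving.
    val_errors = [0.90, 0.70, 0.55, 0.48, 0.45, 0.44, 0.46, 0.49, 0.53, 0.58]

    patience = 2                 # epochs to wait after the last improvement
    best_err, best_epoch, wait = float("inf"), 0, 0

    for epoch, err in enumerate(val_errors):
        if err < best_err:
            best_err, best_epoch, wait = err, epoch, 0   # a checkpoint would be saved here
        else:
            wait += 1
            if wait >= patience:
                print(f"stop at epoch {epoch}; best epoch {best_epoch}, "
                      f"validation error {best_err:.2f}")
                break

Stopping near the best validation epoch keeps the model from continuing to fit the training set after the test error has started to rise, which is why the test-error curve flattens instead of climbing.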


Question 6 (25 points):

In ensemble learning, there are several popular fusion methods for classifiers that output class
labels, e.g., majority vote, weighted majority vote, and naïve Bayes. Assume we have 3
classifiers, whose results for sample x are given in Table 1. The confusion matrix of each classifier is given
in Table 2. Please give the final decision using each of the following fusion methods:
1) Majority vote
2) Weighted majority vote, assuming the weights are
Classifier 1: 0.3
Classifier 2: 0.4
Classifier 3: 0.3
3) Naïve Bayes method

Table 1 Results of each classifier


Sample x        Result
Classifier 1    Class 1
Classifier 2    Class 2
Classifier 3    Class 1

Table 2 Confusion matrix of each classifier

i) Classifier 1
            Class 1   Class 2
Class 1        70        10
Class 2        30        50

ii) Classifier 2
            Class 1   Class 2
Class 1        80        30
Class 2        20        30

iii) Classifier 3
            Class 1   Class 2
Class 1        60        10
Class 2        40        40
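For the first two fusion rules, here is a minimal Python sketch using the predictions in Table 1 and the weights given in the question; the naïve Bayes combination, which additionally needs conditional probabilities estimated from Table 2, is left out of this sketch.

    from collections import defaultdict

    # Predictions for sample x (Table 1) and the weights from the question.
    predictions = {"Classifier 1": "Class 1",
                   "Classifier 2": "Class 2",
                   "Classifier 3": "Class 1"}
    weights = {"Classifier 1": 0.3, "Classifier 2": 0.4, "Classifier 3": 0.3}

    # 1) Majority vote: every classifier contributes one vote.
    votes = defaultdict(float)
    for clf, label in predictions.items():
        votes[label] += 1
    print("majority vote:", max(votes, key=votes.get))

    # 2) Weighted majority vote: each vote is scaled by the classifier's weight.
    weighted_votes = defaultdict(float)
    for clf, label in predictions.items():
        weighted_votes[label] += weights[clf]
    print("weighted majority vote:", max(weighted_votes, key=weighted_votes.get))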
