03 Classification Handout
University of Toronto
We will first look at binary problems, and discuss multi-class problems later in class.
A one-dimensional example (input x is 1-dimensional)
The colors indicate labels (a blue plus denotes that t^(i) is from the first class, a red circle that t^(i) is from the second class)
[Figure: a 1D linear classifier. The line w0 + wᵀx is plotted against x; the decision boundary is where w0 + wᵀx = 0, with ŷ = +1 on one side and ŷ = −1 on the other.]
y(x) = sign(w0 + wᵀx)
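To make the decision rule concrete, here is a minimal Python sketch (not from the handout): the data points and the weights w, w0 are made-up values chosen by hand for illustration, whereas in practice w and w0 would be learned from the training set.

```python
import numpy as np

def predict(X, w, w0):
    # Linear binary classifier: y(x) = sign(w0 + w^T x), returning +1 or -1 per row
    return np.sign(w0 + X @ w)

# Hypothetical 1D data: first class (t = +1) clustered left, second class (t = -1) right
X = np.array([[-1.5], [-1.0], [-0.5], [1.5], [2.0], [2.5]])
t = np.array([+1, +1, +1, -1, -1, -1])

# Hand-picked parameters for illustration only; normally these are fit to the data
w, w0 = np.array([-1.0]), 0.5

y = predict(X, w, w0)
print(y)                 # predicted labels in {-1, +1}
print(np.mean(y == t))   # fraction of training points classified correctly
```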
Absolute Error
L_absolute(y(x), t) = |t − y(x)|
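As a small numeric sketch (illustrative values only, not from the handout): with targets and predictions both in {−1, +1}, each mistake contributes |t − y(x)| = 2 and each correct prediction contributes 0, so the average absolute error is twice the misclassification rate.

```python
import numpy as np

def absolute_loss(y_pred, t):
    # L_absolute(y(x), t) = |t - y(x)|, averaged over the examples
    return np.mean(np.abs(t - y_pred))

t      = np.array([+1, +1, -1, -1])   # true labels
y_pred = np.array([+1, -1, -1, +1])   # two of the four predictions are wrong
print(absolute_loss(y_pred, t))       # 1.0 = 2 * (2 mistakes / 4 examples)
```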
More Complex Loss Functions
What if the movie predictions are used for ranking? Then the predicted ratings themselves don't matter; only the order they imply does.
In what order does Alice prefer E.T., Amelie and Titanic?
Possibilities (sketched in code below):
- 0-1 loss on the winner
- Permutation distance
- Accuracy of the top K movies
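Here is a hedged Python sketch of these three options (the helper names, the example rankings, and the reading of "permutation distance" as the number of pairwise disagreements, i.e. the Kendall tau distance, are assumptions for illustration, not definitions from the handout):

```python
def zero_one_winner_loss(pred_order, true_order):
    # 0-1 loss on the winner: 0 if the top-ranked movie matches, else 1
    return 0 if pred_order[0] == true_order[0] else 1

def permutation_distance(pred_order, true_order):
    # Count of movie pairs ordered differently by the two rankings (Kendall tau distance)
    pos = {movie: i for i, movie in enumerate(pred_order)}
    n = len(true_order)
    return sum(1 for i in range(n) for j in range(i + 1, n)
               if pos[true_order[i]] > pos[true_order[j]])

def top_k_accuracy(pred_order, true_order, k):
    # Fraction of the true top-k movies that also appear in the predicted top-k
    return len(set(pred_order[:k]) & set(true_order[:k])) / k

# Hypothetical example: Alice's true preference order vs. a model's predicted order
true_order = ["Amelie", "E.T.", "Titanic"]
pred_order = ["E.T.", "Amelie", "Titanic"]

print(zero_one_winner_loss(pred_order, true_order))   # 1   (wrong winner)
print(permutation_distance(pred_order, true_order))   # 1   (one swapped pair)
print(top_k_accuracy(pred_order, true_order, k=2))    # 1.0 (same top-2 set)
```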
Should we make the model complex enough to have perfect separation in the
training data?