Linear Classifiers: Prediction Equations
Michael (Mike) Gelbart
Linear Classifiers in Python
import numpy as np

x = np.arange(3)
x
array([0, 1, 2])
y = np.arange(3, 6)
y
array([3, 4, 5])
x * y
array([0, 4, 10])
np.sum(x * y)
14
x @ y
14

x @ y computes the dot product of x and y; it is equivalent to np.sum(x * y).
from sklearn.linear_model import LogisticRegression

lr = LogisticRegression()
lr.fit(X, y)
lr.predict(X)[10]
0
lr.predict(X)[20]
1
lr.decision_function(X)[10]
array([-33.78572166])
lr.decision_function(X)[20]
array([ 0.08050621])

decision_function returns the raw model output: it is negative for example 10 (predicted class 0) and positive for example 20 (predicted class 1).
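The raw model output is coefficients ⋅ features + intercept, and the prediction is its sign. Below is a minimal sketch of checking this by hand; X and y are not defined in this excerpt, so scikit-learn's breast cancer dataset stands in for them (an assumption), and the printed numbers will differ from the output above.

from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression

# Stand-in data (assumption): the excerpt does not define X and y
X, y = load_breast_cancer(return_X_y=True)

lr = LogisticRegression(max_iter=10_000)
lr.fit(X, y)

# Raw model output for example 10: coefficients . features + intercept
raw = X[10] @ lr.coef_[0] + lr.intercept_[0]

# decision_function returns this same raw model output
print(raw, lr.decision_function(X[10:11])[0])

# The predicted class is the sign of the raw output (for 0/1 labels)
print(lr.predict(X[10:11])[0], int(raw > 0))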
Michael Gelbart, Instructor, The University of British Columbia
Least squares: the squared loss
scikit-learn's LinearRegression minimizes a loss:
$$\sum_{i=1}^{n} \bigl(\text{true $i$th target value} - \text{predicted $i$th target value}\bigr)^2$$
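As a quick numeric illustration (not from the slides; the values are made up), the squared loss can be computed directly:

import numpy as np

# Made-up true targets and predictions for a regression problem
y_true = np.array([3.0, -0.5, 2.0, 7.0])
y_pred = np.array([2.5, 0.0, 2.0, 8.0])

# Squared loss: sum over examples of (true - predicted)^2
print(np.sum((y_true - y_pred) ** 2))  # 0.25 + 0.25 + 0.0 + 1.0 = 1.5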
The squared loss is not appropriate for classification; a more natural loss there is the number of errors. This is the 0-1 loss: it's 0 for a correct prediction and 1 for an incorrect prediction. Unlike the squared loss, the 0-1 loss is hard to minimize directly.
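A similar made-up illustration of the 0-1 loss as an error count:

import numpy as np

# Made-up true labels and predicted labels for a classifier
y_true = np.array([0, 1, 1, 0, 1])
y_pred = np.array([0, 1, 0, 0, 0])

# 0-1 loss: 1 for each incorrect prediction, 0 for each correct one
print(np.sum(y_true != y_pred))  # 2 errors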
scipy.optimize.minimize finds the input that minimizes a given function, starting from an initial guess; minimizing np.square returns a value at (or numerically very close to) 0:

from scipy.optimize import minimize
minimize(np.square, 0).x
array([0.])
minimize(np.square, 2).x
array([-1.88846401e-08])
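Connecting the two ideas, minimize can also fit the squared loss over a model's coefficients. This is a sketch with made-up data and names (squared_loss, true_w), not the course's code; it assumes no intercept so the result can be compared directly to LinearRegression(fit_intercept=False).

import numpy as np
from scipy.optimize import minimize
from sklearn.linear_model import LinearRegression

# Made-up data: 100 examples, 3 features, known coefficients plus noise
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=100)

def squared_loss(w):
    # Sum over examples of (true target - predicted target)^2
    return np.sum((y - X @ w) ** 2)

# Minimize the squared loss with respect to the coefficients
w_opt = minimize(squared_loss, np.zeros(3)).x

# Should closely match scikit-learn's LinearRegression coefficients
print(w_opt)
print(LinearRegression(fit_intercept=False).fit(X, y).coef_)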