Chapter 8
In regression, the dependent variable Y is modelled as a function of the independent variable(s) X: Y = f(X).
COMMON REGRESSION ALGORITHMS
• The most common regression algorithms are:
• Simple linear regression
• Multiple linear regression
• Polynomial regression
• Multivariate adaptive regression splines
• Logistic regression
• Maximum likelihood estimation (least squares)
Simple Linear Regression
• Simple linear regression involves only one predictor variable.
• The model assumes a linear relationship between the dependent variable and the predictor variable.
Slope of the simple linear regression model
• The slope is the ratio ΔY/ΔX, so it is negative whenever ΔY and ΔX have opposite signs.
• Scenario 1 for negative slope: ΔY is positive and ΔX is negative.
• Scenario 2 for negative slope: ΔY is negative and ΔX is positive.
• Curvilinear negative slope: the response decreases as X increases, but along a curve rather than a straight line.
Example of a fitted simple linear regression model: ŷ = 1.89395X + 19.0473
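A minimal sketch (not part of the original slides) of how a simple linear regression model such as the one above can be fitted with scikit-learn; the data values below are illustrative, not the dataset behind the quoted equation.

# Simple linear regression with scikit-learn (illustrative data)
import numpy as np
from sklearn.linear_model import LinearRegression

X = np.array([[1], [2], [3], [4], [5]])        # single predictor, shape (n_samples, 1)
y = np.array([21.0, 22.8, 24.7, 26.6, 28.5])   # response values

model = LinearRegression()
model.fit(X, y)

print("slope (b1):", model.coef_[0])        # estimated slope
print("intercept (b0):", model.intercept_)  # estimated intercept
print("prediction at X = 6:", model.predict([[6]])[0])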
Multiple Linear Regression
• In a multiple regression model, two or more independent variables (predictors) are involved.
• Ŷ = a + b1X1 + b2X2 (two predictor variables, namely X1 and X2)
• The model describes a plane in the three-dimensional space of Ŷ, X1, and X2.
• a: intercept of this plane.
• b1 and b2: partial regression coefficients.
• Parameter b1 represents the change in the mean response corresponding to a unit change in X1 when X2 is held constant.
• Parameter b2 represents the change in the mean response corresponding to a unit change in X2 when X1 is held constant.
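A minimal sketch (assumed, not from the original slides) of fitting the two-predictor plane Ŷ = a + b1X1 + b2X2 with scikit-learn; the observations are illustrative.

# Multiple linear regression with two predictors (illustrative data)
import numpy as np
from sklearn.linear_model import LinearRegression

# Each row is one observation: [X1, X2]
X = np.array([[1, 4], [2, 3], [3, 5], [4, 2], [5, 6], [6, 1]])
y = np.array([10.0, 11.5, 15.0, 13.0, 19.5, 14.0])

model = LinearRegression().fit(X, y)

a = model.intercept_   # intercept of the fitted plane
b1, b2 = model.coef_   # partial regression coefficients
print(f"Y_hat = {a:.3f} + {b1:.3f}*X1 + {b2:.3f}*X2")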
Polynomial Regression Model
• The polynomial regression model extends the simple linear model by adding extra predictors, obtained by raising each of the original predictors to a power (e.g. squaring or cubing them).
• Starting from the simple linear model ŷ = b0 + b1x, polynomial regression adds higher-order terms such as b2x² (and b3x³, and so on).
• In this notation:
x = input value
ŷ = predicted output
b0 = bias or intercept term
b1 = coefficient for the input x
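A minimal sketch (assumed, not from the original slides) of a degree-2 polynomial regression: PolynomialFeatures builds the extra x² predictor and an ordinary linear model is fitted on the expanded inputs; the data are illustrative.

# Degree-2 polynomial regression: y_hat = b0 + b1*x + b2*x^2 (illustrative data)
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

x = np.array([[1], [2], [3], [4], [5]])
y = np.array([2.1, 4.9, 10.2, 17.1, 26.3])   # roughly quadratic in x

poly = PolynomialFeatures(degree=2, include_bias=False)
x_poly = poly.fit_transform(x)               # expanded columns: [x, x^2]

model = LinearRegression().fit(x_poly, y)
print("b0 (intercept):", model.intercept_)
print("b1, b2 (coefficients):", model.coef_)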
Differences Between Linear and Logistic Regression
• Linear regression predicts a continuous numeric outcome and is typically fitted by least squares.
• Logistic regression predicts the probability of a categorical (usually binary) outcome by passing a linear combination of the predictors through the sigmoid function, and is fitted by maximum likelihood.
# Standardize features (assumes X_train and X_test come from an earlier train/test split)
from sklearn.preprocessing import StandardScaler

scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)   # fit the scaler on training data only
X_test = scaler.transform(X_test)         # reuse the training statistics on the test set
Step 5: Train the Model
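The original slide does not show the training code. A plausible sketch, assuming the model being trained is a logistic regression classifier on the standardized features from the previous step:

# Train a logistic regression classifier (assumed model for this example)
from sklearn.linear_model import LogisticRegression

model = LogisticRegression(max_iter=1000)   # max_iter raised for reliable convergence
model.fit(X_train, y_train)                 # X_train, y_train assumed from earlier steps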
Step 6: Evaluation Metrics
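The evaluation code is likewise not shown on the slide. A sketch, assuming a classification model and the held-out X_test and y_test from the earlier split; accuracy, the confusion matrix, and the classification report are common metrics here:

# Evaluate the trained classifier on the test set (assumed variables)
from sklearn.metrics import accuracy_score, classification_report, confusion_matrix

y_pred = model.predict(X_test)

print("Accuracy:", accuracy_score(y_test, y_pred))
print("Confusion matrix:\n", confusion_matrix(y_test, y_pred))
print(classification_report(y_test, y_pred))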