
Regression: Formulas and Examples with Solutions

1. Simple Linear Regression

Formula:

Y = b0 + b1 X

Y: Dependent variable

X: Independent variable

b0: Intercept

b1: Slope

Example Question:

Given: X = [1, 2, 3] and Y = [2, 4, 5]. Find the regression line equation.

Solution:

1. Calculate means: X_mean = 2, Y_mean = 11/3 ≈ 3.67

2. Compute slope:

b1 = Sum((X - X_mean)(Y - Y_mean)) / Sum((X - X_mean)^2) = 1.5

3. Intercept:

b0 = Y_mean - b1 * X_mean = 0.67

Final Equation: Y = 0.67 + 1.5X
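The three steps above can be sketched in plain Python (a minimal sketch using the example data from the question):

```python
# Simple linear regression by hand: means, slope, intercept.
X = [1, 2, 3]
Y = [2, 4, 5]

x_mean = sum(X) / len(X)  # 2.0
y_mean = sum(Y) / len(Y)  # 11/3, about 3.67

# b1 = Sum((x - x_mean)(y - y_mean)) / Sum((x - x_mean)^2)
num = sum((x - x_mean) * (y - y_mean) for x, y in zip(X, Y))
den = sum((x - x_mean) ** 2 for x in X)
b1 = num / den               # 1.5
b0 = y_mean - b1 * x_mean    # about 0.67

print(f"Y = {b0:.2f} + {b1:.2f}X")
```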

2. Multiple Linear Regression

Formula:

Y = b0 + b1 * X1 + b2 * X2


Example Question:

Predict Y using X1 = [1, 2] and X2 = [3, 4] where Y = [5, 6]. Find coefficients.

Solution:

The system is underdetermined: two data points cannot uniquely pin down three
coefficients (note also that X2 = X1 + 2, so the predictors are collinear).
Infinitely many exact fits exist; the minimum-norm least-squares solution is:

b0 = 1, b1 = -0.5, b2 = 1.5

Equation: Y = 1 - 0.5*X1 + 1.5*X2

Check: 1 - 0.5(1) + 1.5(3) = 5 and 1 - 0.5(2) + 1.5(4) = 6.

3. Cost Function for Linear Regression

Formula:

J(theta) = (1 / 2m) * Sum((h_theta(X) - Y)^2)

Example Question:

Predicted: [2.5, 4.5], Actual: [3, 5]. Compute cost function.

Solution:

With m = 2 examples, 1/(2m) = 1/4:

J(theta) = (1/4) * [(2.5 - 3)^2 + (4.5 - 5)^2] = (1/4)(0.25 + 0.25) = 0.125
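The same computation as a short sketch:

```python
# Cost function J(theta) = (1 / 2m) * Sum((prediction - actual)^2).
pred = [2.5, 4.5]
actual = [3.0, 5.0]
m = len(pred)

J = sum((p - a) ** 2 for p, a in zip(pred, actual)) / (2 * m)
print(J)  # 0.125
```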

4. Ordinary Least Squares (OLS)

Slope (b1):

b1 = Sum((xi - X_mean)(yi - Y_mean)) / Sum((xi - X_mean)^2)

Intercept (b0):

b0 = Y_mean - b1 * X_mean

Example Question:

Given: X = [1, 2, 3], Y = [2, 4, 5]. Find b1 and b0.


Solution:

b1 = 1.5, b0 = 0.67

Final equation: Y = 0.67 + 1.5X
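As a cross-check of the hand computation, `np.polyfit` with degree 1 solves the same OLS problem (a sketch; assumes NumPy is available):

```python
# OLS via polyfit: degree-1 fit returns [slope, intercept].
import numpy as np

X = np.array([1.0, 2.0, 3.0])
Y = np.array([2.0, 4.0, 5.0])

b1, b0 = np.polyfit(X, Y, 1)
print(b1, b0)  # 1.5 and about 0.67, matching the hand-derived values
```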

5. Gradient Descent

Formula:

theta = theta - alpha * dJ(theta)/dtheta

Example Question:

Initial theta = 0.5, learning rate (alpha) = 0.1, partial derivative = 0.2.

Solution:

theta = 0.5 - 0.1 * 0.2 = 0.48
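A single update step, sketched directly from the formula:

```python
# One gradient descent step: theta <- theta - alpha * dJ/dtheta.
theta = 0.5
alpha = 0.1   # learning rate
grad = 0.2    # partial derivative at the current theta

theta = theta - alpha * grad
print(theta)  # 0.48
```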

6. Mean Absolute Error (MAE)

Formula:

MAE = (1/n) * Sum(|Y - Y_hat|)

Example Question:

Actual: [3, 5, 7], Predicted: [2.5, 5.5, 6.5]. Calculate MAE.

Solution:

MAE = (0.5 + 0.5 + 0.5)/3 = 0.5
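The MAE computation as a sketch:

```python
# MAE = (1/n) * Sum(|actual - predicted|).
actual = [3.0, 5.0, 7.0]
pred = [2.5, 5.5, 6.5]

mae = sum(abs(a - p) for a, p in zip(actual, pred)) / len(actual)
print(mae)  # 0.5
```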

7. Mean Squared Error (MSE)

Formula:


MSE = (1/n) * Sum((Y - Y_hat)^2)

Example Question:

Actual: [3, 5, 7], Predicted: [2.5, 5.5, 6.5]. Calculate MSE.

Solution:

MSE = (0.25 + 0.25 + 0.25)/3 = 0.25
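The same calculation as a sketch:

```python
# MSE = (1/n) * Sum((actual - predicted)^2).
actual = [3.0, 5.0, 7.0]
pred = [2.5, 5.5, 6.5]

mse = sum((a - p) ** 2 for a, p in zip(actual, pred)) / len(actual)
print(mse)  # 0.25
```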

8. Root Mean Squared Error (RMSE)

Formula:

RMSE = sqrt(MSE)

Example Question:

Find RMSE using above MSE = 0.25.

Solution:

RMSE = sqrt(0.25) = 0.5
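RMSE is just the square root of MSE, restoring the original units of Y:

```python
# RMSE = sqrt(MSE), using the MSE from the previous section.
import math

mse = 0.25
rmse = math.sqrt(mse)
print(rmse)  # 0.5
```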

9. R-squared (R^2)

Formula:

R^2 = 1 - Sum((Y - Y_hat)^2) / Sum((Y - Y_mean)^2)

Example Question:

Actual: [3, 5, 7], Predicted: [3, 5, 7]. Calculate R^2.

Solution:


R^2 = 1.0 (perfect fit)
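The R^2 formula as a sketch; since predictions equal actuals here, the residual sum of squares is zero and R^2 = 1:

```python
# R^2 = 1 - SS_res / SS_tot.
actual = [3.0, 5.0, 7.0]
pred = [3.0, 5.0, 7.0]
y_mean = sum(actual) / len(actual)

ss_res = sum((a - p) ** 2 for a, p in zip(actual, pred))
ss_tot = sum((a - y_mean) ** 2 for a in actual)
r2 = 1 - ss_res / ss_tot
print(r2)  # 1.0 (perfect fit)
```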

10. Ridge Regression (L2 Regularization)

Formula:

J(theta) = (1 / 2m) * Sum((Y - Y_hat)^2) + lambda * Sum(theta^2)

Example Question:

Predicted: [2.5, 4.5], Actual: [3, 5], lambda = 0.1. Compute cost.

Solution:

Cost: J(theta) = 0.125 + 0.1 * Sum(theta^2)

(The data term is the unregularized cost from Section 3; the penalty term depends on the coefficient values, which are not given here.)
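A sketch of the ridge cost with a hypothetical coefficient value (the slope 1.5 from Section 1 is an assumption for illustration; in practice the intercept is usually excluded from the penalty):

```python
# Ridge cost = data term + lambda * Sum(theta^2).
pred = [2.5, 4.5]
actual = [3.0, 5.0]
m = len(pred)
lam = 0.1
theta = [1.5]  # hypothetical slope; intercept not penalized

data_term = sum((p - a) ** 2 for p, a in zip(pred, actual)) / (2 * m)  # 0.125
J = data_term + lam * sum(t ** 2 for t in theta)
print(J)
```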

11. Lasso Regression (L1 Regularization)

Formula:

J(theta) = (1 / 2m) * Sum((Y - Y_hat)^2) + lambda * Sum(|theta|)

Example Question:

Predicted: [2.5, 4.5], Actual: [3, 5], lambda = 0.1. Compute cost.

Solution:

Cost: J(theta) = 0.125 + 0.1 * Sum(|theta|)

(Identical to the ridge cost except the penalty uses absolute values of the coefficients rather than their squares.)
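The lasso cost, sketched with the same hypothetical slope as an assumption; only the penalty term changes from squared to absolute values:

```python
# Lasso cost = data term + lambda * Sum(|theta|).
pred = [2.5, 4.5]
actual = [3.0, 5.0]
m = len(pred)
lam = 0.1
theta = [1.5]  # hypothetical slope; intercept not penalized

data_term = sum((p - a) ** 2 for p, a in zip(pred, actual)) / (2 * m)  # 0.125
J = data_term + lam * sum(abs(t) for t in theta)
print(J)
```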
