Week 2: Lasso and Ridge Regression


Linear, Lasso, and Ridge Regression

Slides by Dr Sameena Naaz


Table of Contents

01 Linear, Lasso, and Ridge Regression


02 Linear Regression
03 Lasso Regression
04 Ridge Regression
05 Comparison of Methods

Linear, Lasso, and Ridge Regression

Understanding Key Differences and Applications
• Linear Regression: A simple model that assumes a linear relationship between variables. Minimizes the sum of squared errors.
• Lasso Regression: A model with L1 regularization, reducing feature complexity by setting some coefficients to zero.
• Ridge Regression: A model with L2 regularization that shrinks coefficients without setting them to zero, addressing multicollinearity.

Linear Regression

Key Features and Applications

• Equation: y = b + w1x1 + ... + wnxn, where b is the intercept (bias) term.
• Minimizes the sum of squared errors (SSE).
• Prone to overfitting if too many variables are used (see the sketch below).
• Applications: Predictive modeling in finance, biology, etc.
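
Below is a minimal sketch of fitting an ordinary least-squares model with scikit-learn; the library choice and the synthetic data are illustrative assumptions, not part of the original slides.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic, illustrative data: 100 samples, 3 features (not from the slides)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = 2.0 + X @ np.array([1.5, -0.7, 0.3]) + rng.normal(scale=0.1, size=100)

# Ordinary least squares: finds b and w that minimize the sum of squared errors
model = LinearRegression().fit(X, y)
print("intercept b:", model.intercept_)
print("weights w:", model.coef_)
```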

Lasso Regression

Key Features and Applications



• Equation (cost function): ∑(yᵢ – ŷᵢ)² + λ ∑|wⱼ|  (see the sketch after this list).
• Applies L1 regularization, which forces some coefficients to
zero, leading to feature selection.
• Useful when there are many irrelevant features.
• Applications: Sparse models, reducing model complexity.
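
A minimal sketch of Lasso's feature-selection effect, again using scikit-learn (an assumption; the slides do not name a library) on synthetic data where most features are irrelevant:

```python
import numpy as np
from sklearn.linear_model import Lasso

# Synthetic data: only the first two of ten features actually matter (illustrative)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
true_w = np.array([3.0, -2.0, 0, 0, 0, 0, 0, 0, 0, 0])
y = X @ true_w + rng.normal(scale=0.1, size=100)

# L1 penalty: cost = SSE + lambda * sum(|w|); scikit-learn calls lambda "alpha"
lasso = Lasso(alpha=0.1).fit(X, y)
print(lasso.coef_)  # weights on the irrelevant features are driven to exactly zero
```

Increasing alpha zeroes out more coefficients, giving a sparser model.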

Ridge Regression

Key Features and Applications



• Equation (cost function): ∑(yᵢ – ŷᵢ)² + λ ∑wⱼ²  (see the sketch after this list).
• Uses L2 regularization, shrinking coefficients but not to zero.
• Addresses multicollinearity by preventing large coefficients.
• Applications: Models where all features are expected to
contribute.
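
A minimal sketch of Ridge on deliberately collinear features, using scikit-learn (an illustrative choice, not specified in the slides):

```python
import numpy as np
from sklearn.linear_model import Ridge

# Synthetic data: x2 is nearly a copy of x1, so the features are highly collinear
rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = x1 + rng.normal(scale=0.01, size=100)
X = np.column_stack([x1, x2])
y = 3.0 * x1 + rng.normal(scale=0.1, size=100)

# L2 penalty: cost = SSE + lambda * sum(w^2); keeps the correlated weights
# small and stable instead of letting them blow up in opposite directions
ridge = Ridge(alpha=1.0).fit(X, y)
print(ridge.coef_)  # both weights are shrunk, but neither is exactly zero
```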

Comparison of Methods
Key Differences Between Linear, Lasso, and Ridge
• Linear: No regularization, may overfit with many features.
• Lasso: L1 regularization, performs feature selection by forcing
coefficients to zero.
• Ridge: L2 regularization, shrinks coefficients without zeroing
them, managing multicollinearity.
• Choosing the right model depends on the dataset and goals (the sketch below fits all three on the same data).
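
The sketch below, under the same assumptions as the earlier examples (scikit-learn, synthetic data, arbitrary penalty strengths), fits all three models on one dataset so their coefficients can be compared side by side:

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Lasso, Ridge

# One synthetic dataset shared by all three models (values are illustrative)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = X @ np.array([2.0, 0.0, -1.0, 0.0, 0.5]) + rng.normal(scale=0.1, size=100)

for name, model in [("Linear", LinearRegression()),
                    ("Lasso", Lasso(alpha=0.1)),
                    ("Ridge", Ridge(alpha=1.0))]:
    model.fit(X, y)
    print(f"{name:7s} {np.round(model.coef_, 3)}")
```

Typically, Lasso zeroes the coefficients on the two irrelevant features, Ridge shrinks all of them slightly, and plain linear regression recovers the generating weights almost exactly on this easy problem.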
