
Name: PARTTHIV MURALI R P Roll no.: AM.EN.U4EAC22050

DEPARTMENT OF ELECTRONICS AND COMMUNICATION


ENGINEERING

19EAC381 MACHINE LEARNING WITH PYTHON

LABSHEET 4

Linear Regression

Course Outcome: CO3 Date: - -

Aim: (Objective)

• To understand and implement data cleaning and preprocessing techniques.
• To apply linear regression for predictive modeling.
• To evaluate model performance using statistical metrics.

Introduction:
Linear regression is widely used for predictive analysis: it models the relationship between
a dependent variable and one or more independent variables. This session uses Python
libraries such as NumPy, Pandas, Seaborn, and Matplotlib for data preprocessing, model
training, evaluation, and visualization. The dataset is split into training and testing sets to
build a regression model, and the model's performance is then evaluated with metrics such as
R-squared, Mean Absolute Error (MAE), and Root Mean Squared Error (RMSE). Finally, a
regression plot is generated to visualize the linear relationship between the variables and
assess the model's fit.
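
A minimal sketch of this workflow using Pandas and scikit-learn is given below. It assumes a
generic CSV file named data.csv with a numeric feature column x and a target column y; the
file and column names are placeholders, not those used in the lab.

import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression

# Load the dataset (file name and column names are placeholders)
df = pd.read_csv("data.csv")

# Basic cleaning: drop duplicate rows and rows with missing values
df = df.drop_duplicates().dropna()

# Separate features and target, then split into training and testing sets
X = df[["x"]]          # independent variable(s)
y = df["y"]            # dependent variable
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Fit an ordinary least-squares linear regression model on the training set
model = LinearRegression()
model.fit(X_train, y_train)

# Predict on the unseen test set
y_pred = model.predict(X_test)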


Result and Inference:

Inference:

The Linear Regression model successfully captures the underlying relationship between the
independent and dependent variables, as indicated by the performance metrics. A higher
R-squared value suggests that the model explains a significant portion of the variance in the
target variable, while lower values for Mean Absolute Error (MAE) and Root Mean Squared
Error (RMSE) demonstrate the model's accuracy in predictions. The regression plot further
confirms this by displaying a linear trend that closely aligns with the data points, indicating a
well-fitted model. These results suggest that linear regression is an effective method for
predictive modeling in cases where a linear relationship exists within the dataset.
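
A sketch of how these metrics and the regression plot can be produced with scikit-learn and
Seaborn, continuing from the placeholder names model, X_test, y_test, and y_pred introduced
in the sketch above:

import numpy as np
import seaborn as sns
import matplotlib.pyplot as plt
from sklearn.metrics import r2_score, mean_absolute_error, mean_squared_error

# R-squared: proportion of variance in the target explained by the model
r2 = r2_score(y_test, y_pred)

# MAE: average absolute prediction error; RMSE: square root of the mean squared error
mae = mean_absolute_error(y_test, y_pred)
rmse = np.sqrt(mean_squared_error(y_test, y_pred))
print(f"R-squared: {r2:.3f}  MAE: {mae:.3f}  RMSE: {rmse:.3f}")

# Regression plot: Seaborn scatters the test points and draws a least-squares trend line
sns.regplot(x=X_test["x"], y=y_test, line_kws={"color": "red"})
plt.xlabel("x")
plt.ylabel("y")
plt.title("Linear regression fit")
plt.show()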


Result and Inference:

Inference:

In this linear regression analysis of the CO₂ emissions dataset, we observe that the model
effectively captures the relationship between vehicle features and CO₂ emissions. The
performance metrics, including Mean Absolute Error (MAE), Mean Squared Error (MSE),
and Root Mean Squared Error (RMSE), indicate the model's accuracy in predicting emission
levels from vehicle characteristics. The correlation heatmap supports these findings, revealing
strong correlations between certain features and CO₂ emissions. The regression plot shows a
clear linear trend, validating the model's predictions. Overall, this analysis demonstrates
linear regression's utility in estimating CO₂ emissions and highlights the influence of specific
vehicle attributes on environmental impact.
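
A sketch of this analysis, assuming a fuel-consumption style CSV (FuelConsumption.csv)
with columns such as ENGINESIZE, CYLINDERS, FUELCONSUMPTION_COMB, and
CO2EMISSIONS; these file and column names are assumptions and may differ from the
dataset actually used in the lab.

import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error, mean_squared_error

# Load the CO2 emissions dataset (file and column names are assumptions)
df = pd.read_csv("FuelConsumption.csv")
cols = ["ENGINESIZE", "CYLINDERS", "FUELCONSUMPTION_COMB", "CO2EMISSIONS"]
df = df[cols].dropna()

# Correlation heatmap between vehicle features and CO2 emissions
sns.heatmap(df.corr(), annot=True, cmap="coolwarm")
plt.title("Correlation between vehicle features and CO2 emissions")
plt.show()

# Regress CO2 emissions on the vehicle features
X = df.drop(columns="CO2EMISSIONS")
y = df["CO2EMISSIONS"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = LinearRegression().fit(X_train, y_train)
y_pred = model.predict(X_test)

print("MAE :", mean_absolute_error(y_test, y_pred))
print("MSE :", mean_squared_error(y_test, y_pred))
print("RMSE:", mean_squared_error(y_test, y_pred) ** 0.5)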


Result and Inference:


Result and Inference:


Inference:

The implementation of linear regression with gradient descent offers insight into optimizing
model parameters by iteratively minimizing the cost function. By adjusting the model's
weights to reduce the error between predicted and actual values, gradient descent converges
toward an optimal solution, even for large datasets. The iterative process shows how gradient
descent reaches accurate predictions without requiring a closed-form solution such as the
normal equation, which makes it highly scalable. This reinforces the effectiveness of gradient
descent for linear regression in data-driven tasks, where the learning rate and convergence
criteria play crucial roles in the model's performance and accuracy.
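
A minimal NumPy sketch of this idea for single-feature linear regression, minimizing the
mean squared error cost J(w, b) = (1/n) Σ (w·xᵢ + b − yᵢ)² with batch gradient descent; the
learning rate, iteration count, and synthetic data below are illustrative choices, not the values
used in the lab.

import numpy as np

def gradient_descent(x, y, lr=0.01, n_iters=5000):
    """Fit y ≈ w * x + b by batch gradient descent on the MSE cost."""
    w, b = 0.0, 0.0
    n = len(x)
    for _ in range(n_iters):
        y_pred = w * x + b
        error = y_pred - y
        # Gradients of J = (1/n) * sum((w*x + b - y)^2) with respect to w and b
        dw = (2.0 / n) * np.dot(error, x)
        db = (2.0 / n) * np.sum(error)
        # Step against the gradient to reduce the cost
        w -= lr * dw
        b -= lr * db
    return w, b

# Illustrative synthetic data: y = 3x + 4 plus Gaussian noise
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 200)
y = 3 * x + 4 + rng.normal(0, 1, 200)

w, b = gradient_descent(x, y)
print(f"Learned slope ≈ {w:.2f}, intercept ≈ {b:.2f}")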
