1_Linear Regression with MSE Function Plot

The document contains a Python script that demonstrates simple linear regression using Matplotlib and scikit-learn. It generates random data, fits a linear regression model, calculates the Mean Squared Error, and visualizes the data points along with the regression line and the errors. It also suggests exploring a similar graph on Desmos and finding its minimum, a topic picked up again under Differentiation.



import matplotlib.pyplot as plt
import numpy as np
from sklearn.linear_model import LinearRegression

# Creating a simple linear regression example
np.random.seed(0)
x = 2 * np.random.rand(100, 1)
y = 4 + 3 * x + np.random.randn(100, 1)

# Fitting the linear regression model
lin_reg = LinearRegression()
lin_reg.fit(x, y)
y_pred = lin_reg.predict(x)
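
# Quick check (illustrative): the data were generated with intercept 4 and slope 3,
# so the fitted parameters should come out close to those values
print("Fitted intercept:", lin_reg.intercept_[0])
print("Fitted slope:", lin_reg.coef_[0][0])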

# Calculating the Mean Squared Error -> this is the quantity that is minimized to obtain the line of best fit
# What might the relationship y = f(x) look like if x denotes the error and y the outcome?
mse = np.mean((y - y_pred)**2)
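
# Optional cross-check (sketch): scikit-learn's mean_squared_error helper should
# agree with the manual calculation above
from sklearn.metrics import mean_squared_error
assert np.isclose(mse, mean_squared_error(y, y_pred))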

# Plotting the data points
plt.scatter(x, y, color='blue', label='Data points')

# Plotting the linear regression line
plt.plot(x, y_pred, color='red', label='Linear regression line')

# Plotting the errors (vertical distances from the points to the regression line)
for i in range(len(x)):
    plt.plot([x[i], x[i]], [y[i], y_pred[i]], color='green', linestyle='--')

# Adding labels and title
plt.xlabel('x')
plt.ylabel('y')
plt.title(f'Linear Regression with MSE: {mse:.2f}')
plt.legend()

# Show the plot
plt.show()

# Now let's plot a similar graph on https://www.desmos.com/calculator
# and see whether we can find the minimum of this function -> we will explore this more in Differentiation
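
The same idea can also be sketched directly in Python rather than on Desmos: hold the intercept at its fitted value, sweep a range of candidate slopes, and plot the resulting MSE. The names below (slopes, mse_curve, intercept) and the choice to fix the intercept are illustrative assumptions, not part of the original notebook; the resulting curve is a parabola whose minimum sits near the fitted slope, which is exactly the point that differentiation will later locate analytically.

# Sweep candidate slopes while keeping the intercept fixed at the fitted value
slopes = np.linspace(0, 6, 200)
intercept = lin_reg.intercept_[0]
mse_curve = [np.mean((y - (intercept + m * x))**2) for m in slopes]

# The MSE traces out a parabola; its minimum lies near the fitted slope
plt.figure()
plt.plot(slopes, mse_curve, color='purple', label='MSE as a function of the slope')
plt.axvline(lin_reg.coef_[0][0], color='red', linestyle='--', label='Fitted slope')
plt.xlabel('slope')
plt.ylabel('MSE')
plt.title('MSE as a function of the slope')
plt.legend()
plt.show()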

