The linear regression algorithm models a linear relationship between a dependent variable (y) and one or more independent
variables (x), hence the name linear regression. Since linear regression assumes a linear relationship, it finds how
the value of the dependent variable changes according to the value of the independent variable.
The linear regression model provides a sloped straight line representing the relationship between the variables.
Mathematically, this line can be written as:
y = a0 + a1x + ε
Here,
y = dependent (target) variable
x = independent (predictor) variable
a0 = intercept of the line
a1 = linear regression coefficient (slope of the line)
ε = random error
The values of the x and y variables are the training dataset used for the linear regression model representation.
Different values for the weights or coefficients of the line (a0, a1) give different regression lines, so we need to calculate
the best values for a0 and a1 to find the best-fit line; to calculate this, we use a cost function.
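To make the line equation concrete, the following sketch evaluates y = a0 + a1x for two illustrative (not fitted) coefficient pairs, showing that different coefficients produce different predictions for the same input:

```python
# A minimal sketch of the regression line y = a0 + a1*x.
# The coefficient values below are illustrative, not fitted.

def predict(x, a0, a1):
    """Return the predicted y for input x, given intercept a0 and slope a1."""
    return a0 + a1 * x

# Two candidate lines give two different predictions for the same x:
print(predict(2.0, a0=1.0, a1=3.0))  # 7.0
print(predict(2.0, a0=0.5, a1=2.0))  # 4.5
```

The cost function described next is what lets us decide which of such candidate lines fits the data best.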
Cost function:
◦ Different values for the weights or coefficients of the line (a0, a1) give different regression lines, and the cost function
is used to estimate the values of the coefficients for the best-fit line.
◦ The cost function optimizes the regression coefficients or weights. It measures how well a linear regression model is performing.
◦ We can use the cost function to find the accuracy of the mapping function, which maps the input variable to the
output variable. This mapping function is also known as the Hypothesis function.
For Linear Regression, we use the Mean Squared Error (MSE) cost function, which is the average of the squared errors
between the predicted values and the actual values. It can be written as:
MSE = (1/N) * Σ (yi - (a1xi + a0))^2
Where,
N = total number of observations
yi = actual value
(a1xi + a0) = predicted value
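The MSE formula above can be computed directly. The sketch below uses a small made-up dataset generated exactly by the line y = 1 + 2x, so the true coefficients give a cost of zero, while a wrong intercept raises it:

```python
# MSE = (1/N) * sum((y_i - (a1*x_i + a0))**2)

def mse(xs, ys, a0, a1):
    """Average squared error between actual ys and the line's predictions."""
    n = len(xs)
    return sum((y - (a1 * x + a0)) ** 2 for x, y in zip(xs, ys)) / n

xs = [1, 2, 3, 4]
ys = [3, 5, 7, 9]                    # lies exactly on y = 1 + 2x
print(mse(xs, ys, a0=1.0, a1=2.0))   # 0.0 -> perfect fit
print(mse(xs, ys, a0=0.0, a1=2.0))   # 1.0 -> every residual is 1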
Residuals: The distance between an actual value and its predicted value is called a residual. If the observed points are far from the
regression line, the residuals will be high, and so the cost function will be high. If the scatter points are close to the regression line,
the residuals will be small, and hence so will the cost function.
Gradient Descent:
◦ Gradient descent is used to minimize the MSE by calculating the gradient of the cost function.
◦ A regression model uses gradient descent to update the coefficients of the line by reducing the cost function.
◦ This is done by randomly selecting initial values for the coefficients and then iteratively updating them to reach the minimum of the cost
function.
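The iterative update described above can be sketched as batch gradient descent on the MSE. The gradients used below follow from differentiating the MSE formula with respect to a0 and a1; the learning rate and step count are illustrative choices, not prescribed values:

```python
# Batch gradient descent for simple linear regression.
# Gradients of MSE with respect to the coefficients:
#   dMSE/da0 = (-2/N) * sum(y_i - (a1*x_i + a0))
#   dMSE/da1 = (-2/N) * sum(x_i * (y_i - (a1*x_i + a0)))

def gradient_descent(xs, ys, lr=0.05, steps=2000):
    a0, a1 = 0.0, 0.0                # start from arbitrary (here zero) values
    n = len(xs)
    for _ in range(steps):
        errors = [y - (a1 * x + a0) for x, y in zip(xs, ys)]
        grad_a0 = -2.0 / n * sum(errors)
        grad_a1 = -2.0 / n * sum(x * e for x, e in zip(xs, errors))
        a0 -= lr * grad_a0           # step opposite the gradient
        a1 -= lr * grad_a1
    return a0, a1

xs = [1, 2, 3, 4]
ys = [3, 5, 7, 9]                    # generated by y = 1 + 2x
a0, a1 = gradient_descent(xs, ys)
print(round(a0, 3), round(a1, 3))    # converges toward (1.0, 2.0)
```

Because the MSE is convex in (a0, a1), any sufficiently small learning rate drives the coefficients to the same minimum regardless of the starting point.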
Model Performance:
The goodness of fit determines how well the regression line fits the set of observations. The process of finding the best model out
of various models is called optimization. It can be achieved by the following method:
1. R-squared method:
◦ It measures the strength of the relationship between the dependent and independent variables on a scale of 0-100%.
◦ A high value of R-square indicates a small difference between the predicted values and the actual values, and hence
represents a good model.
◦ It is also called the coefficient of determination, or the coefficient of multiple determination for multiple regression.
Assumptions of Linear Regression:
◦ Homoscedasticity Assumption:
Homoscedasticity is a situation in which the error term is the same across all values of the independent variables. With
homoscedasticity, there should be no clear pattern in the distribution of the residuals in the scatter plot.
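One crude way to check this assumption, short of a formal test such as Breusch-Pagan, is to compare the residual variance in the lower and upper halves of the x range; the residual values below are hypothetical, ordered by x:

```python
# A crude homoscedasticity check (not a formal test): compare residual
# variance in the lower and upper halves of the x range. A ratio near 1
# is consistent with constant error variance; a large ratio suggests
# heteroscedasticity.

def variance(vals):
    m = sum(vals) / len(vals)
    return sum((v - m) ** 2 for v in vals) / len(vals)

residuals = [0.5, -0.4, 0.6, -0.5, 0.4, -0.6]  # hypothetical, ordered by x
half = len(residuals) // 2
low, high = residuals[:half], residuals[half:]
ratio = variance(high) / variance(low)
print(round(ratio, 2))  # near 1.0 -> roughly constant error variance
```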
◦ No autocorrelation:
The linear regression model assumes no autocorrelation in the error terms. If there is any correlation in the error terms,
it will drastically reduce the accuracy of the model. Autocorrelation usually occurs when there is a dependency between residual
errors.
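A common diagnostic for autocorrelation in residuals is the Durbin-Watson statistic, sketched below on two hypothetical residual sequences: values near 2 suggest no autocorrelation, values well below 2 positive autocorrelation, and values well above 2 negative autocorrelation:

```python
# Durbin-Watson statistic for residuals e_1..e_T:
#   DW = sum((e_t - e_{t-1})**2) / sum(e_t**2)

def durbin_watson(residuals):
    num = sum((residuals[t] - residuals[t - 1]) ** 2
              for t in range(1, len(residuals)))
    den = sum(e ** 2 for e in residuals)
    return num / den

alternating = [1, -1, 1, -1, 1, -1]   # sign flips -> negative autocorrelation
trending    = [1, 1, 1, -1, -1, -1]   # runs of signs -> positive autocorrelation
print(durbin_watson(alternating))     # well above 2
print(durbin_watson(trending))        # well below 2
```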