Linear Regression (Fun with Mathematics)
So in this Machine Learning series, I plan to teach machine learning in a way that is both fun and mathematical.
This is the starting point of any Machine Learning Engineer's career.
Linear Regression...........
y = a*x + b
Everyone knows Linear Regression, a regression technique to fit the best line to a given dataset.
So here in this article we will derive the Linear Regression formula with the help of our beloved calculus.
So, what is the point of doing this linear regression?
Obviously, to predict. We fit the best line to our dataset.
The equation looks like this:
y = a * x + b
Here "y" is the dependent variable or anything we are going to predict , "x" is independent variable , or a predictor , and "a" and "b" are the slopes and y-intercept of the lines , in general , the values of a and b are responsible on making how good the line fits on data.
So our main concern is to find the optimal values of a and b which fits the data.
There are many technique that can be used to find values of a and b ,
Here in this article we will be going through the simplest method using SSE(Sum of Squared Error) method, where our main concern is to minimize the error.
Terminology used:
y(i) = Actual Value
y^(i) = Predicted Value
and our SSE ( E ) is the sum of squared differences between the real (actual) and predicted values:

E = sum over i of ( y(i) - y^(i) )^2 = sum over i of ( y(i) - (a*x(i) + b) )^2    (equation a)
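Just to make equation a concrete, here is a tiny Python sketch that computes the SSE for a candidate line (the sample data and the names xs, ys and sse are my own, purely for illustration):

# Minimal sketch: compute the SSE (equation a) for a candidate line y = a*x + b.
# The data and the names xs, ys, sse are illustrative, not from this article.
def sse(xs, ys, a, b):
    # sum of squared differences between actual y(i) and predicted a*x(i) + b
    return sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))

xs = [1, 2, 3, 4]
ys = [3, 5, 7, 9]
print(sse(xs, ys, 2.0, 1.0))  # 0.0  -> this line fits the points exactly
print(sse(xs, ys, 1.0, 1.0))  # 30.0 -> a worse line gives a larger error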
a and b are to be determined.
Since there are two unknown variables, "a" and "b", but only one equation, the SSE (equation a), we take the partial derivatives of E with respect to a and to b.
We get the following two equations (equation b and equation c):

∂E/∂a = -2 * sum over i of x(i) * ( y(i) - (a*x(i) + b) )    (equation b)
∂E/∂b = -2 * sum over i of ( y(i) - (a*x(i) + b) )           (equation c)
Solving for a and b from these two equations by setting
equation b = 0
and
equation c = 0
and simplifying, we get the following two equations (the normal equations):

a * sum of x(i)^2 + b * sum of x(i) = sum of x(i)*y(i)
a * sum of x(i) + b * n = sum of y(i)

where n is the number of data points.
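If you want to see equations b and c in action without doing the algebra by hand, here is a small symbolic sketch using sympy; the tiny dataset and variable names are my own example, not part of the derivation:

# Sketch: build E (equation a) for a tiny dataset, take the partial
# derivatives (equations b and c), set them to zero and solve for a and b.
import sympy as sp

a, b = sp.symbols('a b')
xs = [1, 2, 3]   # example data, chosen only for illustration
ys = [2, 4, 5]

E = sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))  # equation a

dE_da = sp.diff(E, a)  # equation b
dE_db = sp.diff(E, b)  # equation c

print(sp.solve([dE_da, dE_db], [a, b]))  # {a: 3/2, b: 2/3}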

Now we can easily solve these two equations for a and b (I leave the algebra to you), and you get the solved equations as follows:

a = ( n * sum of x(i)*y(i) - sum of x(i) * sum of y(i) ) / ( n * sum of x(i)^2 - ( sum of x(i) )^2 )
b = ( sum of y(i) - a * sum of x(i) ) / n
Now,
we have our parameters a and b, and our regression model looks like y = a*x + b. Plug in the x variable and get y. Easy peasy!
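As a small preview of the next article, here is one possible plain-Python sketch of these closed-form formulas (fit_line and the sample data below are names and values I made up for illustration):

# Sketch of the closed-form solution derived above.
# fit_line and the sample data are illustrative, not from this article.
def fit_line(xs, ys):
    n = len(xs)
    sum_x = sum(xs)
    sum_y = sum(ys)
    sum_xy = sum(x * y for x, y in zip(xs, ys))
    sum_x2 = sum(x * x for x in xs)
    a = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)
    b = (sum_y - a * sum_x) / n
    return a, b

xs = [1, 2, 3, 4, 5]
ys = [2, 4, 5, 4, 5]
a, b = fit_line(xs, ys)
print(a, b)        # 0.6 2.2
print(a * 6 + b)   # predict y for a new x = 6  ->  5.8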
In the next article we will implement this formula in Python and also visualize Linear Regression.