Posts

Showing posts from January, 2019

Gradient Descent - The Deep Learning Entry Point

Let's bring all our attention here: gradient descent, one of the most popular and widely used optimization techniques in machine learning and deep learning. Gradient descent's job is to optimize a given equation. What might that equation be? Any equation you can think of: a linear equation ( y = mx + b ), a multivariate linear equation, a polynomial equation, and so on. So how does it optimize? Before answering that, let's recall linear regression: how did we optimize our linear regression model? How did we find the values of m and b that best fit the data? You remember, we used the least-squares method ( minimizing the sum of squared errors ), and for that we derived a formula. In this article I will explain gradient descent on top of linear regression, as that makes it easy to teach and understand. Gradient descent's job is also to give us the best values of m and b, as our previous linear regression model did, but using...
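The idea in the excerpt can be sketched in a few lines of plain Python. This is a minimal, hypothetical illustration, not the post's own code: the function name, learning rate, and toy data are my own choices. Each step moves m and b a small amount against the gradient of the sum-of-squares cost.

```python
def gradient_descent(xs, ys, lr=0.02, epochs=5000):
    """Fit y = m*x + b by gradient descent on the mean-squared-error cost."""
    m, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # Partial derivatives of (1/n) * sum((y - (m*x + b))^2) w.r.t. m and b
        dm = (-2.0 / n) * sum(x * (y - (m * x + b)) for x, y in zip(xs, ys))
        db = (-2.0 / n) * sum(y - (m * x + b) for x, y in zip(xs, ys))
        m -= lr * dm  # step against the gradient
        b -= lr * db
    return m, b

# Toy data lying on the line y = 2x + 1, so the fit should recover m ≈ 2, b ≈ 1
m, b = gradient_descent([0, 1, 2, 3, 4], [1, 3, 5, 7, 9])
```

The learning rate and epoch count here are tuned only for this tiny data set; in practice both are hyperparameters you adjust per problem.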

Multiple Linear Regression From Scratch + Implementation in Python

Before moving further, if you are not familiar with single-variate linear regression, please read my previous two posts and get familiar with it. Today we are going to learn multivariate linear regression. Let's take a real-world example: you want to predict whether a person has diabetes. What factors would you consider? Perhaps age, gender, heredity, lifestyle, drinking habits, and so on; more than one variable is responsible for a correct prediction. This is what we call multiple linear regression, where more than one variable affects the model. Let's dive directly into the mathematics behind it. Below in Pic 1 there is a simple uni-variate linear regression, which we have already solved, and now we have the equation for multiple linear regression, whose parameters range from beta(1) to beta(n); we have to find the values of these parameters by solving the multiple linear regression equation. In Pic 1, we have now represented our multiple linear regression equation in t...
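The "find the betas" step the excerpt describes can be sketched with NumPy's least-squares solver. This is a hypothetical illustration under my own toy data; the predictor names are stand-ins for features like age or lifestyle, and the solver computes the same beta vector the normal equation (X^T X)^{-1} X^T y would give, but more stably.

```python
import numpy as np

# Design matrix: an intercept column of ones plus two hypothetical predictors.
# The targets are generated from y = 1 + 2*x1 + 3*x2, so the solver should
# recover beta ≈ [1, 2, 3].
X = np.array([[1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 1.0],
              [1.0, 2.0, 1.0]])
y = X @ np.array([1.0, 2.0, 3.0])

# Least-squares solve for the beta(0)..beta(n) parameters
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
```

Note the intercept is handled by prepending a column of ones, which folds beta(0) into the same vector as the other coefficients.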

Linear Regression From Scratch

So up to now you can already solve a linear regression mathematically, as we discussed in earlier lessons. Today we are going to build a linear regression model in Python from scratch, which means without the use of any library. For that we first need the mathematical equations for our linear regression model. I strongly, very strongly suggest you work through each of these derivations on your own; take your time. Here I have solved the linear regression model and simplified it so that it is easy to code in Python. Below in Picture 1, we derive an equation ( equation 1 ) by taking the partial derivative of SSE ( our cost function ) with respect to "a". Below in Picture 2, we derive another equation ( equation 2 ) by taking the partial derivative of the cost function with respect to "b", using a little trick to simplify it into a simpler form ( Pic 2 ). In Pic 3 and Pic 4, from equation 1 and equation...
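The from-scratch fit the excerpt builds toward can be sketched with the standard closed-form least-squares solution. This is my own minimal version, not the post's code: setting the partial derivatives of SSE with respect to "a" and "b" to zero and solving gives the two formulas in the docstring.

```python
def fit_line(xs, ys):
    """Closed-form least-squares fit of y = a*x + b.

    Setting the partial derivatives of SSE = sum((y - (a*x + b))^2)
    with respect to a and b to zero yields:
        a = sum((x - x_mean) * (y - y_mean)) / sum((x - x_mean)^2)
        b = y_mean - a * x_mean
    """
    n = len(xs)
    x_mean = sum(xs) / n
    y_mean = sum(ys) / n
    a = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
         / sum((x - x_mean) ** 2 for x in xs))
    b = y_mean - a * x_mean
    return a, b

# Points lying exactly on y = 3x - 2: the fit recovers a = 3, b = -2
a, b = fit_line([1, 2, 3, 4], [1, 4, 7, 10])
```

Unlike gradient descent, this solution is exact in one pass; it is the formula you get after finishing the derivations in the pictures.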

Linear Regression ( Fun with Mathematics )

In this machine learning series, I plan to teach machine learning in a fun as well as mathematical way. This is the starting point of any machine learning engineer's career. Linear regression: y = a*x + b. Everyone knows linear regression, a regression technique for fitting the best line to a given data set. In this article we will derive the linear regression formula with the help of our beloved calculus. So, what is the point of doing linear regression? Obviously, to predict. We fit the best line to our data set. The equation looks like this: y = a * x + b. Here "y" is the dependent variable, or whatever we are going to predict; "x" is the independent variable, or predictor; and "a" and "b" are the slope and y-intercept of the line. In general, the values of a and b determine how well the line fits the data. So our main concern is to find the optimal values of a an...
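"How well the line fits" has a concrete measure: the sum of squared errors (SSE), the quantity the calculus derivation will minimize. Here is a tiny, hypothetical sketch of my own showing that the right choice of a and b drives SSE to zero while a wrong line does not.

```python
def sse(a, b, xs, ys):
    """Sum of squared errors of the line y = a*x + b on the data."""
    return sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))

# Toy data lying exactly on y = 2x + 1
xs, ys = [0, 1, 2], [1, 3, 5]

perfect = sse(2, 1, xs, ys)  # the true line: errors 0, 0, 0 -> SSE = 0
off = sse(1, 1, xs, ys)      # a wrong slope: errors 0, 1, 2 -> SSE = 5
```

Finding the a and b that minimize this sum, by differentiating it, is exactly what the rest of the derivation does.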