Posts

einsum is all you need.

To get more context on what einsum notation is, follow the blogs mentioned at the end of the article. Einsum notation simplifies dot products, outer products, Hadamard products, matrix-matrix multiplication, matrix-vector multiplication, and more. It's hard to keep track of the shapes every time, and we often get stuck on matching up matrix dimensions; einsum helps mitigate that. In this article, we will see how we can use einsum notation to build deep learning models. Einsum is implemented as numpy.einsum in numpy, torch.einsum in pytorch, and tf.einsum in tensorflow. A typical call to einsum would look like: result = einsum("□□,□□□,□□->□□", arg1, arg2, arg3), where each □ is a placeholder for a character that specifies a dimension, and arg1, arg2, arg3 are the actual arguments. After the -> we specify the output shape we want; the internal working is handled by einsum. Let's look at some basic exam...
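As a minimal sketch of the operations listed above (written with numpy here, though torch.einsum and tf.einsum accept the same subscript strings):

import numpy as np

a = np.random.rand(3)
b = np.random.rand(3)
c = np.random.rand(4)
M = np.random.rand(3, 4)
N = np.random.rand(4, 5)

dot = np.einsum("i,i->", a, b)         # dot product, a scalar
outer = np.einsum("i,j->ij", a, b)     # outer product, shape (3, 3)
hadamard = np.einsum("i,i->i", a, b)   # element-wise (Hadamard) product, shape (3,)
matvec = np.einsum("ij,j->i", M, c)    # matrix-vector product, shape (3,)
matmat = np.einsum("ij,jk->ik", M, N)  # matrix-matrix product, shape (3, 5)

Repeated letters on the left are summed over; the letters after -> pick the output axes, so the shape bookkeeping is explicit in the subscript string itself.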

Vectors and Projection: Geometrically!!!

The simplest example of a vector is x = [1, 2]. It is a 2-element, or 2-dimensional, vector. The two elements can be taken as x and y; they are exactly the Cartesian coordinates in a 2-d space, so the vector corresponds to a point in that space. A vector with n elements represents a point in n-dimensional space. Let's visualize vectors by plotting them; we will use python's matplotlib. The following pictures show a vector in a 2-d space. We can plot as many 2-element vectors in that space as we like, and every linear combination of them falls in the same space; that is what we simply call a vector space. Now, projection of a vector onto another vector: in the following picture, what would be the projection of the red vector onto the blue vector? Visualize it in your mind first. First, let's recall the formula for projection: the upper part is just the dot product and the lower part is the magnitude ...
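A minimal sketch of that formula, proj_b(a) = (a·b / b·b) b, in numpy (the vectors here are made up for illustration):

import numpy as np

a = np.array([2.0, 3.0])   # the "red" vector
b = np.array([4.0, 0.0])   # the "blue" vector

# projection of a onto b: scale b by (a . b) / (b . b)
proj = (np.dot(a, b) / np.dot(b, b)) * b
print(proj)  # [2. 0.] -- the shadow of a on the x-axis

Since b points along the x-axis here, the projection simply keeps a's x-component, which matches the geometric picture.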

python super() in two mins.

Let's start with a simple class Contact that tracks the names and e-mail addresses of several people. The Contact class is responsible for maintaining a list of all contacts in a class variable, and for initializing the name and email of each individual contact.

class Contact:
    all_contacts = []

    def __init__(self, name, email):
        self.name = name
        self.email = email
        Contact.all_contacts.append(self)

Now, we want to change the behavior of this class. Our Contact class only lets us store a name and an email address, but what if we also want to add a phone number? Well, one way is to create a subclass that inherits from the superclass and overrides its method; the newly written subclass method is then automatically called instead of the superclass's. For example:

class Friend(Contact):
    def __init__(self, name, email, phone):
        self.name = name
        self.phone = phone
        self.email = email

Any method can be o...
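Note that this version of Friend duplicates the parent's setup and silently skips the all_contacts bookkeeping. A minimal sketch of the fix the post's title promises, delegating the shared initialization back to Contact with super() (a reasonable guess at where the truncated excerpt is heading):

class Friend(Contact):
    def __init__(self, name, email, phone):
        # let Contact handle name, email, and the all_contacts append
        super().__init__(name, email)
        self.phone = phone

f = Friend("Alice", "alice@example.com", "555-0100")
print(len(Contact.all_contacts))  # 1 -- the Friend instance was registered too

With super(), changes to Contact's __init__ automatically flow into every subclass instead of having to be copied by hand.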

PCA Constrained Optimization using Lagrange Multipliers

Prerequisites: 1) knowledge of the covariance matrix, 2) what eigenvectors and eigenvalues are and their uses, 3) PCA. Constrained optimization of PCA using Lagrange multipliers: below in picture 1, we have the PCA formulation, which is maximizing the variance. Our task is to find the u that maximizes the variance, subject to the constraint u'u = 1, which says u must be a unit vector. Before getting further into optimizing PCA, let's first learn the general case of a constrained optimization problem. General constrained optimization problem: for any such problem, given a constraint, find the maximizer x* of a function f(x) ...
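Setting the gradient of the Lagrangian L(u, λ) = u'Σu − λ(u'u − 1) to zero gives Σu = λu, so the optimal u is an eigenvector of the covariance matrix and the variance it attains is the eigenvalue. A minimal numpy sketch of that conclusion, on made-up data (the post's own pictures carry the derivation):

import numpy as np

# toy data: 200 samples, 3 features, deliberately stretched along one direction
X = np.random.randn(200, 3) @ np.array([[3.0, 0.0, 0.0],
                                         [1.0, 1.0, 0.0],
                                         [0.0, 0.0, 0.2]])
X = X - X.mean(axis=0)               # center the data
cov = np.cov(X, rowvar=False)        # covariance matrix Sigma

# eigendecomposition: Sigma u = lambda u (eigh returns eigenvalues in ascending order)
eigvals, eigvecs = np.linalg.eigh(cov)
u = eigvecs[:, -1]                   # eigenvector of the largest eigenvalue

# u is a unit vector, and u' Sigma u equals the top eigenvalue
print(u @ cov @ u, eigvals[-1])      # the two numbers agree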

Gradient Descent - The Deep Learning Entry Point

Let's bring all our attention here: gradient descent is one of the most popular and most used optimization techniques in machine learning and deep learning. Gradient descent's job is to optimize a given equation. So what might that equation be? It might be any equation: a linear equation (y = mx + b), a multivariate linear equation, a polynomial equation, any equation you can think of. So how does it optimize? Before answering this question, let's recall linear regression: how did we optimize our linear regression model? How did we find the values of m and b that best fit the data? You remember, yes, we used the least squares method (minimizing the sum of squares), and for that we had derived a formula. In this article, I will explain gradient descent on top of linear regression, as it is easy to teach and understand. Gradient descent's job is also to give us the best values of m and b, as our previous linear regression model did, but using...
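A minimal pure-Python sketch of the idea, assuming mean-squared-error loss and a fixed learning rate (the post's own update rule may be written differently):

# gradient descent on y = m*x + b: repeatedly step m and b against the
# gradient of the mean squared error
def gradient_descent(xs, ys, lr=0.01, steps=5000):
    m, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        grad_m = (-2 / n) * sum(x * (y - (m * x + b)) for x, y in zip(xs, ys))
        grad_b = (-2 / n) * sum(y - (m * x + b) for x, y in zip(xs, ys))
        m -= lr * grad_m
        b -= lr * grad_b
    return m, b

xs = [1, 2, 3, 4]
ys = [3, 5, 7, 9]                 # generated by y = 2x + 1
print(gradient_descent(xs, ys))   # approaches (2.0, 1.0)

Instead of solving the least-squares formula in one shot, each iteration nudges m and b downhill, which is exactly the behavior that scales up to deep learning models.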

Multiple Linear Regression From Scratch + Implementation in Python

Before moving further, if you are not familiar with single-variate linear regression, please read my previous two posts and get familiar with it. Today we are going to learn multivariate linear regression. Let's take a real-world example: you want to predict a person's risk of diabetes. What are the things you would consider? Maybe age, gender, heredity, lifestyle, drinking habits, etc.; more than one variable is responsible for a correct prediction. This is what we call multiple linear regression, where more than one variable affects the model. Let's dive directly into the mathematics behind it. Below in pic 1, there is a simple uni-variate linear regression, which we have already solved, and now we have the equation for multiple linear regression, where the parameters range from beta(1) to beta(n), and we have to find the values of these parameters by solving the equation of multiple linear regression. In pic 1, we have now represented our multiple linear regression equation in t...
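The truncated sentence appears to be heading toward the matrix form of the equation. A minimal numpy sketch of that formulation, assuming the usual closed-form normal equation beta = (X'X)^(-1) X'y (the post's own derivation may take a different route):

import numpy as np

# toy design matrix: 5 samples, a column of ones for the intercept plus 2 features
X = np.array([[1, 2.0, 3.0],
              [1, 1.0, 4.0],
              [1, 3.0, 1.0],
              [1, 5.0, 2.0],
              [1, 4.0, 5.0]])
y = X @ np.array([0.5, 2.0, -1.0])   # targets generated from known betas, no noise

# normal equation: solve (X'X) beta = X'y
beta = np.linalg.solve(X.T @ X, X.T @ y)
print(beta)  # recovers [0.5, 2.0, -1.0]

Stacking every feature into one matrix X is what lets a single formula handle beta(1) through beta(n) at once, no matter how many variables the model has.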

Linear Regression From Scratch

Up to now you can already solve a linear regression mathematically, as we discussed in earlier lessons. Today we are going to build a linear regression model in Python from scratch, which means without the use of any library. For that we first need the mathematical equations for our linear regression model; I strongly, very strongly suggest you work through each of these derivations on your own, taking your time. Here I have solved the linear regression model and simplified it so that it is easy to code in Python. Below in picture 1, we derive an equation (equation 1) by taking the partial derivative of the SSE (our cost function) with respect to "a". Below in picture 2, we derive another equation (equation 2) by taking the partial derivative of the cost function with respect to "b", using a little trick to simplify it into a simpler form. In pic 3 and pic 4, now from equation 1 and equation...
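The pictures are not reproduced here, but the standard closed-form solution such derivations lead to, for the model y = a + b·x, is b = (nΣxy − ΣxΣy) / (nΣx² − (Σx)²) and a = ȳ − b·x̄. A minimal pure-Python sketch of coding that up, keeping with the post's no-library rule (the post's own parameterization of a and b may differ):

# least-squares fit of y = a + b*x from scratch, using the closed-form
# equations obtained by setting the partial derivatives of SSE to zero
def fit_line(xs, ys):
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n          # a = mean(y) - b * mean(x)
    return a, b

xs = [1, 2, 3, 4, 5]
ys = [2.1, 4.0, 6.2, 7.9, 10.1]    # roughly y = 2x
print(fit_line(xs, ys))            # close to (0.0, 2.0)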