Showing posts from May, 2021

einsum is all you need.

To get more context on what einsum notation is, follow the blogs mentioned at the end of the article. Einsum notation simplifies dot products, outer products, Hadamard products, matrix-matrix multiplication, matrix-vector multiplication, and more. It is difficult to remember tensor shapes every time; we often get stuck on matrix shapes or run into shape mismatches. Einsum helps mitigate that. In this article, we will see how to use einsum notation to build deep learning models.

Einsum is implemented as numpy.einsum in NumPy, torch.einsum in PyTorch, and tf.einsum in TensorFlow. A typical call to einsum looks like: result = einsum("□□,□□□,□□->□□", arg1, arg2, arg3), where each □ is a placeholder for a character that specifies a dimension, and arg1, arg2, arg3 are the actual arguments. After -> we specify the output shape we want. The internal working is handled by einsum. Let's look at some basic exam...
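As a quick illustration, here is a minimal NumPy sketch (the small arrays are made up for this example) showing how the operations listed above map to einsum strings:

```python
import numpy as np

a = np.arange(6).reshape(2, 3)      # shape (2, 3)
b = np.arange(12).reshape(3, 4)     # shape (3, 4)
v = np.array([1.0, 2.0, 3.0])       # shape (3,)

# Matrix-matrix multiplication: sum over the shared index j
mm = np.einsum("ij,jk->ik", a, b)
assert np.array_equal(mm, a @ b)

# Matrix-vector multiplication: sum over j, keep i
mv = np.einsum("ij,j->i", a, v)
assert np.allclose(mv, a @ v)

# Dot product: sum over i, scalar output (nothing after ->)
dot = np.einsum("i,i->", v, v)
assert np.isclose(dot, v @ v)

# Hadamard (element-wise) product: repeat indices on the output side
had = np.einsum("ij,ij->ij", a, a)
assert np.array_equal(had, a * a)

# Outer product: no shared index, so nothing is summed
outer = np.einsum("i,j->ij", v, v)
assert np.array_equal(outer, np.outer(v, v))
```

In each string, an index that appears on the left but not after -> is summed over; an index that appears after -> is kept in the output.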