I’m glad you are here. This page is a backup of all the blog posts I have written (not all of my past posts are here yet; it is a work in progress). I write about Machine Learning, Open Source, and Computer Science in general.
SEE-2-SOUND🔉: Zero-Shot Spatial Environment-to-Spatial Sound
In this blog post, I introduce our recent work, SEE-2-SOUND. I try to give an intuitive explanation of the motivation behind our work and how I ended up thinking of this idea, explain how the method works, discuss some subtle aspects of our implementation, and close with some future prospects.
An Intuitive Look at the Dynamics of SGD
This blog post is an intuitive walk through the dynamics of Stochastic Gradient Descent (SGD). I try to explain the reasoning behind some of SGD's interesting aspects, such as stochastic gradient noise (SGN), the SGN covariance matrix, and SGD's preference for flat minima. Some of these ideas took me a while to truly understand, and I hope this post makes them easier for others.
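For reference, the decomposition at the heart of these discussions can be written as follows (a sketch in my own notation; the post defines everything from scratch):

```latex
% Minibatch SGD update with the stochastic gradient noise (SGN) made
% explicit. \eta is the learning rate, L the full-batch loss, L_B the
% minibatch loss; the notation here is mine, not necessarily the post's.
\theta_{t+1}
  \;=\; \theta_t - \eta\,\nabla L_B(\theta_t)
  \;=\; \theta_t - \eta\bigl(\nabla L(\theta_t) + \xi_t\bigr),
\qquad
\xi_t := \nabla L_B(\theta_t) - \nabla L(\theta_t),
% so that E[\xi_t] = 0, and Cov(\xi_t) = C(\theta_t) is the SGN
% covariance matrix whose structure the post explores.
```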
The Ferromagnetic Potts Model
The ferromagnetic Potts model is a canonical example of a Markov random field from statistical physics that is of great probabilistic and algorithmic interest. It is a distribution over all q-colorings of the vertices of a graph in which monochromatic edges are favored. The algorithmic problem of efficiently sampling approximately from this model is known to be #BIS-hard and has seen a lot of recent interest. This blog post outlines some recently developed algorithms for approximately sampling from the ferromagnetic Potts model on d-regular weakly expanding graphs.
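Concretely, the distribution in question can be written as follows (a standard formulation; the inverse-temperature notation β is mine):

```latex
% q-state ferromagnetic Potts model on a graph G = (V, E):
% colorings \sigma : V \to [q]; \beta > 0 rewards monochromatic edges
% (the ferromagnetic regime).
\mu_{G,\beta}(\sigma) \;=\; \frac{e^{\beta\, m_G(\sigma)}}{Z_G(\beta)},
\qquad
m_G(\sigma) \;=\; \bigl|\{\,\{u,v\} \in E : \sigma(u) = \sigma(v)\,\}\bigr|,
\qquad
Z_G(\beta) \;=\; \sum_{\sigma \in [q]^V} e^{\beta\, m_G(\sigma)}.
```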
Compression techniques in Statistical Learning
Characterizing the sample complexity of different machine learning tasks is one of the central questions in statistical learning theory. This blog post reviews the less conventional approach of using compression schemes to prove sample complexity upper bounds, with specific applications to learning under adversarial perturbations and learning Gaussian mixture models.
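As a quick primer, here is the standard Littlestone–Warmuth-style definition the approach builds on (my notation; the post develops this carefully):

```latex
% A sample compression scheme of size k for a class H consists of a
% compressor \kappa and a reconstructor \rho,
\kappa : (\mathcal{X} \times \mathcal{Y})^m \to (\mathcal{X} \times \mathcal{Y})^{\le k},
\qquad
\rho : (\mathcal{X} \times \mathcal{Y})^{\le k} \to \mathcal{Y}^{\mathcal{X}},
% such that for every sample S realizable by H, the reconstructed
% hypothesis \rho(\kappa(S)) labels every example in S correctly;
% a small size k then yields sample complexity upper bounds.
```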
Kubeflow Pipelines: Orchestrating Machine Learning Workflows - Part 3
Kubeflow Pipelines is a great way to build and deploy end-to-end, scalable, and portable Machine Learning workloads. In this article, we take a look at how to use Kubeflow Pipelines for your own tasks and how it works under the hood.
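If you have never seen Kubeflow Pipelines code, here is a minimal sketch of what a pipeline definition looks like with the kfp Python SDK (v2-style API; the component and pipeline names are illustrative, and the article may target a different SDK version):

```python
from kfp import compiler, dsl


@dsl.component
def add(a: float, b: float) -> float:
    # A lightweight Python component; kfp packages it as a pipeline step.
    return a + b


@dsl.pipeline(name="tiny-demo-pipeline")
def tiny_pipeline(a: float = 1.0, b: float = 2.0):
    # Wire components together; kfp infers the DAG from data dependencies.
    add(a=a, b=b)


if __name__ == "__main__":
    # Compile to a YAML spec that a Kubeflow Pipelines cluster can run.
    compiler.Compiler().compile(tiny_pipeline, package_path="pipeline.yaml")
```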
How to Use GitHub Super Linter in Your Projects
When you're starting a new project, you might have to add multiple linting tools to clean up your code and prevent simple errors. In this article, I show you how to use GitHub Super Linter, a single tool that solves all of these problems. Most of my personal projects use GitHub Super Linter as well, and I have found it to be a huge lifesaver.
Kubeflow Notebooks: ML Experimentation Made Easier - Part 2
Machine learning is a very iterative process that involves a ton of experimentation. In this article, I introduce Kubeflow Notebooks, a way to run development environments inside your Kubernetes cluster, show how you can extend its default capabilities for your own use cases, and explain how it works under the hood.
Kubeflow: Machine Learning on Kubernetes - Part 1
Developing and deploying machine learning systems can be a pain, with many moving parts to manage. In this article, I introduce Kubeflow and help you get started with it while also explaining how it works. This is the first article in the Kubeflow series, and in it I try to help you answer the question: why and when Kubeflow?
My first in-person KubeCon + CloudNativeCon
This year I got a chance to attend my first in-person KubeCon + CloudNativeCon in Valencia, Spain, under the generous Dan Kohn scholarship from the CNCF and the Linux Foundation. In this article, I share some glimpses of my experience at KubeCon + CloudNativeCon Europe, paired with some of my learnings from the conference that might help you too!
What Are Graph Neural Networks? How GNNs Work, Explained with Examples
In this article, I help you get started with graph neural networks and understand how they work, while also trying to address the question of why at each stage. Finally, we take a look at implementing some of the methods discussed in this article in code.
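To give a flavor of the code the article builds toward, here is a minimal NumPy sketch of a single GCN-style message-passing layer (my own simplified illustration, not code from the article):

```python
import numpy as np

def gcn_layer(adjacency, features, weights):
    """One GCN-style message-passing step: H' = ReLU(D^-1/2 (A+I) D^-1/2 H W)."""
    a_hat = adjacency + np.eye(adjacency.shape[0])            # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))    # D^{-1/2}
    h = d_inv_sqrt @ a_hat @ d_inv_sqrt @ features @ weights  # aggregate + transform
    return np.maximum(0.0, h)                                 # ReLU

# Toy usage: a 3-node path graph with 4-dimensional node features.
rng = np.random.default_rng(0)
A = np.array([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])
H = rng.normal(size=(3, 4))
W = rng.normal(size=(4, 2))
print(gcn_layer(A, H, W).shape)  # (3, 2)
```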
The Conductor of Two Naturals is the largest number which cannot be written as mb+nc
This article presents a short but non-obvious and interesting theorem in Number Theory that I originally discovered while working on a problem.
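Stated with my assumptions made explicit (b and c coprime naturals, with m and n ranging over the non-negative integers), the claim in the title looks like this:

```latex
% The largest integer NOT expressible as mb + nc, for coprime b and c
% with m, n >= 0 (elsewhere often called the Frobenius number), is
g(b, c) \;=\; bc - b - c.
% Worked example (b = 3, c = 5): g(3, 5) = 15 - 3 - 5 = 7, and indeed
% 1, 2, 4, 7 are exactly the naturals not of the form 3m + 5n.
```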
How to Start an Open Source Project on GitHub – Tips from Building My Trending Repo
Quite recently, I ended up on the coveted GitHub Trending page: I was the #2 trending developer on all of GitHub, and for Python as well, which was a pleasant surprise. In this article, I share some tips that you should be able to apply to all kinds of projects.
Self-supervised contrastive learning with NNCLR
Implementation of NNCLR (Nearest-Neighbor Contrastive Learning of Visual Representations), a self-supervised learning method for computer vision.
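The key twist in NNCLR is a nearest-neighbor lookup: one view's embedding is swapped for its nearest neighbor in a support set of past embeddings before the contrastive loss is computed. Here is a minimal NumPy sketch of that lookup (my illustration, not the example's code):

```python
import numpy as np

def nearest_neighbor(embedding, support_set):
    # Return the support-set embedding with the highest cosine similarity
    # to the query; NNCLR uses this neighbor in place of the query
    # embedding when forming positive pairs for the contrastive loss.
    sims = support_set @ embedding / (
        np.linalg.norm(support_set, axis=1) * np.linalg.norm(embedding)
    )
    return support_set[np.argmax(sims)]
```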
Image classification with Swin Transformers
Image classification using Swin Transformers, a general-purpose backbone for computer vision.
How to Monitor Machine Learning Projects on Your Mobile Device📱
What if you could monitor your Colab, Kaggle, or AzureML Machine Learning projects on your mobile phone? You'd be able to check in on your models on the fly, even while taking a walk🚶.
Gradient Centralization for Better Training Performance
Implement Gradient Centralization to improve the training performance of deep neural networks (DNNs).
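The core operation is a one-liner on each gradient tensor. Here is a minimal NumPy sketch (my own illustration of the idea, following the common convention of centralizing over all axes except the last; it is not the exact code from the example):

```python
import numpy as np

def centralize_gradient(grad):
    """Gradient Centralization: subtract the mean from multi-dimensional
    weight gradients (computed over all axes except the last); 1-D bias
    gradients are left unchanged."""
    if grad.ndim > 1:
        axes = tuple(range(grad.ndim - 1))
        grad = grad - grad.mean(axis=axes, keepdims=True)
    return grad

# Toy usage: a dense-layer weight gradient of shape (in_dim, out_dim).
g = np.random.default_rng(0).normal(size=(8, 4))
print(np.abs(centralize_gradient(g).mean(axis=0)).max())  # ~0 per output unit
```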
Skewness and Kurtosis – Positively Skewed and Negatively Skewed Distributions in Statistics Explained
In this article, I'll explain two important concepts in statistics: skewness and kurtosis. And don't worry: you won't need to know much math to understand these concepts and learn how to apply them.
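If you just want to compute the two statistics, SciPy makes each a one-liner (a minimal sketch on a synthetic right-skewed sample; the article itself focuses on intuition rather than code):

```python
import numpy as np
from scipy.stats import skew, kurtosis

# A right-skewed (positively skewed) synthetic sample.
rng = np.random.default_rng(0)
data = rng.exponential(scale=2.0, size=10_000)

print(skew(data))      # positive => positively skewed
print(kurtosis(data))  # Fisher (excess) kurtosis; 0 for a normal distribution
```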
Plant AI: Student Ambassador Green-A-Thon activity report
In this article, we introduce our project “Plant AI”, which won the Microsoft Green Hackathon, and walk you through our motivation for building it, how it could be helpful to the community, the process of building it, and finally our future plans for it.
How to Build Better Machine Learning Models
If you have built Deep Neural Networks before, you might know that doing so can involve a lot of experimentation. In this article, I share some useful tips and guidelines that you can use to build better deep learning models. These tricks should make it a lot easier for you to develop a good network.
The Art of Hyperparameter Tuning in Deep Neural Nets by Example
If you have built Deep Neural Networks before, you might know that doing so can involve setting a lot of different hyperparameters. In this article, I share some tips and guidelines you can use to better organize your hyperparameter tuning process, which should make it a lot more efficient for you to find a good setting for the hyperparameters.
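One concrete example of the kind of tip covered: when randomly searching, sample scale-sensitive hyperparameters such as the learning rate on a log scale rather than uniformly (a minimal sketch of this standard trick, not code from the article):

```python
import numpy as np

rng = np.random.default_rng(0)
for trial in range(5):
    # Sample the exponent uniformly so 1e-5..1e-1 are covered evenly.
    lr = 10 ** rng.uniform(-5, -1)
    batch_size = int(2 ** rng.integers(4, 9))  # 16, 32, ..., 256
    print(f"trial {trial}: lr={lr:.2e}, batch_size={batch_size}")
```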
Machine Learning with Android 11: What’s new?
This blog post demonstrates how you can get started with on-device ML using tools and plugins launched specifically with Android 11. If you have worked with ML on Android before, you will discover easier ways to integrate ML into your Android apps. If you have not, this could be a starting point for you to start supercharging your Android apps with Machine Learning.