



A Collection of 3D Reconstruction Datasets and Trained Splats
A collection of 3D reconstruction scenes and trained splats.
An Intuitive Look at the Dynamics of SGD
This blog post is an intuitive look through the chronicles of Stochastic Gradient Descent (SGD). I try to explain the reasoning behind some of the interesting aspects of SGD, like the stochastic gradient noise, the SGN covariance matrix, and SGD's preference for flat minima. Some ideas have particularly taken a...
The Ferromagnetic Potts Model
The ferromagnetic Potts model is a canonical example of a Markov random field from statistical physics that is of great probabilistic and algorithmic interest. It is a distribution over all q-colorings of the vertices of a graph in which monochromatic edges are favored. The algorithmic problem of efficiently sampling approximately from...
Compression techniques in Statistical Learning
Characterizing the sample complexity of different machine learning tasks is one of the central questions in statistical learning theory. This blog reviews the less conventional approach of using compression schemes for proving sample complexity upper bounds, with specific applications in learning under adversarial perturbations and learning Gaussian mixture models.
The Conductor of Two Naturals is the largest number which cannot be written as mb+nc
This article presents a short yet non-obvious and interesting theorem in number theory that I originally discovered while working on a problem.
Self-supervised contrastive learning with NNCLR
Implementation of NNCLR, a self-supervised learning method for computer vision.
Image classification with Swin Transformers
Image classification using Swin Transformers, a general-purpose backbone for computer vision.
Gradient Centralization for Better Training Performance
Implement Gradient Centralization to improve training performance of DNNs.