[Photos: random captures.]

Welcome to my blog. I'm broadly interested in learning algorithms: vision, graphics, and machine learning.

An Intuitive Look at the Dynamics of SGD

This post takes an intuitive look at the dynamics of Stochastic Gradient Descent (SGD). I try to explain the reasoning behind some of its more interesting aspects, such as stochastic gradient noise (SGN), the SGN covariance matrix, and SGD's preference for flat minima. Some ideas have particularly taken a...
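
To pin down the objects the excerpt refers to (a minimal sketch; the symbols η, B, L, and ξ are my own notation, not necessarily the post's): the SGD update and the stochastic gradient noise can be written as

\theta_{t+1} = \theta_t - \eta \, \nabla L_B(\theta_t), \qquad \xi_t = \nabla L_B(\theta_t) - \nabla L(\theta_t),

where L is the full-batch loss, L_B the loss on a uniformly sampled mini-batch B, and η the step size. The noise ξ_t has mean zero, and its covariance matrix is the SGN covariance mentioned above.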

The Ferromagnetic Potts Model

The ferromagnetic Potts model is a canonical example of a Markov random field from statistical physics that is of great probabilistic and algorithmic interest. It is a distribution over all q-colorings of the vertices of a graph in which monochromatic edges are favored. The algorithmic problem of efficiently sampling approximately from...
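
For orientation on the object being sampled (standard notation, not taken from the excerpt): on a graph G = (V, E) with q colors and inverse temperature β > 0, the ferromagnetic Potts model is the Gibbs distribution

\pi(\sigma) \;\propto\; \exp\big(\beta \, m(\sigma)\big), \qquad \sigma \in [q]^V,

where m(σ) counts the monochromatic edges of the coloring σ; larger β places more weight on configurations with many monochromatic edges.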

Compression Techniques in Statistical Learning

Characterizing the sample complexity of different machine learning tasks is one of the central questions in statistical learning theory. This post reviews the less conventional approach of using compression schemes to prove sample complexity upper bounds, with specific applications to learning under adversarial perturbations and learning Gaussian mixture models.
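
As a rough reminder of the classical Littlestone and Warmuth style guarantee behind this approach (stated loosely, up to constants; k, m, δ are my notation): if every hypothesis the learner outputs is consistent with the data and can be reconstructed from a subsample of size k, then with probability at least 1 - δ over m i.i.d. examples the reconstructed hypothesis h satisfies

\mathrm{err}(h) \;\lesssim\; \frac{k \log m + \log(1/\delta)}{m - k}.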
