[Photo gallery: Random captures.]

Welcome to my blog. I'm interested broadly in learning algorithms: vision, graphics, and machine learning.

Why Does SGD Love Flat Minima?

This article looks back through the chronicles of Stochastic Gradient Descent (SGD) and examines why stochastic gradient noise is a key reason SGD works so well.
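
For a concrete picture of the noise in question, here is a minimal NumPy sketch (toy least-squares problem; the setup and constants are mine, purely for illustration) contrasting the exact gradient with its minibatch estimate:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy least-squares problem: data (X, y), loss = mean squared error.
n, d = 1000, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

def full_gradient(w):
    # Exact gradient of the mean squared error over all n examples.
    return 2.0 / n * X.T @ (X @ w - y)

def sgd_gradient(w, batch_size=32):
    # Unbiased minibatch estimate: equals full_gradient(w) in
    # expectation, but carries sampling noise whose size and shape
    # depend on the current point w.
    idx = rng.choice(n, size=batch_size, replace=False)
    Xb, yb = X[idx], y[idx]
    return 2.0 / batch_size * Xb.T @ (Xb @ w - yb)

w = np.zeros(d)
lr = 0.05
for step in range(500):
    w -= lr * sgd_gradient(w)  # the gradient noise enters here

print("distance to w_true:", np.linalg.norm(w - w_true))
```

The minibatch gradient matches the full gradient on average; the article is about what the fluctuations around that average do to the minima SGD ends up in.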

#BIS-Hard but Not Impossible: Ferromagnetic Potts Model on Expanders

How do you efficiently sample from a distribution that's algorithmically #BIS-hard? The ferromagnetic Potts model is a canonical Markov random field where monochromatic edges win the popularity contest. This article is about how polymer methods and extremal graph theory crack the sampling puzzle on d-regular weakly expanding graphs.
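
For concreteness, the target distribution is the Gibbs measure of the q-state ferromagnetic Potts model on a graph G = (V, E) at inverse temperature β > 0 (standard notation, not taken from the article):

```latex
\mu_{G,\beta}(\sigma) \;=\; \frac{e^{\beta\, m(\sigma)}}{Z_G(\beta)},
\qquad
m(\sigma) \;=\; \bigl|\{\,\{u,v\} \in E : \sigma_u = \sigma_v\,\}\bigr|,
\qquad
\sigma : V \to \{1, \dots, q\},
```

where m(σ) counts the monochromatic edges that win the popularity contest and Z_G(β) is the partition function that normalizes the weights.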

Compression Unlocks Statistical Learning Secrets

Characterizing the sample complexity of different machine learning tasks is an important question in learning theory. This article reviews the less conventional approach of using compression schemes for proving sample complexity upper bounds, with specific applications in learning under adversarial perturbations and learning Gaussian mixture models.
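
As a toy illustration of the compression idea (not the article's construction), here is a size-one sample compression scheme for one-dimensional threshold classifiers of the form h_t(x) = 1[x ≥ t]: keep only the smallest positively labeled point.

```python
from typing import List, Optional, Tuple

def compress(sample: List[Tuple[float, int]]) -> Optional[float]:
    """Keep only the smallest positively labeled point (or None)."""
    positives = [x for x, label in sample if label == 1]
    return min(positives) if positives else None

def reconstruct(kept: Optional[float]):
    """Rebuild a threshold classifier from the compressed sample."""
    if kept is None:
        return lambda x: 0           # no positives seen: all-negative rule
    return lambda x: int(x >= kept)  # threshold at the kept point

# Any sample consistent with some threshold is labeled correctly by
# the reconstructed hypothesis -- the defining compression property.
sample = [(-2.0, 0), (0.5, 1), (3.0, 1), (-0.1, 0)]
h = reconstruct(compress(sample))
assert all(h(x) == label for x, label in sample)
```

The point of such schemes is that the size of the compressed set (here, one point) yields a sample complexity upper bound for the class being learned.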

Subscribe to RSS