Random captures.

Welcome to my blog. I'm interested broadly in learning algorithms: vision, graphics, and machine learning.

What is Induction?

I try to understand how finite evidence supports claims about unobserved cases, working through confirmation puzzles, probability spaces, Bayesian updating, chance, statistics, and learning in the limit. Not to be confused with mathematical induction.

Geometry of Motion

A geometric lens on simulation: cotangent bundles, Hamiltonian flows, and how geodesics, magnetism, and relativity emerge in one coherent framework.

Simulating Stuff

An illustrated primer on physics-based simulation for graphics: mass-spring systems as a unifying idea, time integration (explicit Euler, Runge-Kutta, backward and symplectic Euler), and constraints.

Why Does SGD Love Flat Minima?

This article looks through the chronicles of Stochastic Gradient Descent (SGD) and examines why stochastic gradient noise is responsible for SGD working so well.

#BIS-Hard but Not Impossible: Ferromagnetic Potts Model on Expanders

How do you efficiently sample from a distribution that's algorithmically #BIS-hard? The ferromagnetic Potts model is a canonical Markov random field where monochromatic edges win the popularity contest. This article is about how polymer methods and extremal graph theory crack the sampling puzzle on d-regular weakly expanding graphs.

Compression Unlocks Statistical Learning Secrets

Characterizing the sample complexity of different machine learning tasks is an important question in learning theory. This article reviews the less conventional approach of using compression schemes for proving sample complexity upper bounds, with specific applications in learning under adversarial perturbations and learning Gaussian mixture models.

Subscribe to RSS