Description: Gradient descent Hamiltonian Monte Carlo comparison.gif
English: In Bayesian statistics, two classes of technique are commonly used to reconstruct unobserved parameters based on observed data. This plot shows the application of each to a two-dimensional toy problem.
The blue triangle shows maximum a posteriori (MAP) estimation, in which an optimization algorithm such as gradient descent is used to find the set of parameters that maximizes the posterior probability density. Starting from an arbitrary initial guess, the triangle follows the gradient until it reaches the optimum, and that single set of parameters is accepted as the best answer.
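The MAP-by-gradient-descent idea can be sketched as follows. This is a minimal illustration, not the code used to generate the animation: the toy 2D Gaussian log-posterior (precision matrix entries `A`, `B`, `C`), the step size, and the starting point are all assumptions chosen so the two parameters are visibly coupled.

```python
# Toy 2D Gaussian log-posterior with coupled parameters (illustrative values).
# Precision matrix [[A, B], [B, C]] is positive definite, so the MAP is (0, 0).
A, B, C = 1.0, 0.8, 1.0

def grad_log_post(x, y):
    """Gradient of log p(x, y) = -0.5 * (A x^2 + 2 B x y + C y^2) + const."""
    return -(A * x + B * y), -(B * x + C * y)

def map_estimate(x, y, lr=0.1, steps=500):
    """Gradient ascent on the log-posterior (equivalently, descent on its negative)."""
    for _ in range(steps):
        gx, gy = grad_log_post(x, y)
        x, y = x + lr * gx, y + lr * gy
    return x, y

x_hat, y_hat = map_estimate(2.0, -1.5)  # arbitrary starting guess
```

The iterates trace a single deterministic path to the mode, which is why the animation shows one triangle settling at one point.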
The red circles show Hamiltonian Monte Carlo, in which a physics simulation is used to sample the posterior probability distribution. Starting from an arbitrary guess, the simulation stochastically travels to a variety of likely points, which are all accepted as plausible answers.
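The physics simulation in HMC alternates momentum resampling with simulated Hamiltonian dynamics, then accepts or rejects the endpoint. A minimal sketch on the same assumed toy posterior follows; the step size, trajectory length, and chain length are illustrative choices, not values from the figure.

```python
import math
import random

A, B, C = 1.0, 0.8, 1.0  # assumed precision matrix of the toy 2D Gaussian posterior

def log_post(q):
    x, y = q
    return -0.5 * (A * x * x + 2 * B * x * y + C * y * y)

def grad_log_post(q):
    x, y = q
    return [-(A * x + B * y), -(B * x + C * y)]

def hmc_step(q, eps=0.2, n_leapfrog=20):
    """One HMC transition: resample momentum, run leapfrog, accept/reject."""
    p = [random.gauss(0, 1) for _ in q]        # fresh Gaussian momentum
    q_new, p_new = list(q), list(p)
    g = grad_log_post(q_new)
    for _ in range(n_leapfrog):                # leapfrog integration of the dynamics
        p_new = [pi + 0.5 * eps * gi for pi, gi in zip(p_new, g)]
        q_new = [qi + eps * pi for qi, pi in zip(q_new, p_new)]
        g = grad_log_post(q_new)
        p_new = [pi + 0.5 * eps * gi for pi, gi in zip(p_new, g)]
    # Metropolis correction using the Hamiltonian (potential + kinetic energy)
    h_old = -log_post(q) + 0.5 * sum(pi * pi for pi in p)
    h_new = -log_post(q_new) + 0.5 * sum(pi * pi for pi in p_new)
    if random.random() < math.exp(min(0.0, h_old - h_new)):
        return q_new
    return q

random.seed(0)
q = [2.0, -1.5]                                # arbitrary starting guess
samples = []
for _ in range(2000):
    q = hmc_step(q)
    samples.append(q)
```

Unlike the single MAP point, the accepted positions form a cloud of plausible parameter values, matching the many red circles in the animation.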
The two axes of the plot represent two coupled parameters. The shading and contours represent the posterior probability distribution, where white is lower and green is higher.
The person who associated a work with this deed has dedicated the work to the public domain by waiving all of their rights to the work worldwide under copyright law, including all related and neighboring rights, to the extent allowed by law. You can copy, modify, distribute and perform the work, even for commercial purposes, all without asking permission.
http://creativecommons.org/publicdomain/zero/1.0/deed.en (CC0 Creative Commons Zero, Public Domain Dedication)
Captions
An animation comparing maximum a posteriori estimation with Hamiltonian Monte Carlo in two dimensions