Variational inference is a family of techniques for approximating the intractable integrals that arise in Bayesian inference and machine learning. It approximates the posterior densities of Bayesian models, serving as an alternative to Markov chain Monte Carlo that is typically faster and easier to scale to large data. The core idea is to restrict the approximate posterior to a tractable family of distributions and choose the member of that family that minimizes the Kullback-Leibler divergence to the true posterior. Because the log evidence is a constant with respect to the approximation, minimizing this divergence is equivalent to maximizing the evidence lower bound (ELBO). Mean-field variational inference assumes the approximating distribution factorizes into independent factors and optimizes each factor in turn by coordinate ascent. A Bayesian mixture of Gaussians is a standard worked example, sketched below.
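
To make the objective concrete: for latent variables z and data x, the identity log p(x) = ELBO(q) + KL(q(z) ‖ p(z | x)) holds, with ELBO(q) = E_q[log p(x, z)] − E_q[log q(z)]; since log p(x) does not depend on q, raising the ELBO lowers the KL divergence by the same amount. The following minimal sketch illustrates the coordinate-ascent updates for a deliberately simplified Bayesian mixture of Gaussians with unit observation variance and uniform mixture weights. The function name cavi_gmm, the prior variance, and the iteration count are illustrative assumptions, not part of any standard API.

```python
import numpy as np

def cavi_gmm(x, K, prior_var=10.0, n_iters=100, seed=0):
    """Coordinate-ascent variational inference for a toy Bayesian mixture
    of Gaussians (illustrative sketch, not a production implementation).

    Model:   mu_k ~ N(0, prior_var),  c_i ~ Uniform{1..K},
             x_i | c_i, mu ~ N(mu_{c_i}, 1).
    Family:  q(mu_k) = N(m_k, s2_k),  q(c_i) = Categorical(phi_i).
    """
    rng = np.random.default_rng(seed)
    n = x.shape[0]
    m = rng.normal(size=K)                    # variational means of q(mu_k)
    s2 = np.ones(K)                           # variational variances of q(mu_k)
    phi = rng.dirichlet(np.ones(K), size=n)   # responsibilities q(c_i = k)

    for _ in range(n_iters):
        # Update q(c_i): phi_{ik} proportional to exp(E[mu_k] x_i - E[mu_k^2]/2),
        # where E[mu_k^2] = s2_k + m_k^2 under the Gaussian factor.
        logits = np.outer(x, m) - 0.5 * (s2 + m**2)
        logits -= logits.max(axis=1, keepdims=True)   # numerical stability
        phi = np.exp(logits)
        phi /= phi.sum(axis=1, keepdims=True)

        # Update q(mu_k): precision-weighted combination of the prior and
        # the responsibility-weighted data, holding phi fixed.
        nk = phi.sum(axis=0)
        s2 = 1.0 / (1.0 / prior_var + nk)
        m = s2 * (phi * x[:, None]).sum(axis=0)

    return m, s2, phi

# Usage: three well-separated clusters; the fitted means should land
# near the true component means (up to label permutation).
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(mu, 1.0, 200) for mu in (-5.0, 0.0, 5.0)])
m, s2, phi = cavi_gmm(x, K=3)
print(np.sort(m))
```

Each update optimizes one factor of the mean-field approximation while holding the others fixed, so the ELBO is non-decreasing across iterations; like any coordinate-ascent scheme, the procedure converges only to a local optimum and is sensitive to initialization.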