* Auto-Encoding Variational Bayes (VAE) via “continuous latent space sampling for generative modeling”
* 🏆 2014: [Generative Adversarial Networks (GAN)](https://papers.nips.cc/paper/2014/hash/5ca3e9b122f61f8f06494c97b1afccf3-Abstract.html)
Unique: Generates samples by drawing from a simple, tractable prior distribution rather than learning a complex implicit distribution (as GANs do) or requiring rejection sampling. The prior is fixed (e.g., a standard Gaussian) and chosen for computational convenience, while the decoder learns to transform prior samples into realistic data. This provides a principled probabilistic framework for generation with explicit likelihood evaluation (via the evidence lower bound), unlike GANs, which lack a tractable likelihood.
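The sampling procedure above can be sketched in a few lines. This is a minimal NumPy illustration, not a trained model: the decoder weights below are random placeholders standing in for a learned network, and all shapes and names are illustrative assumptions.

```python
import numpy as np

# VAE-style generation: sample latents from a fixed, tractable prior
# p(z) = N(0, I), then push them through a decoder. No rejection
# sampling and no adversarial game are involved.

rng = np.random.default_rng(0)

latent_dim, hidden_dim, data_dim = 8, 32, 64

# Hypothetical trained decoder parameters (random here, for the sketch).
W1 = rng.normal(scale=0.1, size=(latent_dim, hidden_dim))
b1 = np.zeros(hidden_dim)
W2 = rng.normal(scale=0.1, size=(hidden_dim, data_dim))
b2 = np.zeros(data_dim)

def decode(z):
    """Map latent codes z to data-space means (e.g., Bernoulli pixel means)."""
    h = np.tanh(z @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output in (0, 1)

# Generation: draw a batch of latents from the standard Gaussian prior
# and decode each one into a data-space sample.
z = rng.standard_normal((16, latent_dim))
samples = decode(z)
print(samples.shape)  # (16, 64)
```

Because the prior is fixed, generation is just prior sampling plus a single decoder pass; in a real VAE the same decoder is trained jointly with an encoder by maximizing the evidence lower bound.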
vs others: Generation is more stable and interpretable than with GANs because the fixed, tractable prior enables likelihood-based evaluation and principled sampling. Interpolation is smoother than with autoregressive models because the latent space is continuous and low-dimensional, whereas autoregressive models generate step by step without an explicit latent structure.
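The interpolation property can be sketched directly: points along the line between two latent codes also lie in the (continuous) latent space and decode to intermediate samples. As above, the decoder weights are random placeholders assumed to stand in for a trained network.

```python
import numpy as np

# Latent-space interpolation sketch: decode evenly spaced convex
# combinations of two latent codes z0 and z1. With a trained VAE
# decoder, the intermediate frames morph smoothly between samples.

rng = np.random.default_rng(1)
latent_dim, data_dim = 8, 64

# Hypothetical decoder weights (random here, for the sketch).
W = rng.normal(scale=0.1, size=(latent_dim, data_dim))

def decode(z):
    """Single-layer placeholder decoder: latent -> data space."""
    return np.tanh(z @ W)

# Two endpoint latents drawn from the standard Gaussian prior.
z0 = rng.standard_normal(latent_dim)
z1 = rng.standard_normal(latent_dim)

# Interpolate in latent space, then decode each intermediate point.
alphas = np.linspace(0.0, 1.0, num=7)
frames = np.stack([decode((1 - a) * z0 + a * z1) for a in alphas])
print(frames.shape)  # (7, 64)
```

An autoregressive model has no analogous operation: there is no compact latent vector to interpolate, so blending two of its samples requires ad hoc tricks rather than a single line in latent space.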