  1. What does "variational" mean? - Cross Validated

    Apr 17, 2018 · Does the use of "variational" always refer to optimization via variational inference? Examples: "Variational auto-encoder" "Variational Bayesian methods" "Variational renormalization …

  2. deep learning - When should I use a variational autoencoder as …

    Jan 22, 2018 · I understand the basic structure of variational autoencoder and normal (deterministic) autoencoder and the math behind them, but when and why would I prefer one type of autoencoder to …

  3. bayesian - What are variational autoencoders and to what learning …

    Jan 6, 2018 · Even though variational autoencoders (VAEs) are easy to implement and train, explaining them is not simple at all, because they blend concepts from Deep Learning and Variational Bayes, …

  4. Understanding the Evidence Lower Bound (ELBO) - Cross Validated

    Jun 24, 2022 · I am reading this tutorial about Variational Inference, which includes the following depiction of ELBO as the lower bound on log-likelihood on the third page. In the tutorial, $x_i$ is the …

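    As a reminder of what the snippet above refers to (this is the standard identity, not taken from the linked tutorial): the log-evidence decomposes into the ELBO plus a KL term, and since the KL is non-negative, the ELBO is a lower bound on $\log p(x)$.

    ```latex
    \log p(x)
      = \underbrace{\mathbb{E}_{q(z)}\!\left[\log p(x, z) - \log q(z)\right]}_{\mathrm{ELBO}(q)}
      + \underbrace{\mathrm{KL}\!\left(q(z)\,\|\,p(z \mid x)\right)}_{\ge\, 0}
      \;\ge\; \mathrm{ELBO}(q)
    ```

    Maximizing the ELBO over $q$ therefore simultaneously tightens the bound and drives $q(z)$ toward the true posterior $p(z \mid x)$.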
  5. How to weight KLD loss vs reconstruction loss in variational auto …

    Mar 7, 2018 · How to weight KLD loss vs reconstruction loss in variational auto-encoder?

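    The weighting asked about above is usually written as a single scalar $\beta$ multiplying the KL term (as in the $\beta$-VAE objective). A minimal NumPy sketch, assuming a squared-error reconstruction term and a diagonal-Gaussian encoder (the function and variable names are illustrative, not from any of the linked answers):

    ```python
    import numpy as np

    def vae_loss(x, x_recon, mu, log_var, beta=1.0):
        """Per-sample VAE loss: reconstruction term plus beta-weighted KL.

        For a diagonal Gaussian q(z|x) = N(mu, diag(sigma^2)) against a
        standard-normal prior, the KL has the closed form
            KL = -0.5 * sum(1 + log_var - mu^2 - exp(log_var)).
        """
        recon = np.sum((x - x_recon) ** 2)  # squared-error reconstruction
        kl = -0.5 * np.sum(1.0 + log_var - mu**2 - np.exp(log_var))
        return recon + beta * kl

    # When mu = 0 and log_var = 0, q(z|x) equals the prior, so the KL term
    # vanishes and only the reconstruction error remains.
    x = np.array([1.0, 2.0])
    loss = vae_loss(x, x, np.zeros(2), np.zeros(2), beta=4.0)  # -> 0.0
    ```

    With `beta > 1` the KL term is emphasized (encouraging a more prior-like, disentangled latent code at the cost of reconstruction); with `beta < 1` reconstruction dominates, which is one common way to mitigate posterior collapse early in training.
    
    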
  6. regression - What is the difference between Variational Inference and ...

    Jul 13, 2022 · Variational Bayes is a general technique for approximating posteriors in Bayesian inference. Say that we have a posterior that doesn't turn out to be a nice simple distribution, and …

  7. How to do dimension reduction from a variational autoencoder

    Dec 19, 2023 · I am thinking about a variational autoencoder. As far as I understand it, in the encoding section you compress to a p×1 tensor and then you create a $\mu$ and $\sigma$ of dimensions of …

  8. Understanding the set of latent variables $Z$ in variational inference

    Mar 4, 2021 · Variational inference approximates this posterior by using the "best" distribution within a family of distributions referred to as the mean-field family: This family is characterised by the fact that …

  9. Normalizing flows as a generalization of variational autoencoders ...

    Apr 24, 2021 · Normalizing flows are often introduced as a way of restricting the rigid priors that are placed on the latent variables in Variational Autoencoders. For example, from the Pyro docs: In …

  10. variational bayes - Why don’t diffusion models suffer posterior ...

    Apr 16, 2024 · An answer to help provide clarification on posterior collapse, why it happens in the training of VAEs, and how these ideas relate to diffusion models. As a first step in understanding posterior …