
Glow: Generative Flow with Invertible 1×1 Convolutions

Diederik P. Kingma, Prafulla Dhariwal

Abstract

flow-based generative models: tractability of the exact log-likelihood, tractability of exact latent-variable inference, and parallelizability of both training and synthesis
Glow: a simple type of generative flow using an invertible 1 × 1 convolution (sketched in code below)
capable of efficient realistic-looking synthesis and manipulation of large images
code: https://github.com/openai/glow
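The invertible 1 × 1 convolution at the heart of Glow is a learned c × c matrix W applied to the channel vector at every spatial position; its inverse applies W^{-1}, and its log-determinant contribution to the likelihood is h · w · log|det W|. The following is a minimal NumPy sketch of this operation (the official implementation linked above is in TensorFlow; the function names here are illustrative):

```python
import numpy as np

def invertible_1x1_conv(x, W):
    """Forward pass: x has shape (h, w, c); W is a learned (c, c) matrix."""
    h, w, c = x.shape
    y = x.reshape(-1, c) @ W.T                            # apply W to each channel vector
    log_det = h * w * np.log(np.abs(np.linalg.det(W)))    # log-determinant term of the flow
    return y.reshape(h, w, c), log_det

def invertible_1x1_conv_inverse(y, W):
    """Exact inverse: apply W^{-1} to each channel vector."""
    h, w, c = y.shape
    W_inv = np.linalg.inv(W)
    return (y.reshape(-1, c) @ W_inv.T).reshape(h, w, c)

# Usage: initialize W as a random rotation (orthogonal) matrix, as in the paper,
# so |det W| = 1 and the initial log-determinant is zero.
rng = np.random.default_rng(0)
W = np.linalg.qr(rng.standard_normal((4, 4)))[0]
x = rng.standard_normal((8, 8, 4))
y, log_det = invertible_1x1_conv(x, W)
x_rec = invertible_1x1_conv_inverse(y, W)
assert np.allclose(x, x_rec)                              # invertibility gives exact reconstruction
```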

Introduction

two major unsolved problems of machine learning:
(1) data-efficiency: the ability to learn from few datapoints, like humans;
(2) generalization: robustness to changes of the task or its context

generative models:
(1) learning realistic world models
(2) learning meaningful features of the input while requiring little or no human supervision or labeling

generative models:
(1) likelihood-based methods: autoregressive models, variational autoencoders (VAEs), flow-based generative models
(2) generative adversarial networks (GANs)

merits of flow-based generative models:
1. Exact latent-variable inference and log-likelihood evaluation
2. Efficient inference and efficient synthesis
3. Useful latent space for downstream tasks
4. Significant potential for memory savings

Background: Flow-based Generative Models

x: high-dimensional random vector with unknown true distribution p(x)
D: i.i.d. dataset drawn from p(x)
pθ(x): model with parameters θ
log-likelihood objective (the expected compression cost):

$$\min_\theta \mathcal{L}(\mathcal{D}) = \mathbb{E}_{x \sim \mathcal{D}}\left[-\log p_\theta(x)\right] = \frac{1}{|\mathcal{D}|} \sum_{x \in \mathcal{D}} -\log p_\theta(x)$$

z: latent variable with tractable/simple pdf
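As a toy illustration of this objective, the sketch below assumes the standard flow-based factorization: z = f_θ(x) with a simple base density p(z), so that log p_θ(x) = log p(z) + log|det(∂z/∂x)|. The element-wise affine map and standard-normal base used here are illustrative choices, not the model from the paper:

```python
import numpy as np

def log_standard_normal(z):
    """log p(z) for a factorized standard normal base distribution."""
    return -0.5 * np.sum(z ** 2 + np.log(2 * np.pi), axis=-1)

def nll_objective(data, scale, shift):
    """L(D) = (1/|D|) * sum over x in D of -log p_theta(x)."""
    z = scale * data + shift                          # z = f_theta(x), an element-wise affine map
    log_det = np.sum(np.log(np.abs(scale)))           # log|det(dz/dx)|, same for every example
    log_px = log_standard_normal(z) + log_det         # change of variables
    return -np.mean(log_px)                           # expected compression cost in nats

# Usage: |D| = 1000 examples of dimension 16.
rng = np.random.default_rng(0)
D = rng.standard_normal((1000, 16)) * 2.0 + 1.0
print(nll_objective(D, scale=np.full(16, 0.5), shift=np.full(16, -0.5)))
```

Dividing the result by the data dimensionality and by log 2 gives the familiar bits-per-dimension figure reported in the paper.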
