Multimeasurement Generative Models

12/18/2021
by Saeed Saremi, et al.

We formally map the problem of sampling from an unknown distribution with density p_X in ℝ^d to the problem of learning and sampling p_𝐘 in ℝ^Md, obtained by convolving p_X with a fixed factorial kernel: p_𝐘 is referred to as the M-density and the factorial kernel as the multimeasurement noise model (MNM). The M-density is smoother than p_X and easier to learn and sample from, yet for large M the two problems are mathematically equivalent, since X can be estimated exactly given 𝐘=𝐲 using the Bayes estimator x(𝐲)=𝔼[X|𝐘=𝐲]. To formulate the problem, we derive x(𝐲) for Poisson and Gaussian MNMs in closed form in terms of the unnormalized p_𝐘. This leads to a simple least-squares objective for learning parametric energy and score functions. We present various parametrization schemes of interest, including one in which studying Gaussian M-densities directly leads to multidenoising autoencoders; this is the first theoretical connection made between denoising autoencoders and empirical Bayes in the literature. Samples from p_X are obtained by walk-jump sampling (Saremi & Hyvärinen, 2019), using underdamped Langevin MCMC to sample from p_𝐘 (walk) followed by the multimeasurement Bayes estimation of X (jump). We study permutation-invariant Gaussian M-densities on the MNIST, CIFAR-10, and FFHQ-256 datasets, and demonstrate the effectiveness of this framework for realizing fast-mixing, stable Markov chains in high dimensions.
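The closed-form Bayes estimator for a Gaussian MNM can be checked numerically on a toy model. The sketch below (an illustration, not the paper's code) assumes a 1-D p_X uniform on {-1, +1} with M independent Gaussian measurements Y_m = X + σN(0,1); for this p_X the posterior mean has the known closed form E[X|𝐘=𝐲] = tanh(Σ_m y_m / σ²), which we compare against the multimeasurement Tweedie/Miyasawa identity x(𝐲) = ȳ + (σ²/M) Σ_m ∇_{y_m} log p_𝐘(𝐲), evaluated with finite differences on the unnormalized log M-density:

```python
import numpy as np

# Toy setup (illustrative assumption, not from the paper):
# X uniform on {-1, +1} in 1-D; Gaussian MNM gives Y_m = X + sigma * N(0, 1).
sigma, M = 0.7, 3
rng = np.random.default_rng(0)

def log_p(y):
    """Unnormalized log M-density log p_Y(y) for the toy p_X."""
    a = -np.sum((y - 1.0) ** 2) / (2 * sigma**2)  # mixture component at x = +1
    b = -np.sum((y + 1.0) ** 2) / (2 * sigma**2)  # mixture component at x = -1
    return np.logaddexp(a, b)  # log(e^a + e^b); normalizing constants drop out

def bayes_estimate(y, eps=1e-5):
    """x(y) = mean(y) + (sigma^2 / M) * sum_m d/dy_m log p_Y(y)."""
    grad = np.array([(log_p(y + eps * e) - log_p(y - eps * e)) / (2 * eps)
                     for e in np.eye(M)])  # central finite differences
    return y.mean() + (sigma**2 / M) * grad.sum()

# M noisy measurements of X = +1; compare against the closed-form posterior mean.
y = 1.0 + sigma * rng.standard_normal(M)
print(bayes_estimate(y), np.tanh(y.sum() / sigma**2))
```

The two printed values agree up to finite-difference error, illustrating that the estimator needs only the (unnormalized) M-density, which is what the least-squares objective learns.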
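Walk-jump sampling can be sketched on the same kind of toy model. The code below (an assumption-laden illustration: 1-D p_X uniform on {-1, +1}, Gaussian MNM, and plain overdamped Langevin in place of the underdamped sampler used in the paper) runs Langevin chains on the analytic score of p_𝐘 (walk), then maps each chain state to an estimate of X with the multimeasurement Bayes estimator (jump):

```python
import numpy as np

# Toy model (illustrative assumption): X uniform on {-1, +1} in 1-D,
# Gaussian MNM with M measurements. The paper uses underdamped Langevin MCMC;
# this sketch uses overdamped Langevin for brevity.
sigma, M = 0.5, 4
delta, n_steps, n_chains = 0.005, 500, 40
rng = np.random.default_rng(1)

def score(y):
    """Closed-form score grad_y log p_Y(y) for the toy M-density."""
    a = -np.sum((y - 1.0) ** 2, axis=-1) / (2 * sigma**2)
    b = -np.sum((y + 1.0) ** 2, axis=-1) / (2 * sigma**2)
    w = np.exp(a - np.logaddexp(a, b))[..., None]  # posterior weight of x = +1
    return (w * (1.0 - y) + (1.0 - w) * (-1.0 - y)) / sigma**2

# Walk: overdamped Langevin chains sampling p_Y in R^M (d = 1 here).
y = rng.standard_normal((n_chains, M))
for _ in range(n_steps):
    y = y + delta * score(y) + np.sqrt(2 * delta) * rng.standard_normal(y.shape)

# Jump: x(y) = mean(y) + (sigma^2 / M) * sum_m grad_{y_m} log p_Y(y).
x_hat = y.mean(axis=1) + (sigma**2 / M) * score(y).sum(axis=1)
print(x_hat.min(), x_hat.max())  # estimates lie in [-1, 1], clustering near the atoms
```

The jump step is deterministic given the chain state, so all the mixing happens in the smoother 𝐘-space, which is the mechanism the abstract credits for fast-mixing, stable chains.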
