Learning Energy-Based Models in High-Dimensional Spaces with Multiscale Denoising-Score Matching

Author:

Li, Zengyi 1,2 (ORCID); Chen, Yubei 1,3 (ORCID); Sommer, Friedrich T. 1,4,5

Affiliation:

1. Redwood Center for Theoretical Neuroscience, Berkeley, CA 94720, USA

2. Department of Physics, University of California Berkeley, Berkeley, CA 94720, USA

3. Berkeley AI Research, University of California Berkeley, Berkeley, CA 94720, USA

4. Helen Wills Neuroscience Institute, University of California Berkeley, Berkeley, CA 94720, USA

5. Neuromorphic Computing Group, Intel Labs, 2200 Mission College Blvd., Santa Clara, CA 95054, USA

Abstract

Energy-based models (EBMs) assign an unnormalized log probability to data samples. This functionality has a variety of applications, such as sample synthesis, data denoising, sample restoration, outlier detection, Bayesian reasoning, and many more. However, training EBMs by standard maximum likelihood is extremely slow because it requires sampling from the model distribution. Score matching potentially alleviates this problem. In particular, denoising-score matching has been successfully used to train EBMs. Using noisy data samples with one fixed noise level, these models learn fast and yield good results in data denoising. However, demonstrations of high-quality sample synthesis in high-dimensional data with such models have been lacking. Recently, a paper showed that a generative model trained by denoising-score matching accomplishes excellent sample synthesis when trained with data samples corrupted with multiple levels of noise. Here we provide an analysis and empirical evidence showing that training with multiple noise levels is necessary when the data dimension is high. Leveraging this insight, we propose a novel EBM trained with multiscale denoising-score matching. Our model exhibits data-generation performance comparable to state-of-the-art techniques such as GANs and sets a new baseline for EBMs. The proposed model also provides density information and performs well on an image-inpainting task.
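To make the training objective concrete, below is a minimal PyTorch sketch of a denoising-score-matching loss with multiple Gaussian noise scales for an energy-based model. The function name mdsm_loss, the energy_net callable, the per-example noise-level sampling, and the sigma^2 weighting are illustrative assumptions, not the authors' exact formulation from the paper.

```python
import torch

def mdsm_loss(energy_net, x, sigmas):
    """Illustrative multiscale denoising-score-matching loss (not the paper's exact objective).

    energy_net: callable mapping a batch of inputs to scalar energies E_theta(x), shape (B,).
    x:          clean data batch, shape (B, ...).
    sigmas:     1-D tensor of noise levels; one level is sampled per example.
    """
    # Sample one noise level per example and reshape for broadcasting.
    idx = torch.randint(len(sigmas), (x.shape[0],), device=x.device)
    sigma = sigmas[idx].view(-1, *([1] * (x.dim() - 1)))

    # Corrupt the data with Gaussian noise: x_tilde = x + sigma * eps.
    eps = torch.randn_like(x)
    x_tilde = (x + sigma * eps).requires_grad_(True)

    # Model score: -grad_{x_tilde} E_theta(x_tilde), obtained by autograd.
    energy = energy_net(x_tilde).sum()
    grad_E = torch.autograd.grad(energy, x_tilde, create_graph=True)[0]

    # Denoising-score-matching target: the score of the Gaussian corruption kernel,
    # grad_{x_tilde} log q_sigma(x_tilde | x) = -(x_tilde - x) / sigma^2.
    target = -(x_tilde - x) / sigma**2

    # Match the model score to the target; the sigma factor rescales the residual so
    # that all noise scales contribute comparably (one common weighting choice).
    loss = (((-grad_E - target) * sigma) ** 2).flatten(1).sum(dim=1).mean()
    return loss
```

In use, this loss would be minimized over the parameters of energy_net with a standard optimizer; the resulting energy function can then be used for denoising, density ranking, or Langevin-style sampling, as discussed in the abstract.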

Funder

NSF

NIH

Intel Corporation

Publisher

MDPI AG

Subject

General Physics and Astronomy

