Hybrid / Machine and Deep Learning Seminar / November 08, 2023, 11:00 - 12:00
Analyzing and Lifting Architectural Constraints on Normalizing Flows
Speaker: Felix Matthias Draxler, PhD (Computer Science and Mathematics)
Abstract
Generative models are one of the fastest-moving fields in machine learning, with applications ranging from data generation to uncertainty quantification and out-of-distribution detection. Among the foundational models are normalizing flows, which directly optimize the likelihood of the data. In this talk, we present an analytical study of the expressivity of normalizing flow architectures based on invertible neural networks. We then show that, in fact, arbitrary neural networks can be trained via maximum likelihood. This shifts the focus for practitioners away from satisfying the constraints of specialized invertible architectures; instead, likelihood-based models can be built with inductive biases tailored to the data at hand. The resulting models are fast and provide good sample quality.
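To make the core idea concrete, here is a minimal sketch (an illustration assumed for this announcement, not the speaker's actual method) of the exact likelihood that normalizing flows optimize: an invertible map f sends data x to a latent z = f(x), and the change-of-variables formula gives log p(x) = log p_base(f(x)) + log |det Jf(x)|. The example uses a single element-wise affine transform, whose log-Jacobian-determinant is available in closed form.

```python
import numpy as np

def flow_log_likelihood(x, log_scale, shift):
    """Exact log-likelihood of x under a one-layer affine flow
    with a standard-normal base distribution.

    Forward map: z = (x - shift) * exp(-log_scale)  (invertible)
    log |det dz/dx| = -sum(log_scale)
    """
    z = (x - shift) * np.exp(-log_scale)
    log_det = -np.sum(log_scale)
    # log density of z under N(0, I) in D dimensions
    d = x.shape[-1]
    log_base = -0.5 * np.sum(z**2, axis=-1) - 0.5 * d * np.log(2 * np.pi)
    return log_base + log_det

# Sanity check without any training loop: the likelihood is highest
# when the flow whitens the data, i.e. shift ≈ data mean and
# exp(log_scale) ≈ data standard deviation.
rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, scale=2.0, size=(10_000, 1))
ll_fitted = flow_log_likelihood(x, np.log([2.0]), np.array([3.0])).mean()
ll_identity = flow_log_likelihood(x, np.log([1.0]), np.array([0.0])).mean()
assert ll_fitted > ll_identity
```

In a full normalizing flow, f is a deep invertible network and both terms are maximized by gradient descent; the talk's point is that the invertibility constraint on f's architecture can be lifted, so ordinary networks can be trained by maximum likelihood as well.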