Nutan Chen
The graph of the fashion MNIST dataset in a 2D latent space, along with the magnification factor.

Neural samplers such as variational autoencoders (VAEs) or generative adversarial networks (GANs) approximate distributions by transforming samples from a simple random source—the latent space—into a more complex distribution—the distribution from which the data is sampled.
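This transformation can be sketched in a few lines. The decoder below is a hypothetical stand-in (a fixed random two-layer network) for a trained VAE or GAN decoder; the latent dimension, widths, and weights are illustrative assumptions, not part of the original text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical decoder: a fixed random two-layer net mapping
# 2-D latent codes to 4-D "observations" (a stand-in for a trained decoder).
W1 = rng.normal(size=(2, 16))
W2 = rng.normal(size=(16, 4))

def decode(z):
    return np.tanh(z @ W1) @ W2

# Sample from the simple random source (a standard normal latent space) ...
z = rng.normal(size=(1000, 2))
# ... and push the samples through the decoder to obtain
# samples from a more complex data-space distribution.
x = decode(z)
print(x.shape)  # (1000, 4)
```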

Typically, the data set is sparse, while the latent space is compact. Consequently, points that are separated by low-density regions in observation space are pushed together in latent space, and the geometry gets distorted. In effect, straight-line distances in the latent space are poor proxies for similarity in the observation space. How can this be solved?
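A minimal sketch of the mismatch, under the same assumed toy decoder as above: the Euclidean distance between two latent codes is compared with the length of the decoded straight line in observation space, approximated by decoding many intermediate points and summing segment lengths. With a nonlinear decoder, the two generally disagree.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical nonlinear decoder standing in for a trained generator.
W1 = rng.normal(size=(2, 16))
W2 = rng.normal(size=(16, 4))

def decode(z):
    return np.tanh(z @ W1) @ W2

z_a, z_b = np.array([-2.0, 0.0]), np.array([2.0, 0.5])

# Straight-line (Euclidean) distance in latent space ...
latent_dist = np.linalg.norm(z_b - z_a)

# ... versus the length of the decoded straight line in observation space,
# approximated by decoding points along it and summing segment lengths.
ts = np.linspace(0.0, 1.0, 200)[:, None]
path = decode(z_a + ts * (z_b - z_a))
obs_len = np.linalg.norm(np.diff(path, axis=0), axis=1).sum()

print(latent_dist, obs_len)
```

The ratio of these two quantities is what a magnification factor, as shown in the figure above, visualises pointwise across the latent space.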

Georgi Dikov

Intuition and experience. That is probably the answer you would get if you asked deep learning engineers how they chose the hyperparameters of a neural network. Depending on their familiarity with the problem, they might have needed a good three to five full-dataset runs until a satisfactory result popped up. Now, you might say, surely we could automate this; after all, we do it implicitly in our heads? Well, yes, we definitely could, but should we?
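The simplest automation of that loop is random search over the hyperparameter space. The objective below is a hypothetical, cheap stand-in for the expensive full-dataset training run the text describes; its form and the search ranges are assumptions for illustration only.

```python
import random

def validation_loss(lr, width):
    # Hypothetical proxy for "train the network and measure validation loss";
    # in practice each call is a full, expensive training run.
    return (lr - 0.01) ** 2 + (width - 64) ** 2 / 1e4

random.seed(0)

# Automating the intuition: sample candidate configurations at random
# and keep the one with the lowest (proxy) validation loss.
candidates = [
    (random.uniform(1e-4, 0.1), random.choice([16, 32, 64, 128]))
    for _ in range(20)
]
best = min(candidates, key=lambda cfg: validation_loss(*cfg))
print(best)
```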

Deep Variational Bayes Filter

DVBF: filter to learn what to filter

Maximilian Soelch
Learnt latent representation of a swinging pendulum

Machine-learning algorithms thrive in environments where data is abundant. In the land of scarce data, blessed are those who have simulators. The recent successes in Go or Atari games would be much harder to achieve without the ability to parallelise millions of perfect game simulations.


Machine Learning Research Lab, Volkswagen Group

Patrick van der Smagt
The front view of the lab building

Established in 2016 and located in Munich, the Volkswagen Group Machine Learning Research Lab is a fundamental research lab working on topics close to what we think artificial intelligence is about: machine learning and probabilistic inference, efficient exploration, and optimal control.