Abstract: In this talk, I will describe two recent contributions in the area of probabilistic machine learning. The first is MIRACLE, a method for finding compressed representations of neural network weights, which is useful for mobile applications and the design of energy-efficient hardware. We encode the network weights using a random sample, requiring only a number of bits corresponding to the Kullback-Leibler divergence between the sampled variational distribution and the encoding distribution. Unlike other methods, we can explicitly control the compression rate while optimizing the expected loss on the training set. The employed encoding scheme can be shown to be close to the optimal information-theoretic lower bound. The second contribution is Successor Uncertainties (SU), a probabilistic Q-learning method for balancing exploration and exploitation in reinforcement learning. SU incorporates uncertainty about the long-term consequences of actions, accounting for the fundamental dependencies between state-action values implied by the Bellman equation. SU outperforms existing algorithms on several tabular benchmarks and attains strong performance on the Atari benchmark suite.
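As a rough illustration of the bit-budget idea mentioned above (a minimal sketch, not the speaker's implementation; the diagonal Gaussian variational and encoding distributions are assumptions made purely for illustration):

import numpy as np

# Sketch of the code-length argument: if a weight vector is communicated via a
# shared random sample from an encoding distribution p, the expected code length
# is roughly KL(q || p) nats, i.e. KL / ln(2) bits, where q is the variational
# distribution over the weights.

def gaussian_kl(mu_q, sigma_q, mu_p, sigma_p):
    """Closed-form KL(q || p) in nats for diagonal Gaussians."""
    return np.sum(
        np.log(sigma_p / sigma_q)
        + (sigma_q ** 2 + (mu_q - mu_p) ** 2) / (2.0 * sigma_p ** 2)
        - 0.5
    )

# Hypothetical variational posterior over a tiny weight vector.
mu_q = np.array([0.8, -0.3, 0.05])
sigma_q = np.array([0.2, 0.1, 0.5])

# Encoding distribution shared by encoder and decoder (a standard normal here).
mu_p = np.zeros(3)
sigma_p = np.ones(3)

kl_nats = gaussian_kl(mu_q, sigma_q, mu_p, sigma_p)
print(f"Approximate code length: {kl_nats / np.log(2.0):.2f} bits")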
Speaker: José Miguel Hernández-Lobato
Affiliation: Professor of Computer Science, University of Cambridge
Place of Seminar: Lecture Hall T2, Konemiehentie 2, Aalto University