Time and place: September 26th, 14:00 on Zoom. Register online!
Speaker: Erik Englesson, KTH Royal Institute of Technology
Title: Exploiting Properties of the Gaussian Likelihood for Label Noise Robustness in Classification
Abstract: A natural way of estimating heteroscedastic label noise in regression is to model the observed (potentially noisy) target as a sample from a normal distribution, whose parameters can be learned by minimizing the negative log-likelihood. This formulation has desirable loss attenuation properties, as it reduces the contribution of high-error examples. Intuitively, this behaviour can improve robustness against label noise by reducing overfitting. We propose an extension of this simple, probabilistic approach to classification that has the same desirable loss attenuation properties. We evaluate the effectiveness of the method by measuring its robustness against label noise in classification. In follow-up work, we improve the method’s robustness by modelling and estimating a shift (non-zero mean) in the Gaussian noise distribution, which we show makes it possible for the method to correct noisy labels.
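The loss attenuation the abstract refers to can be seen directly in the Gaussian negative log-likelihood: when the model is allowed to predict a larger variance for a given example, the squared-error term is down-weighted, at the cost of a log-variance penalty. The sketch below is only an illustration of that general property (with constants dropped); it is not the speaker's implementation, and the function name and example values are made up for demonstration.

```python
import math

def gaussian_nll(y, mu, sigma):
    """Negative log-likelihood of y under N(mu, sigma^2), constant terms dropped:
    0.5 * log(sigma^2) + (y - mu)^2 / (2 * sigma^2)."""
    return 0.5 * math.log(sigma ** 2) + (y - mu) ** 2 / (2 * sigma ** 2)

# Loss attenuation: for a fixed large error (here |y - mu| = 4), predicting a
# larger sigma shrinks the squared-error term, so the high-error example
# contributes less to the total loss than under a fixed unit variance.
low_var_loss = gaussian_nll(4.0, 0.0, 1.0)   # error term dominates
high_var_loss = gaussian_nll(4.0, 0.0, 4.0)  # error term attenuated
```

Here `high_var_loss` is smaller than `low_var_loss`, so a network that also predicts the variance can reduce the influence of examples whose (possibly noisy) targets it cannot fit, which is the intuition behind the robustness to label noise discussed in the talk.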
Bio: Erik is a postdoc at the Division of Robotics, Perception and Learning at KTH, supervised by Hossein Azizpour. His research interests are related to robustness and uncertainty in deep learning. Erik likes to bring time-tested ideas from fields such as statistics and information theory to deep learning. His PhD thesis was about robustness to label noise, which arises from aleatoric uncertainty in the data generation process. In the near future, Erik plans to connect aleatoric uncertainty, label noise, and epistemic uncertainty, and also to bring ideas from Gaussian processes to deep learning.