Abstract: We discuss a new probabilistic modelling paradigm, termed "networked exponential families", for large collections of local datasets that are related by some notion of "similarity". This similarity notion can be conveniently encoded by a network whose nodes represent local datasets and whose edges connect similar datasets. Networked exponential families couple complex network
structure with the information geometry of statistical models for local datasets. The estimation of networked exponential families from partially observed data amounts to a network Lasso problem. We solve this estimation problem using a primal-dual method for non-smooth convex optimization. The resulting method can be implemented as scalable message passing on the data network. It allows recovering an inherent cluster structure of the data network even when only a few local datasets are observed. Moreover, it is robust against various sources of error (network failures, noisy data). The analysis of the learning method reveals an interesting interplay between the geometry of the network structure and the (information) geometry of statistical models for local datasets.
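To make the network Lasso estimation step mentioned above concrete, the following is a minimal sketch (not the speaker's implementation) for the simplest setting one might assume: scalar local models with squared-error loss, observations available only at a few sampled nodes, and a primal-dual (Chambolle-Pock-type) iteration whose updates decompose over nodes and edges, i.e. operate as message passing on the data network. All names, weights, and step-size choices here are illustrative assumptions.

# Hypothetical network Lasso sketch: min over x of
#   sum_{i observed} (x_i - y_i)^2 + lambda * sum_{(i,j) in E} w_ij * |x_i - x_j|
# solved with a primal-dual hybrid gradient (PDHG) iteration.
import numpy as np

def network_lasso_pdhg(n_nodes, edges, weights, y_obs, lambda_reg=1.0,
                       n_iter=500):
    """y_obs: dict {node: observed value} for the sampled local datasets."""
    # incidence-type operator D: (D x)_e = x_i - x_j for edge e = (i, j)
    m = len(edges)
    D = np.zeros((m, n_nodes))
    for e, (i, j) in enumerate(edges):
        D[e, i], D[e, j] = 1.0, -1.0

    # PDHG step sizes must satisfy tau * sigma * ||D||^2 <= 1
    L = np.linalg.norm(D, 2)
    tau = sigma = 0.9 / L

    x = np.zeros(n_nodes)          # primal: one model parameter per node
    x_bar = x.copy()
    u = np.zeros(m)                # dual: one variable per edge
    w = lambda_reg * np.asarray(weights, dtype=float)

    for _ in range(n_iter):
        # dual (edge-wise) update: prox of the conjugate of the weighted
        # total-variation term = projection onto the box [-lambda*w_e, lambda*w_e]
        u = np.clip(u + sigma * (D @ x_bar), -w, w)

        # primal (node-wise) update: prox of the local squared-error losses;
        # unobserved nodes have zero local loss, so their prox is the identity
        v = x - tau * (D.T @ u)
        x_new = v.copy()
        for i, y_i in y_obs.items():
            x_new[i] = (v[i] + 2 * tau * y_i) / (1 + 2 * tau)

        x_bar = 2 * x_new - x      # extrapolation step of PDHG
        x = x_new
    return x

# toy data network: two clusters of 3 nodes joined by one weak edge,
# with a single observed local dataset in each cluster
edges = [(0, 1), (1, 2), (3, 4), (4, 5), (2, 3)]
weights = [1.0, 1.0, 1.0, 1.0, 0.1]
print(network_lasso_pdhg(6, edges, weights, {0: 1.0, 5: -1.0}))

In this toy run the estimates fuse to a nearly constant value within each cluster, which is one way to read the abstract's claim that the cluster structure of the data network can be recovered from observing only a few local datasets.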
Speaker: Alex Jung
Alex Jung is Assistant Professor for Machine Learning at Aalto University.
He serves as an Associate Editor for IEEE Signal Processing Letters and as a Chapter
Chair within the IEEE Finland Section. Alex has received a Best Student Paper Award
at IEEE ICASSP 2011 and an Amazon Web Services Machine Learning Award in 2018, and
was chosen as Teacher of the Year 2018 by the Department of Computer Science at
Aalto University.
Affiliation: Aalto University
Place of Seminar: Zoom (available afterwards on YouTube)
Meeting ID: 698 1461 2109
Passcode: 433838