Simo Särkkä: GPU Computing for Large-Scale Learning in State Space Models

Abstract: State space models (SSMs), including Gaussian state space models and hidden Markov models (HMMs), are important tools in machine learning for time series data. Bayesian filters and smoothers, as well as their special cases such as Kalman filters and smoothers, are computationally optimal O(N) algorithms for learning and inference in these models on classical CPU architectures. However, on massively parallel architectures such as graphics processing units (GPUs) they are no longer optimal, because they are inherently sequential. The aim of this talk is to discuss parallel algorithms called associative scans and to show how they can be used to make state space learning and inference optimally parallelizable, leading to O(log N) parallel span complexity.
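The core idea of the talk can be illustrated with a minimal sketch (not the speaker's implementation; the helper names here are illustrative). An associative scan computes all prefixes of an associative operation in O(log N) parallel steps. A scalar linear recursion stands in for the Kalman filter recursion: each step is encoded as an affine map, and composition of affine maps is associative, so the whole sequential recursion becomes a scan.

```python
def associative_scan(op, elems):
    """All-prefix scan over an associative operator `op`.

    Recursive-doubling (Hillis-Steele) sketch: at step d, element i is
    combined with element i - d. This runs sequentially here, but every
    iteration of the inner comprehension is independent, so each of the
    ~log2(N) outer steps could be one parallel pass on a GPU.
    """
    a = list(elems)
    n = len(a)
    d = 1
    while d < n:
        # New list is built from the old values, so all updates in this
        # step are independent of each other.
        a = [op(a[i - d], a[i]) if i >= d else a[i] for i in range(n)]
        d *= 2
    return a

# Toy associative element: an affine map x -> A*x + b, encoded as (A, b).
# Composing affine maps is associative, which is the same structural trick
# used to parallelize Bayesian filtering recursions.
def compose(f, g):
    """Apply f first, then g."""
    A1, b1 = f
    A2, b2 = g
    return (A2 * A1, A2 * b1 + b2)

# Scalar linear recursion x_k = 0.5 * x_{k-1} + 1 with x_0 = 0:
elems = [(0.5, 1.0)] * 4
prefixes = associative_scan(compose, elems)
xs = [A * 0.0 + b for A, b in prefixes]  # x_1..x_4
print(xs)  # [1.0, 1.5, 1.75, 1.875], matching the sequential recursion
```

The linked paper and code apply the same pattern with Kalman filtering/smoothing elements in place of scalar affine maps; JAX users can swap the hand-rolled scan for `jax.lax.associative_scan` to get the GPU parallelism directly.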

arXiv: https://arxiv.org/abs/1905.13002

DOI: https://doi.org/10.1109/TAC.2020.2976316

Code: https://github.com/EEA-sensors/sequential-parallelization-examples/tree/main/python/temporal-parallelization-bayes-smoothers

Speaker: Simo Särkkä

Simo Särkkä is an Associate Professor with Aalto University and an Adjunct Professor with Tampere University and LUT University. He is also a Fellow of the European Laboratory for Learning and Intelligent Systems (ELLIS) and the leader of the AI Across Fields (AIX) program in the Finnish Center for Artificial Intelligence (FCAI). His and his group's research interests are in multi-sensor data processing systems, with applications in location sensing, health and medical technology, machine learning, inverse problems, and brain imaging. He has authored or coauthored around 150 peer-reviewed scientific articles, and his books "Bayesian Filtering and Smoothing" and "Applied Stochastic Differential Equations", along with the Chinese translation of the former, were recently published by Cambridge University Press.

Affiliation: Aalto University

Place of Seminar: Zoom (available afterwards on YouTube)