Abstract: While much ML research and practice is devoted to improving the accuracy of trained models, efficiency is another aspect of performance that poses challenges for ML practitioners. For example, how to select a small but appropriate data sample for efficient model training? Or, once a model is trained, how to make inferences as fast as possible? And, if the data change over time, how much change justifies the computational cost of retraining the model? In this talk, I will present recent work by my group that addresses questions such as these (e.g., [1,2,3]).
[1] Wang, Y., Mathioudakis, M., Li, J., & Fabbri, F. (2023). Max-Min Diversification with Fairness Constraints: Exact and Approximation Algorithms. SIAM International Conference on Data Mining (SDM). https://doi.org/10.1137/1.9781611977653.ch11
[2] Aslay, C., Ciaperoni, M., Gionis, A., & Mathioudakis, M. (2021). Workload-Aware Materialization for Efficient Variable Elimination on Bayesian Networks. IEEE 37th International Conference on Data Engineering (ICDE). https://doi.org/10.1109/ICDE51399.2021.00104
[3] Mahadevan, A., & Mathioudakis, M. (2024). Cost-Aware Retraining for Machine Learning. Knowledge-Based Systems. https://doi.org/10.1016/j.knosys.2024.111610
Speaker: Michael Mathioudakis is an associate professor in the Department of Computer Science at the University of Helsinki. Group webpage: http://www.helsinki.fi/algorithmic-data-science
Affiliation: University of Helsinki
Place of Seminar: Kumpula Exactum CK111 (in person) & Zoom (Meeting ID: 640 5738 7231; Passcode: 825217)