Abstract: A widely used design principle for federated learning (FL) systems is total variation (TV) minimization. TV minimization is an instance of regularized empirical risk minimization, using the variation of local model parameters as the regularizer. Mainstream FL flavors, including personalized, clustered, vertical, and horizontal FL, can be obtained as special cases of TV minimization. This talk surveys computational and statistical aspects of TV minimization that are relevant for the design of trustworthy FL systems.
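As a rough illustration of this regularized empirical risk minimization view (a sketch under assumed notation, not necessarily the speaker's exact formulation), local models \(w^{(i)}\) of clients \(i = 1, \ldots, n\), connected by an edge set \(\mathcal{E}\) with weights \(A_{ij}\), could be trained jointly via

\[
\min_{w^{(1)}, \ldots, w^{(n)}} \;\; \sum_{i=1}^{n} L_i\bigl(w^{(i)}\bigr) \;+\; \lambda \sum_{(i,j) \in \mathcal{E}} A_{ij}\, \bigl\| w^{(i)} - w^{(j)} \bigr\|,
\]

where \(L_i\) denotes the local empirical risk at client \(i\), the second term measures the total variation of the local model parameters across connected clients, and \(\lambda\) trades off local fit against parameter variation. The symbols \(L_i\), \(w^{(i)}\), \(A_{ij}\), and \(\lambda\) are illustrative assumptions, not taken from the abstract.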
Bio: Alexander Jung received his PhD (with "sub auspiciis") in 2012 from TU Vienna. After postdoc stints at TU Vienna and ETH Zurich, he joined Aalto as an Assistant Professor in 2015. He has been chosen as the Computer Science Teacher of the Year and received an Amazon Web Services ML Award in 2018. He serves as an Associate Editor of IEEE Signal Processing Letters and as an Editorial Board Member of the Machine Learning journal (Springer). He authored the textbook "Machine Learning: The Basics", published by Springer in 2022.
Time and location: Monday, May 29, at 14:15, via Zoom (virtual only)