
HDI Lab seminar: Taming Heterogeneity in Federated Linear Stochastic Approximation


On September 5, 2024, Paul Mangold (École Polytechnique) will speak on 'Taming Heterogeneity in Federated Linear Stochastic Approximation and TD Learning.'

Abstract:

In federated learning, multiple agents collaboratively train a machine learning model without exchanging local data. To achieve this, each agent locally updates a global model, and the updated models are periodically aggregated. In this talk, I will focus on federated linear stochastic approximation (FedLSA), with particular emphasis on agent heterogeneity. I will derive upper bounds on the sample and communication complexity of FedLSA, and present a new method that reduces communication cost using control variates. Particular attention will be paid to the "linear speed-up" phenomenon, showing that the sample complexity scales with the inverse of the number of agents in both methods.
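To give a rough sense of the scheme described in the abstract, here is a minimal sketch of linear stochastic approximation with local steps and periodic averaging. It is not the speaker's implementation: the problem data (A_i, b_i), step size, number of local steps, and noise model are all illustrative assumptions.

```python
import numpy as np

# Sketch of federated linear stochastic approximation (FedLSA) with
# periodic averaging. All constants and problem data below are
# illustrative assumptions, not taken from the talk.
rng = np.random.default_rng(0)
n_agents, dim = 4, 3
eta, local_steps, rounds = 0.05, 10, 200

# Heterogeneous local problems: agent i targets the solution of A_i x = b_i.
A = [np.eye(dim) + 0.1 * rng.standard_normal((dim, dim)) for _ in range(n_agents)]
b = [rng.standard_normal(dim) for _ in range(n_agents)]

theta = np.zeros(dim)  # global iterate
for _ in range(rounds):
    local_iterates = []
    for i in range(n_agents):
        x = theta.copy()
        for _ in range(local_steps):
            noise = 0.01 * rng.standard_normal(dim)   # stochastic observation noise
            x += eta * (b[i] - A[i] @ x + noise)      # local LSA step
        local_iterates.append(x)
    theta = np.mean(local_iterates, axis=0)  # periodic aggregation (averaging)

print(theta)
```

With heterogeneous (A_i, b_i), plain averaging of local iterates introduces a bias that grows with the number of local steps; the control-variate method mentioned in the abstract is aimed at correcting this while keeping communication infrequent.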

Start time: 14:40

The seminar will be held online. Please contact Elena Alyamovskaya to get access to the seminar.
