Google AI Introduces ‘Federated Reconstruction’ Framework That Enables Scalable Partially Local Federated Learning
Federated learning is a machine learning technique in which a model is trained across many decentralized edge devices or servers, each holding its own local data samples that are never exchanged. This avoids collecting personally identifiable information on a central server. Federated learning is typically done by learning a single global model for all users, even though their data distributions may differ, and this variability has motivated algorithms that personalize the global model for each user.
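For context, the sketch below shows the standard federated averaging loop that this article builds on: each client trains on its own data and the server averages the resulting weights. It is a minimal illustration in NumPy, assuming a simple linear model and squared-error loss; the function names and data format are illustrative, not from the paper.

```python
import numpy as np

def client_update(global_weights, local_data, lr=0.1, epochs=1):
    """One client's local gradient-descent pass on its own data (illustrative linear model)."""
    w = global_weights.copy()
    X, y = local_data
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)   # squared-error gradient
        w -= lr * grad
    return w

def federated_averaging(global_weights, client_datasets):
    """Server step: average the clients' updated weights into one global model."""
    updates = [client_update(global_weights, d) for d in client_datasets]
    return np.mean(updates, axis=0)
```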
However, privacy concerns may prevent a truly global model from being learned in some cases. For example, training a fully global federated matrix factorization model would require sending user embedding updates to a central server, which could reveal the preferences encoded in those embeddings. Even for models without user-specific embeddings, keeping some parameters entirely local to user devices reduces server-client communication and allows those parameters to be personalized responsibly for each user.
In their work “Federated Reconstruction: Partially Local Federated Learning”, Google AI introduces an approach that enables scalable partially local federated learning, in which some model parameters are never aggregated on the server. For matrix factorization, the approach trains a recommender model while keeping each user's embeddings local to that user's device. For other models, it trains a portion of the model to be completely personal to each user while avoiding transmission of those parameters altogether.
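To make the split concrete, here is a hedged sketch (not the paper's code) of how a matrix factorization model's parameters might be partitioned: item embeddings are global and aggregated on the server, while each user's embedding lives only on that user's device. The dimensions and initialization are assumptions for illustration.

```python
import numpy as np

EMBED_DIM = 16          # illustrative embedding size
NUM_ITEMS = 10_000      # illustrative item vocabulary

# Global parameters: shared item embeddings, aggregated on the server.
item_embeddings = np.random.normal(scale=0.01, size=(NUM_ITEMS, EMBED_DIM))

# Local parameters: one embedding per user, never sent to the server.
# Each device holds only its own user's vector.
user_embedding = np.random.normal(scale=0.01, size=(EMBED_DIM,))

def predict_score(user_embedding, item_embeddings, item_id):
    """Predicted affinity between this user and one item (dot product)."""
    return user_embedding @ item_embeddings[item_id]
```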
Google AI’s scalable approach:
In large-scale federated learning settings, approaches based on stateful algorithms tend to degrade. Most users do not participate in training, and those who do are likely to participate only once, so their local state is rarely available and can become stale over time. Furthermore, all non-participating users are left with untrained local parameters, which prevents practical use.
Federated Reconstruction was developed to address this problem. It is stateless: user devices do not need to retain local parameters, because those parameters are reconstructed whenever they are needed. The local parameters are randomly initialized and trained with gradient descent on local data while the global parameters are frozen; updates to the global parameters are then computed with the local parameters frozen.
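The following is a minimal sketch of one Federated Reconstruction round on a single device, following the two-step description above. The squared-error loss, learning rates, step counts, and (item_id, rating) data format are assumptions for illustration, not the paper's exact training setup.

```python
import numpy as np

def reconstruct_local(global_items, examples, dim, lr=0.1, steps=10):
    """Reconstruction step: fit a fresh user embedding with global parameters frozen."""
    u = np.zeros(dim)                          # local parameters start from scratch each round
    for _ in range(steps):
        grad = np.zeros(dim)
        for item_id, rating in examples:
            err = u @ global_items[item_id] - rating
            grad += err * global_items[item_id]
        u -= lr * grad / len(examples)
    return u

def global_update(global_items, u, examples, lr=0.1):
    """Update step: gradient on the global item embeddings with the local embedding frozen."""
    delta = np.zeros_like(global_items)
    for item_id, rating in examples:
        err = u @ global_items[item_id] - rating
        delta[item_id] -= lr * err * u
    return delta                               # only this delta is sent to the server
```

The local embedding `u` never leaves the device; only the update to the global item embeddings is communicated.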
This strategy allows for large-scale training because it does not assume that users retain state from earlier training rounds, and since local parameters are always recreated from scratch, they never go stale. Users who were not present during training can still obtain the trained model and run inference by simply reconstructing local parameters from their local data, as sketched below. Compared to other approaches, Federated Reconstruction trains better-performing models for unseen users.
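For an unseen user, inference under this scheme might look like the hedged sketch below: the device reconstructs a local embedding from whatever local data it has, then scores items against the downloaded global parameters. It reuses the illustrative `reconstruct_local` helper from the previous sketch.

```python
import numpy as np

def infer_for_new_user(global_items, local_examples, dim, top_k=5):
    """Reconstruct local parameters on-device, then rank items for this user."""
    u = reconstruct_local(global_items, local_examples, dim)   # no prior state needed
    scores = global_items @ u                                  # score every item
    return np.argsort(-scores)[:top_k]                         # highest-scoring items first
```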
Federated Reconstruction learns global parameters that enable unseen users to rebuild local parameters quickly and accurately; in other words, it is learning to learn local parameters. This establishes a connection to meta-learning.
Federated Reconstruction also allows developers to personalize models for different users while reducing the transmission of model parameters, even for models without user-specific embeddings. At a fixed communication level, it outperforms other personalization methods.
Deployment of Federated Reconstruction in Gboard:
Gboard is a popular mobile keyboard app with millions of users. Gboard users share expressions such as stickers and GIFs to communicate with others. Because preferences for these expressions vary widely across users, the setting is a natural fit for matrix factorization to predict new expressions a user may want to share.
Federated Reconstruction was used to train a matrix factorization model over user-expression co-occurrences, keeping each Gboard user's embeddings local to their device. The model was then deployed in Gboard, resulting in a 23.9 percent increase in click-through rate.
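As a rough illustration of the training signal (not Gboard's actual pipeline), one user's expression-share counts can be turned into the (item, label) examples consumed by the reconstruction and update steps sketched earlier. The log-scaled label is an assumption standing in for whatever target the production model uses.

```python
import numpy as np

def cooccurrence_examples(expression_counts):
    """Convert one user's expression-share counts into implicit-feedback examples.

    expression_counts: dict mapping expression_id -> number of times shared.
    Returns (expression_id, label) pairs with a simple log-scaled label.
    """
    return [(expr_id, np.log1p(count)) for expr_id, count in expression_counts.items()]

# Example: this device's local training data never leaves the device.
local_examples = cooccurrence_examples({42: 7, 105: 1, 9: 3})
```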
The Road Ahead:
Federated Reconstruction is still being explored on several fronts. Preliminary findings show that it enables personalization for diverse users while reducing the communication of privacy-sensitive parameters. Following Google's AI Principles, the research team scaled this technique to Gboard, and many users have benefitted from improved recommendations as a result.
Paper: https://arxiv.org/pdf/2102.03448.pdf
GitHub: https://github.com/google-research/federated/tree/master/reconstruction
Reference: https://ai.googleblog.com/2021/12/a-scalable-approach-for-partially-local.html