Compatible With Any Programming Language And Machine Learning Framework: Flower Team Releases Flower 0.18 With Cool New Updates For Federated Learning
Flower is an end-to-end federated learning framework that enables a smoother transition from simulation-based experimental research to system research on large numbers of real-world edge devices. Flower is strong in both domains (simulation and real-world devices) and can switch back and forth between the two extremes as needed during exploration and development. In the Flower paper, the researchers present the use cases that shaped their viewpoint, the design goals, the resulting framework architecture, and comparisons with other frameworks.
Federated Learning (FL) has proven to be a viable way for edge devices to cooperatively train a shared prediction model while keeping their training data on the device, decoupling the ability to do machine learning from the need to store data in the cloud. In practice, however, FL is challenging to implement at scale and across heterogeneous systems. Although several research frameworks exist for simulating FL algorithms, none of them support investigating scalable FL workloads on heterogeneous edge devices.
Flower 0.18 released
Thanks to a longer release gap than usual, the latest Flower release packs more upgrades than any previous one. Thanks also to the wonderful community for its continued support and generosity.
How to Upgrade to v0.18
pip install -U flwr
What’s Included in Flower 0.18?
- Compatibility with Jupyter Notebook / Google Colab has been improved in the Virtual Client Engine.
After upgrading Flower with the simulation extra (pip install flwr[simulation]), simulations that use the Virtual Client Engine through start_simulation run more smoothly in Jupyter Notebooks, including Google Colab.
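The following is a minimal sketch (not the official example) of running a simulation from a notebook cell, assuming pip install flwr[simulation] has been run. The toy client returns dummy values, and keyword names such as num_clients and num_rounds should be checked against the installed Flower version.
import flwr as fl

class ToyClient(fl.client.NumPyClient):
    def get_parameters(self):
        return []  # model weights as a list of NumPy ndarrays

    def fit(self, parameters, config):
        return parameters, 1, {}  # (updated weights, num examples, metrics)

    def evaluate(self, parameters, config):
        return 0.0, 1, {}  # (loss, num examples, metrics)

def client_fn(cid: str):
    # Called by the Virtual Client Engine whenever client `cid` is needed
    return ToyClient()

history = fl.simulation.start_simulation(
    client_fn=client_fn,
    num_clients=10,
    num_rounds=3,
)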
- A new Jupyter Notebook code sample is now available.
A new code example (quickstart_simulation) demonstrates Flower simulations that use the Virtual Client Engine from Jupyter Notebook (including Google Colab). It has never been easier to use Flower.
- Client-side properties (feature preview)
Clients can now implement a new get_properties method so that server-side strategies can query client properties.
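A hedged sketch of the feature preview: a NumPyClient that exposes properties a server-side strategy can query. The exact get_properties signature (a config dict in, a dict of scalar properties out) is an assumption based on the 0.18 release notes.
import flwr as fl

class PropertiesClient(fl.client.NumPyClient):
    def get_properties(self, config):
        # Report static client-side facts, e.g. dataset size or hardware
        return {"num_examples": 5000, "has_gpu": False}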
- Try out Android support with TFLite
Android support has landed in the main branch! Flower is, by design, both client-agnostic and framework-agnostic, so any client platform can be integrated; with this release, using Flower on Android has become much more accessible.
TFLite is used on the client side, paired with a new FedAvgAndroid strategy on the server side. The Android client and FedAvgAndroid are currently in beta. Still, they are the first step toward a full-fledged Android SDK and a unified FedAvg implementation that integrates the new functionality from FedAvgAndroid.
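A hedged server-side sketch for the Android preview: starting a Flower server with the new FedAvgAndroid strategy. Default strategy arguments are used here, and the dict-based num_rounds config follows the pre-1.0 API; both are assumptions to verify against the installed version.
import flwr as fl

strategy = fl.server.strategy.FedAvgAndroid()

fl.server.start_server(
    server_address="0.0.0.0:8080",
    config={"num_rounds": 3},
    strategy=strategy,
)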
- Make the gRPC keepalive time customizable by the user and reduce the default keepalive time.
The default gRPC keepalive time has been reduced to make Flower compatible with more cloud environments (for example, Microsoft Azure). Users can adjust the keepalive time to tailor the gRPC stack to their individual needs.
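For context, here is a generic gRPC illustration (not Flower's own API) of what the keepalive time controls: how often an idle connection sends a ping so that NAT and load-balancer mappings stay alive. How Flower 0.18 surfaces this setting is version-specific, so treat the option names below as a plain-gRPC sketch.
import grpc

channel = grpc.insecure_channel(
    "server.example.com:8080",
    options=[
        ("grpc.keepalive_time_ms", 30_000),     # send a keepalive ping every 30 seconds
        ("grpc.keepalive_timeout_ms", 10_000),  # wait up to 10 seconds for the ack
    ],
)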
- Opacus and PyTorch are used in a new differential privacy example.
A new code example (opacus) demonstrates differentially private federated learning with Opacus, PyTorch, and Flower.
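A hedged sketch of the core Opacus step that would run inside a Flower client's fit(): wrapping the model, optimizer, and data loader so that local training is differentially private. The tiny random dataset and the hyperparameters are placeholders; the official opacus example in the Flower repository is the authoritative reference.
import torch
from opacus import PrivacyEngine
from torch.utils.data import DataLoader, TensorDataset

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
train_loader = DataLoader(
    TensorDataset(torch.randn(64, 10), torch.randint(0, 2, (64,))), batch_size=8
)

privacy_engine = PrivacyEngine()
model, optimizer, train_loader = privacy_engine.make_private(
    module=model,
    optimizer=optimizer,
    data_loader=train_loader,
    noise_multiplier=1.0,  # scale of Gaussian noise added to gradients
    max_grad_norm=1.0,     # per-sample gradient clipping bound
)
# The returned triple is then used in fit() just like its non-private counterparts.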
- Code sample for Hugging Face Transformers
A new code example (quickstart_huggingface) demonstrates the use of Hugging Face Transformers with Flower.
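A minimal sketch of the idea: a NumPyClient that exchanges a transformer's weights as NumPy arrays. The distilbert-base-uncased checkpoint is an assumed placeholder and the local training/evaluation loops are omitted; the quickstart_huggingface example in the Flower repository is the full version.
from collections import OrderedDict

import flwr as fl
import torch
from transformers import AutoModelForSequenceClassification

MODEL_NAME = "distilbert-base-uncased"  # any sequence-classification checkpoint

class HFClient(fl.client.NumPyClient):
    def __init__(self):
        self.model = AutoModelForSequenceClassification.from_pretrained(
            MODEL_NAME, num_labels=2
        )

    def get_parameters(self):
        return [val.cpu().numpy() for val in self.model.state_dict().values()]

    def set_parameters(self, parameters):
        keys = self.model.state_dict().keys()
        state_dict = OrderedDict(
            {k: torch.tensor(v) for k, v in zip(keys, parameters)}
        )
        self.model.load_state_dict(state_dict, strict=True)

    def fit(self, parameters, config):
        self.set_parameters(parameters)
        # ... local fine-tuning on the client's text partition goes here ...
        return self.get_parameters(), 1, {}

    def evaluate(self, parameters, config):
        self.set_parameters(parameters)
        # ... local evaluation goes here ...
        return 0.0, 1, {}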
- A new MLCube code sample is now available.
A new code example (quickstart_mlcube) demonstrates the use of MLCube with Flower.
- SSL-enabled client and server
SSL enables secure, encrypted communication between clients and the server. This release open-sources Flower's secure gRPC implementation, giving all Flower users access to encrypted communication channels.
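A hedged sketch of SSL-enabled gRPC, based on the documented approach of passing PEM files as bytes. The certificate file paths and the client class are placeholders, and the certificates/root_certificates parameter names should be checked against the installed Flower version.
from pathlib import Path

import flwr as fl

# Server side: CA certificate, server certificate, and server private key
fl.server.start_server(
    server_address="0.0.0.0:8080",
    config={"num_rounds": 3},
    certificates=(
        Path("certificates/ca.crt").read_bytes(),
        Path("certificates/server.pem").read_bytes(),
        Path("certificates/server.key").read_bytes(),
    ),
)

# Client side: only the CA certificate is needed to verify the server
fl.client.start_numpy_client(
    server_address="localhost:8080",
    client=MyNumPyClient(),  # placeholder: a NumPyClient defined elsewhere
    root_certificates=Path("certificates/ca.crt").read_bytes(),
)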
- Updated FedAdam and FedYogi strategies
FedAdam and FedYogi now implement the latest version of the Adaptive Federated Optimization paper.
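A hedged sketch of configuring the server-side FedAdam strategy (FedYogi is analogous). The dummy zero weights stand in for real initial model parameters, and the weights_to_parameters helper plus the hyperparameter names follow the pre-1.0 API, so verify them against the installed version.
import numpy as np
import flwr as fl

initial_weights = [np.zeros((10, 2)), np.zeros(2)]  # placeholder model weights

strategy = fl.server.strategy.FedAdam(
    initial_parameters=fl.common.weights_to_parameters(initial_weights),
    eta=0.01,     # server-side learning rate
    beta_1=0.9,   # first-moment decay
    beta_2=0.99,  # second-moment decay
)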
- Provide a list of client IDs to start_simulation.
A list of client IDs (clients_ids, type: List[str]) can now be passed to start_simulation. Whenever a client needs to be initialized, its ID is passed to client_fn, making it easier to load data partitions that are not accessible through int identifiers.
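A hedged sketch of the new clients_ids argument: string IDs are handed to client_fn, so each client can locate its partition by name instead of by an integer index. load_partition and NamedClient are hypothetical helpers, and the num_rounds keyword follows the pre-1.0 start_simulation signature.
import flwr as fl

client_ids = ["hospital-a", "hospital-b", "hospital-c"]

def client_fn(cid: str):
    # cid is one of the strings above
    data = load_partition(cid)  # hypothetical: load this client's data by name
    return NamedClient(data)    # hypothetical: a NumPyClient defined elsewhere

history = fl.simulation.start_simulation(
    client_fn=client_fn,
    clients_ids=client_ids,
    num_rounds=3,
)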
- Minor alterations
- In PyTorch code examples, update the num_examples calculation.
- Expose the Flower version through flwr.__version__.
- In app.py, start_server now returns a History object with training metrics.
- Configurable max_workers (used by ThreadPoolExecutor)
- In all code examples, increase the sleep time after the server starts to three seconds.
- Documentation now includes a new FAQ section.
- Plus a slew of other under-the-hood tweaks, library upgrades, documentation updates, and tooling enhancements!
Paper: https://arxiv.org/pdf/2007.14390.pdf
Github: https://github.com/adap/flower
Reference: https://flower.dev/blog/2022-03-02-flower-0.18-release