Meta AI Open Sources Flashlight: Fast and Flexible Machine Learning Toolkit in C++

While deep learning and machine learning (ML) frameworks perform well, customizing their underlying components has always been challenging. Low-level internals can be unintentionally obfuscated, closed-source, or hand-tuned for specific purposes, making it difficult and time-consuming to find the right code to alter.

To fuel ground-breaking research, Meta AI (FAIR) developed Flashlight, a new open-source ML toolkit written in C++ that lets teams quickly and efficiently modify deep learning and ML frameworks to better suit their needs.

Flashlight was built from the ground up to be fully customizable by the user. It is easy to use because it bundles the fundamental elements of a research environment. Thanks to its simple design and lack of language bindings, rebuilding the entire Flashlight library and its training pipelines takes only a few seconds whenever its core components are modified.

Since modern C++ offers first-class parallelism and out-of-the-box speed, Flashlight has extremely low framework overhead. Low-level domain-specific languages and libraries integrate easily with Flashlight thanks to its simple bridges.
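As a minimal sketch of such a bridge (assuming Flashlight's ArrayFire backend; myCustomKernel is a hypothetical stand-in for an external low-level routine), a tensor's raw device buffer can be borrowed and returned using ArrayFire's device()/unlock() ownership protocol:

```cpp
#include <arrayfire.h>

// Hypothetical stand-in for a hand-written kernel launcher; a real version
// would dispatch a CUDA/OpenCL kernel on this device buffer.
void myCustomKernel(float* /*data*/, size_t /*n*/) {}

void bridgeToLowLevelCode() {
  af::array a = af::randu(1024);    // tensor managed by ArrayFire
  float* raw = a.device<float>();   // lock the array and get its raw buffer
  myCustomKernel(raw, static_cast<size_t>(a.elements()));
  a.unlock();                       // return buffer ownership to ArrayFire
}
```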

Flashlight is built on a simple stack of modular, easily understood abstractions. At the base, the team adopted the ArrayFire tensor library, which supports dynamic tensor shapes and types and does away with rigid compile-time specifications and C++ templates. As an added bonus, ArrayFire's efficient just-in-time (JIT) compiler optimizes operations on the fly.
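A minimal sketch of that JIT behavior, using only the public ArrayFire C++ API: elementwise expressions are recorded lazily and fused into a single kernel that runs only when a result is forced.

```cpp
#include <arrayfire.h>

int main() {
  af::array x = af::randu(4096, 4096); // shape chosen at runtime, not compile time
  af::array y = 2.0f * x + 1.0f;       // recorded by the JIT, not yet executed
  af::array z = af::sin(y) * y;        // fused with the operations above
  af::eval(z);                         // force the fused kernel to run
  af::sync();                          // wait for completion
  return 0;
}
```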

Flashlight extends these fundamentals with specialized memory managers and application programming interfaces (APIs) for distributed and mixed-precision training. It combines modular abstractions for working with data and training at scale with a fast, lightweight autograd: the deep learning standard that automatically computes derivatives of the chained operations common in deep neural networks. Whether your focus is deep learning or another field of study, these components are broadly useful.
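A minimal autograd sketch, assuming the ArrayFire-backed fl::Variable API from the v0.3-era releases (details may differ across versions): derivatives of a small chain of operations are computed automatically by reverse mode.

```cpp
#include <flashlight/fl/flashlight.h>

int main() {
  fl::init(); // initialize the backend (required in recent releases)
  auto x = fl::Variable(af::randu(3, 3), /* calcGrad = */ true);
  auto y = x * x + 2.0 * x;             // build the chained computation
  auto loss = fl::sum(y, {0, 1});       // reduce to a scalar
  loss.backward();                      // reverse-mode differentiation
  af::print("dy/dx", x.grad().array()); // expect 2x + 2
  return 0;
}
```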

Lightweight domain applications in Flashlight's single codebase facilitate research in areas as diverse as speech recognition, language modeling, image classification, and segmentation. This design lets Flashlight support multimodal research without stitching together numerous independent domain-specific libraries: a change requires only a single incremental rebuild rather than modifying and rebuilding each upstream domain-specific framework.

Flashlight allows researchers to work in C++ without configuring external fixtures or bindings and without adapters for threading, memory mapping, or low-level hardware interoperability. This makes it straightforward to incorporate high-performance parallel code, as the sketch below illustrates.
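The sketch assumes a C++17 toolchain with parallel-algorithm support (e.g. GCC with TBB): standard-library parallelism fills a host buffer that drops straight into an ArrayFire array, with no binding layer in between.

```cpp
#include <algorithm>
#include <execution>
#include <numeric>
#include <vector>
#include <arrayfire.h>

int main() {
  std::vector<float> host(1 << 20);
  std::iota(host.begin(), host.end(), 0.0f); // fill 0, 1, 2, ... sequentially
  // Square every element in parallel with a standard C++17 algorithm;
  // no framework-specific adapter or binding layer is involved.
  std::transform(std::execution::par, host.begin(), host.end(), host.begin(),
                 [](float v) { return v * v; });
  // Hand the host buffer straight to the tensor library.
  af::array a(static_cast<dim_t>(host.size()), host.data());
  af::print("first five", a(af::seq(0, 4)));
  return 0;
}
```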

The team hopes their work will encourage the AI community to optimize deep learning and ML frameworks for the available hardware and to explore their performance limits.

This article is written as a research summary by Marktechpost Staff based on the research paper 'Flashlight: Enabling Innovation in Tools for Machine Learning'. All credit for this research goes to the researchers on this project. Check out the paper and GitHub link.

Tanushree Shenwai is a consulting intern at Marktechpost. She is currently pursuing her B.Tech from the Indian Institute of Technology (IIT), Bhubaneswar. She is a data science enthusiast with a keen interest in the applications of artificial intelligence across fields, and is passionate about exploring new advancements in technology and their real-life applications.

