How Can Meta-Learning, Self-Attention And JAX Power The Next Generation of Evolutionary Optimizers?

Black-box optimization methods are used in every domain, from Artificial Intelligence and Machine Learning to engineering and finance. They are used to optimize functions when no algebraic model is available. Black-box optimization concerns the design and analysis of algorithms for problems in which the structure of the objective function, or of the constraints defining the feasible set, is unknown or cannot be exploited. Given a set of input parameters, a black-box optimization method searches for the function's optimum by iteratively evaluating the function at points in the input space and homing in on the point that yields the best output.
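To make this setting concrete, the sketch below (my own illustration, not taken from the paper) runs a plain random search against a stand-in objective: the optimizer only ever queries the function at candidate points and keeps the best value seen, which is all a black-box method is allowed to do.

```python
import jax
import jax.numpy as jnp

def sphere(x):
    # Hypothetical stand-in for an arbitrary black-box objective:
    # only function evaluations are available, no gradients or structure.
    return jnp.sum(x ** 2)

def random_search(key, num_iters=200, dim=5, scale=1.0):
    best_x, best_f = jnp.zeros(dim), jnp.inf
    for _ in range(num_iters):
        key, sub = jax.random.split(key)
        x = scale * jax.random.normal(sub, (dim,))  # propose a candidate point
        f = sphere(x)                               # query the black box
        if f < best_f:                              # keep the best point seen so far
            best_x, best_f = x, f
    return best_x, best_f

best_x, best_f = random_search(jax.random.PRNGKey(0))
```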

Though gradient descent is the most widely used optimization approach for deep learning models, it is not suitable for every problem. In cases where gradients cannot be computed directly, or where an accurate analytical form of the objective function is unknown, other approaches such as Evolution Strategies (ES) are used. Evolution Strategies belong to the family of evolutionary algorithms, a class of population-based optimization methods inspired by natural selection. In essence, an ES is a black-box optimization method that operates by refining a sampling distribution based on the fitness of candidate solutions, using a set of hand-crafted update equations.
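As a reference point for what such a hand-crafted ES looks like, here is a minimal Gaussian evolution strategy sketched in JAX (an illustration under my own simplifying assumptions, not the paper's code): candidates are sampled around a mean, evaluated on the black-box objective, and the mean is pulled toward a fitness-weighted average of the population.

```python
import jax
import jax.numpy as jnp

def rastrigin(x):
    # Hypothetical black-box fitness function (lower is better).
    return 10 * x.shape[-1] + jnp.sum(x**2 - 10 * jnp.cos(2 * jnp.pi * x), axis=-1)

def simple_es(key, dim=10, popsize=32, sigma=0.5, lrate=0.5, generations=100):
    mean = jnp.zeros(dim)                                      # mean of the sampling distribution
    for _ in range(generations):
        key, sub = jax.random.split(key)
        noise = jax.random.normal(sub, (popsize, dim))
        population = mean + sigma * noise                      # sample candidate solutions
        fitness = jax.vmap(rastrigin)(population)              # black-box evaluations
        ranks = jnp.argsort(jnp.argsort(fitness))              # rank-based, order-invariant scores
        weights = jax.nn.softmax(-ranks.astype(jnp.float32))   # better ranks get larger weights
        mean = mean + lrate * (weights @ (population - mean))  # refine the distribution
    return mean

mean = simple_es(jax.random.PRNGKey(0))
```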

In a new AI paper, researchers from DeepMind have introduced meta-black-box optimization (MetaBBO), a way to use machine learning to learn ES update rules from data and thereby make ES more flexible, adaptable, and scalable. MetaBBO works by meta-learning a neural network parametrization of a black-box optimization update rule. Using MetaBBO, the researchers discovered a new type of ES called the learned evolution strategy (LES). LES is parametrized as a Set Transformer, so its updates depend on the fitness of the candidates but are invariant to the ordering of the candidate solutions within a batch of black-box evaluations. After meta-training, LES learns either to select the best-performing solutions or to update the search distribution using a moving average.
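The sketch below conveys the flavor of such an attention-based, permutation-invariant update rule. It is a simplified stand-in written under my own assumptions, not DeepMind's exact LES/Set Transformer architecture: each candidate is summarized by order-invariant fitness features, a single self-attention layer mixes information across the population, and the resulting recombination weights decide how strongly each candidate moves the search mean.

```python
import jax
import jax.numpy as jnp

def fitness_features(fitness):
    # Per-candidate features that do not depend on the ordering of the population.
    z = (fitness - fitness.mean()) / (fitness.std() + 1e-8)
    ranks = jnp.argsort(jnp.argsort(fitness)) / (fitness.shape[0] - 1) - 0.5
    return jnp.stack([z, ranks], axis=-1)                        # (popsize, 2)

def attention_recombination(params, fitness):
    # Single-head self-attention over the population; permutation-equivariant by construction.
    feats = fitness_features(fitness)                            # (popsize, 2)
    q, k, v = feats @ params["Wq"], feats @ params["Wk"], feats @ params["Wv"]
    attn = jax.nn.softmax(q @ k.T / jnp.sqrt(q.shape[-1]), axis=-1)
    scores = (attn @ v) @ params["w_out"]                        # (popsize,)
    return jax.nn.softmax(scores)                                # recombination weights, sum to 1

def les_style_update(params, mean, population, fitness, lrate=1.0):
    weights = attention_recombination(params, fitness)
    return mean + lrate * (weights @ (population - mean))        # fitness-driven mean update

# Hypothetical parameters for illustration; in MetaBBO these would themselves be meta-learned.
k1, k2, k3, k4 = jax.random.split(jax.random.PRNGKey(0), 4)
params = {"Wq": jax.random.normal(k1, (2, 8)), "Wk": jax.random.normal(k2, (2, 8)),
          "Wv": jax.random.normal(k3, (2, 8)), "w_out": jax.random.normal(k4, (8,))}
```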


The proposed solution involves discovering effective update rules for evolution strategies (ES) through meta-learning. Some of the major contributions are:

  1. A self-attention-based Evolution Strategy parametrization is introduced, which makes it possible to meta-learn black-box optimization algorithms.
  2. The approach outperforms existing hand-crafted ES algorithms on neuroevolution tasks and generalizes across optimization problems, compute resources, and search space dimensions.
  3. The researchers found that meta-evolving a good ES requires only a small number of core optimization problem classes at meta-training time, including separable, multi-modal, and highly conditioned functions.
  4. Removing the black-box components recovers an interpretable strategy, and the analysis indicates that all neural network components positively influence the search strategy’s early performance (a hedged sketch of such a distilled update follows this list).
  5. The discovered evolution strategy is a highly competitive alternative to traditional ES methods and is easy to implement.
  6. The study also shows how a novel LES can be created from scratch, starting from a randomly initialized network, which allows for self-referential meta-learning of its own weights.
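As referenced in point 4, once the learned black-box components are stripped away, what remains can be read as a simple, interpretable rule. The following is a hypothetical sketch of such a distilled update (my own reconstruction, not the paper's exact formula): fitness ranks are turned into softmax weights, and the search distribution's mean and scale are updated as recency-weighted moving averages, echoing the moving-average behavior described above.

```python
import jax
import jax.numpy as jnp

def interpretable_es_step(mean, sigma, population, fitness,
                          temperature=10.0, alpha_mean=0.9, alpha_sigma=0.9):
    # Hypothetical hyperparameters; in the paper these would come from the meta-learned LES.
    # Softmax over normalized fitness ranks: better candidates get larger weights.
    ranks = jnp.argsort(jnp.argsort(fitness)) / (fitness.shape[0] - 1)
    weights = jax.nn.softmax(-temperature * ranks)
    # Recency-weighted (moving-average) updates of the search distribution.
    new_mean = (1 - alpha_mean) * mean + alpha_mean * (weights @ population)
    new_sigma = (1 - alpha_sigma) * sigma + alpha_sigma * jnp.sqrt(
        weights @ (population - new_mean) ** 2)
    return new_mean, new_sigma
```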

In conclusion, this study shows that meta-learning can be used to discover effective update rules for evolution strategies. In this way, meta-learning and self-attention are promising ingredients for the next generation of evolutionary optimizers.


Check out the Paper. All Credit For This Research Goes To the Researchers on This Project.


Tanya Malhotra is a final year undergrad from the University of Petroleum & Energy Studies, Dehradun, pursuing BTech in Computer Science Engineering with a specialization in Artificial Intelligence and Machine Learning.
She is a Data Science enthusiast with strong analytical and critical thinking skills, along with an ardent interest in acquiring new skills, leading groups, and managing work in an organized manner.


