A New Study Combines Recurrent Neural Networks (RNNs) with the Concept of Annealing to Address Real-World Optimization Problems

Source: https://arxiv.org/pdf/2101.10154.pdf

Optimization problems involve determining the best feasible solution from a set of candidates, and they arise frequently both in real-world situations and in most scientific research domains. However, many complicated problems either cannot be solved with simple computing methods or would take an excessive amount of time to solve.

Because simple algorithms are ineffective at addressing these issues, experts around the world have been working to develop more efficient strategies that can solve them within realistic timeframes. Artificial neural networks (ANNs) are at the heart of some of the most promising techniques explored thus far.

A new study conducted by researchers at the Vector Institute, the University of Waterloo, and the Perimeter Institute for Theoretical Physics in Canada presents variational neural annealing, an optimization method that combines recurrent neural networks (RNNs) from natural language processing (NLP) with the notion of annealing. The technique uses a parameterized model to represent the distribution of feasible solutions to a given problem, with the goal of addressing real-world optimization problems.
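
At a high level, the classical variant described in the paper has the RNN define a distribution p over candidate solutions and trains it by minimizing a variational free energy (notation paraphrased here, not copied from the paper):

```latex
F_\lambda(t) \;=\; \langle H(\boldsymbol{x}) \rangle_{\boldsymbol{x} \sim p_\lambda} \;-\; T(t)\, S(p_\lambda)
```

Here H is the cost (energy) function of the optimization problem, S is the Shannon entropy of the model's distribution, and the temperature T(t) is slowly annealed from a high initial value toward zero, so the objective interpolates from broad, entropy-driven exploration to pure energy minimization.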

The proposed framework is based on the annealing principle, inspired by metallurgical annealing, in which a material is heated and then slowly cooled to bring it into a lower-energy state that is more resilient and stable. Simulated annealing, which seeks numerical solutions to optimization problems, was developed in analogy to this process.
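
To make the analogy concrete, here is a minimal, generic simulated annealing loop in Python. This is not the authors' code; the energy function, neighbor move, and geometric cooling schedule are illustrative assumptions:

```python
import math
import random

def simulated_annealing(energy, neighbor, x0, T0=10.0, T_min=1e-3, cooling=0.995):
    """Generic simulated annealing: always accept downhill moves, accept
    uphill moves with probability exp(-dE/T), and cool T slowly so the
    search settles into a low-energy state, mimicking metallurgical annealing."""
    x, e, T = x0, energy(x0), T0
    best_x, best_e = x, e
    while T > T_min:
        x_new = neighbor(x)
        e_new = energy(x_new)
        # Metropolis acceptance criterion
        if e_new < e or random.random() < math.exp(-(e_new - e) / T):
            x, e = x_new, e_new
            if e < best_e:
                best_x, best_e = x, e
        T *= cooling  # geometric cooling schedule
    return best_x, best_e

# Toy usage: minimize a 1D function with a Gaussian random-walk move.
x, e = simulated_annealing(
    energy=lambda x: (x - 3.0) ** 2,
    neighbor=lambda x: x + random.gauss(0.0, 0.5),
    x0=0.0,
)
print(f"x = {x:.3f}, energy = {e:.4f}")
```

The high initial temperature lets the search escape local minima early on; as the temperature drops, uphill moves become rare and the search settles into a low-energy solution.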

The method's key distinguishing feature is that it combines the efficiency and processing capacity of ANNs with the benefits of simulated annealing techniques. The team used RNNs, a class of models that has proven especially promising in NLP applications. While these models are typically used in NLP to interpret human language, the researchers repurposed them to address optimization problems.
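
As an illustration of that repurposing, the hypothetical PyTorch sketch below (a toy, not the authors' code) samples a string of binary decision variables the way a language model samples words: one "token" at a time, each conditioned on the prefix generated so far. The network size and variable count are arbitrary assumptions:

```python
import torch
import torch.nn as nn

class AutoregressiveRNN(nn.Module):
    """Samples binary decision variables x_1..x_N one at a time, the way an
    NLP language model samples words: each step is conditioned on the
    variables generated so far."""
    def __init__(self, n_vars, hidden=32):
        super().__init__()
        self.n_vars = n_vars
        self.hidden = hidden
        self.cell = nn.GRUCell(1, hidden)
        self.head = nn.Linear(hidden, 1)  # logit for P(x_i = 1 | x_<i)

    def sample(self, batch):
        h = torch.zeros(batch, self.hidden)
        x_prev = torch.zeros(batch, 1)
        xs, log_ps = [], []
        for _ in range(self.n_vars):
            h = self.cell(x_prev, h)
            p1 = torch.sigmoid(self.head(h)).squeeze(-1)
            x = torch.bernoulli(p1.detach())          # sample the next "token"
            log_ps.append(torch.where(x == 1, p1, 1 - p1).log())
            xs.append(x)
            x_prev = x.unsqueeze(-1)
        # Return configurations and their log-probabilities under the model.
        return torch.stack(xs, dim=1), torch.stack(log_ps, dim=1).sum(dim=1)

model = AutoregressiveRNN(n_vars=10)
solutions, log_p = model.sample(batch=5)
print(solutions)       # five candidate solutions, each a string of 10 bits
print(log_p.exp())     # their probabilities under the model
```

Because the model assigns an exact, tractable probability to every configuration it generates, its entropy and expected energy can both be estimated from samples, which is what makes the variational annealing objective trainable.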

Compared to more traditional numerical annealing implementations, their RNN-based method produced better solutions, increasing the efficiency of both classical and quantum annealing procedures. Using autoregressive networks, the researchers were able to encode the annealing paradigm. Their strategy takes optimization problem solving to a new level by directly exploiting the infrastructure used to train modern neural networks, such as TensorFlow or PyTorch, both of which support GPU and TPU acceleration.
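
Schematically, training such a model with a standard deep learning framework might look like the following sketch, which reuses the hypothetical AutoregressiveRNN class from the previous snippet. It minimizes the variational free energy via a REINFORCE-style gradient while annealing the temperature toward zero; the toy Ising-chain energy, linear schedule, and hyperparameters are all illustrative assumptions rather than the paper's actual setup:

```python
import torch

# Reuses the hypothetical AutoregressiveRNN class from the previous sketch.
# Toy cost function: a 1D Ising chain, H(x) = -sum_i s_i * s_{i+1} with
# s = 2x - 1; the paper benchmarks much harder spin-glass-type problems.
def energy(x):
    s = 2.0 * x - 1.0                        # map bits {0,1} to spins {-1,+1}
    return -(s[:, :-1] * s[:, 1:]).sum(dim=1)

model = AutoregressiveRNN(n_vars=20)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

T0, n_steps, batch = 2.0, 2000, 64           # illustrative hyperparameters
for step in range(n_steps):
    T = T0 * (1.0 - step / n_steps)          # linear annealing schedule
    x, log_p = model.sample(batch)
    # Per-sample "local" free energy; its mean estimates F = <H> - T * S.
    f_loc = energy(x) + T * log_p
    # REINFORCE-style gradient with a mean baseline to reduce variance.
    loss = (log_p * (f_loc - f_loc.mean()).detach()).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# After annealing T toward zero, the model concentrates its probability
# mass on low-energy configurations; sampling it yields candidate optima.
with torch.no_grad():
    x, _ = model.sample(512)
    print("best energy found:", energy(x).min().item())
```

Because the whole loop is ordinary gradient-based training, it runs unchanged on the GPU and TPU infrastructure these frameworks already provide.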

The team ran multiple numerical-simulation tests comparing the method's performance with that of traditional annealing optimization techniques. On many paradigmatic optimization problems, the proposed approach outperformed all of the baseline techniques.

In the future, this algorithm could be applied to a wide range of real-world optimization problems, allowing experts in various fields to solve difficult problems more quickly.

The researchers would like to evaluate their algorithm's performance on more realistic problems, as well as compare it against existing state-of-the-art optimization techniques. They also intend to improve the technique by swapping out some of its components or incorporating new ones.

Paper: http://dx.doi.org/10.1038/s42256-021-00401-3

Reference: https://techxplore.com/news/2021-11-neural-network-based-optimization-technique-principle.html
