Baidu Researchers Propose a Quantum Self-Attention Neural Network (QSANN) by Introducing the Self-Attention Mechanism to Quantum Neural Networks

This Article Is Based On The Research Paper 'Quantum Self-Attention Neural Networks for Text Classification'. All Credit For This Research Goes To The Researchers of This Project 👏👏👏


As computational problems grow in size, some of them become infeasible for classical computers to handle. Unlike classical systems, which rely on binary bits, quantum systems operate on qubits. A qubit can exist in a superposition of the 0 and 1 states, and multiple qubits can be entangled so that their joint behavior has no classical counterpart. Quantum computers exploit these properties, and for certain tasks they can reduce processing time dramatically.
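For readers unfamiliar with the qubit model, the minimal NumPy sketch below (an illustration for this article, not part of the paper) contrasts a classical bit with a single qubit prepared in an equal superposition; measuring the qubit yields 0 or 1 with probabilities given by the squared amplitudes.

```python
import numpy as np

# Classical bit: exactly one of two values at any time.
bit = 0

# Qubit: a normalized 2-component complex state vector
# |psi> = alpha|0> + beta|1>, with |alpha|^2 + |beta|^2 = 1.
alpha = beta = 1 / np.sqrt(2)                  # equal superposition of |0> and |1>
qubit = np.array([alpha, beta], dtype=complex)

# Measurement collapses the qubit to 0 or 1 with these probabilities.
p0, p1 = np.abs(qubit) ** 2
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")     # P(0) = 0.50, P(1) = 0.50
```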

Because of its enormous potential for solving complicated real-world problems in optimization, cryptography, chemistry, and the emerging field of quantum natural language processing (QNLP), quantum computing has become an increasingly attractive field of study. Most existing QNLP proposals, while cutting-edge, do not scale well because they depend on syntactic analysis, a time-consuming preprocessing step, especially on large real-world datasets.

A team of researchers from the Baidu Research Institute for Quantum Computing and the University of Technology Sydney addressed these limitations in their new paper, Quantum Self-Attention Neural Networks for Text Classification. They propose a simple and effective quantum self-attention neural network (QSANN) architecture that outperforms existing QNLP methods and a classical self-attention baseline on large-scale real-world datasets.

The research paper boils down to 3 broad contributions:

  1. A QNLP algorithm based on the self-attention mechanism, with a detailed circuit implementation scheme suited to NISQ devices.
  2. A Gaussian projected quantum self-attention that can efficiently dig out correlations between words in high-dimensional quantum feature spaces (see the sketch after this list).
  3. Experimental results demonstrating that QSANN outperforms existing QNLP methods and that its numerical results are resilient to quantum noise.
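The second contribution refers to the paper's Gaussian projected quantum self-attention, in which attention coefficients are computed as a Gaussian function of measured query and key values rather than as an inner product. The snippet below is a hypothetical classical illustration of that idea; the function name, toy inputs, and row normalization are assumptions made for exposition, not the authors' code.

```python
import numpy as np

def gaussian_attention(q, k):
    """Gaussian-kernel attention coefficients.

    q, k: 1-D arrays of per-word query/key values, e.g. expectation
    values measured from parameterized quantum circuits.
    Returns a row-normalized (n_words x n_words) coefficient matrix.
    """
    diff = q[:, None] - k[None, :]             # pairwise query-key differences
    alpha = np.exp(-diff ** 2)                 # Gaussian kernel instead of dot product
    return alpha / alpha.sum(axis=1, keepdims=True)

# Toy measured query/key values for a 4-word sentence.
q = np.array([0.1, -0.3, 0.8, 0.2])
k = np.array([0.0, -0.2, 0.7, 0.5])
print(gaussian_attention(q, k).round(3))
```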

A quantum self-attention layer (QSAL), a loss function, and analytical gradients make up the proposed QSANN design. QSANN encodes input words into a high-dimensional quantum Hilbert space and then projects them back to a low-dimensional classical feature space via quantum measurement to perform text classification. This lets the researchers exploit a potential quantum advantage, leveraging the high-dimensional quantum feature space and its projected quantum models to detect hidden correlations between words that would be difficult, if not impossible, to find with conventional classical approaches.
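To make the encode-then-measure pipeline concrete, here is a heavily simplified, classically simulated sketch of one such layer. The angle encoding, the single rotation parameter per query/key/value circuit, the choice of Pauli observables, and the residual combination are all illustrative assumptions standing in for the paper's parameterized circuits, not the authors' implementation.

```python
import numpy as np

Z = np.array([[1, 0], [0, -1]], dtype=complex)   # Pauli-Z observable
X = np.array([[0, 1], [1, 0]], dtype=complex)    # Pauli-X observable

def ry(theta):
    """Single-qubit RY rotation gate."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def encode(word_vec):
    """Angle-encode a classical word vector into a product state,
    one qubit per feature (a toy stand-in for the encoder circuit)."""
    return [ry(theta) @ np.array([1, 0], dtype=complex) for theta in word_vec]

def expval(state, obs):
    """Expectation value <psi|obs|psi> of a single-qubit observable."""
    return float(np.real(state.conj() @ obs @ state))

def qsal_forward(words, theta_q, theta_k, theta_v):
    """Toy quantum self-attention layer: per-word query/key scalars and
    value vectors come from measurements on rotated states; outputs are
    Gaussian-attention-weighted sums plus a residual connection."""
    states = [encode(w) for w in words]
    # One scalar per word for query and key, one vector per word for value.
    q = np.array([expval(ry(theta_q) @ s[0], Z) for s in states])
    k = np.array([expval(ry(theta_k) @ s[0], Z) for s in states])
    v = np.array([[expval(ry(theta_v) @ qb, X) for qb in s] for s in states])
    alpha = np.exp(-(q[:, None] - k[None, :]) ** 2)      # Gaussian kernel, as above
    alpha /= alpha.sum(axis=1, keepdims=True)
    return words + alpha @ v                             # residual connection

words = np.random.uniform(-np.pi, np.pi, size=(4, 3))    # 4 words, 3 features each
print(qsal_forward(words, theta_q=0.3, theta_k=-0.5, theta_v=1.1).shape)  # (4, 3)
```

In the actual QSANN, the encoder and the query, key, and value circuits are trainable parameterized quantum circuits optimized with analytical gradients; this toy version only mirrors the data flow.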

Sketch of a quantum self-attention layer (QSAL). Source: https://arxiv.org/pdf/2205.05625.pdf

The researchers evaluated QSANN’s text classification performance against a syntactic-analysis-based quantum model on the simple MC (meaning classification) and RP (relative clause) tasks, and against a classical self-attention neural network (CSANN) and a naive baseline algorithm on the Yelp, IMDb, and Amazon public sentiment analysis datasets. QSANN surpassed the CSANN benchmark on Yelp, IMDb, and Amazon, and achieved 100 percent accuracy on the MC task.

In the future, more sophisticated techniques such as positional encoding and multi-head attention could be incorporated into quantum neural networks for generative models and other more complex tasks. The coming years are likely to see a surge of quantum algorithms aimed at significant real-world problems.

Paper: https://arxiv.org/pdf/2205.05625.pdf

