IBM Researchers Showcase a Non-von Neumann AI Hardware Breakthrough in Neuromorphic Computing That Could Help Machines Recognize Objects the Way Humans Do

This research summary article is based on the paper 'Phase-change memtransistive synapses for mixed-plasticity neural computations'.

Human brains are exceptionally good at learning and remembering new information. This capacity is credited to the way data is stored and processed by the brain's building blocks, synapses and neurons. Artificial intelligence employs neural network elements that mimic the biophysical features of these components.

To distinguish the numerous properties of dynamically changing objects, AI requires extensive training. However, there remain a variety of activities that are simple for humans to complete but demand substantial computational power from AI. Examples include sensory perception, motor learning, and iteratively solving mathematical problems with continuous and sequential input streams.

Researchers from IBM’s Zurich lab have argued that these recognition tasks could be handled better by improving AI hardware. The goal is to use well-known phase-change memory (PCM) technology to develop a new type of artificial synapse. The researchers employ a PCM memtransistive synapse, which combines a memristor, a nonvolatile electronic memory element, with a transistor in a single low-power device. The result is a non-von Neumann, in-memory computing architecture that supports several powerful cognitive frameworks for machine-learning applications, such as short-term spike-timing-dependent plasticity and probabilistic Hopfield neural networks.
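To make "spike-timing-dependent plasticity" concrete, here is a minimal sketch of a generic pairwise STDP rule in Python. It illustrates the kind of local learning rule such devices are meant to support; the rule form and all parameter values are textbook assumptions for illustration, not the update rule reported in the paper.

```python
# Minimal sketch of a pairwise spike-timing-dependent plasticity (STDP) rule.
# This is a generic textbook formulation for illustration only -- it is not
# the update rule used on the IBM memtransistive devices. The parameter
# values (A_PLUS, A_MINUS, TAU) are hypothetical.
import numpy as np

A_PLUS, A_MINUS = 0.01, 0.012   # potentiation / depression amplitudes (assumed)
TAU = 20.0                      # plasticity time constant in ms (assumed)

def stdp_delta_w(t_pre: float, t_post: float) -> float:
    """Weight change for one pre/post spike pair, given spike times in ms."""
    dt = t_post - t_pre
    if dt > 0:    # pre fires before post -> potentiation
        return A_PLUS * np.exp(-dt / TAU)
    else:         # post fires before (or with) pre -> depression
        return -A_MINUS * np.exp(dt / TAU)

# A pre-spike at 10 ms followed by a post-spike at 15 ms strengthens the
# synapse, while the reverse ordering weakens it.
print(stdp_delta_w(10.0, 15.0))   # positive update
print(stdp_delta_w(15.0, 10.0))   # negative update
```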

The main goal behind today’s AI hardware is to produce smaller synapses, or synaptic junctions, so that more of them can fit in a given space. These are designed to mimic the long-term plasticity of biological synapses: the synaptic weights remain constant over time, changing only during updates.

For scalable AI processors, current hardware that can represent the complicated plasticity rules and dynamics of biological synapses relies on intricate transistor circuits, which are bulky and energy-inefficient. Memristive devices, on the other hand, are more efficient, but they lack the properties required to capture the various synaptic processes.

The researchers wanted to build a memristive synapse that could express diverse plasticity rules at the nanoscale, such as long- and short-term plasticity and their combinations, using commercial PCM. They combined non-volatility (from amorphous-crystalline phase transitions) and volatility (from changes in electrostatics) in PCMs.
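As a rough intuition for how non-volatility and volatility combine, the toy Python model below keeps a persistent weight for long-term plasticity alongside a decaying component for short-term plasticity. It is an abstraction of the concept described above, with assumed time constants and update sizes, not a physical model of the PCM device.

```python
# Toy model of a "mixed-plasticity" synapse: a nonvolatile weight (standing in
# for the amorphous-crystalline PCM state) plus a volatile component (standing
# in for the electrostatic, transistor-like state) that decays between events.
# Illustrative abstraction only; decay constant and update sizes are assumed.
from dataclasses import dataclass
import math

@dataclass
class MixedPlasticitySynapse:
    w_long: float = 0.5      # nonvolatile weight -> long-term plasticity
    w_short: float = 0.0     # volatile weight    -> short-term plasticity
    tau_short: float = 50.0  # decay time constant of the volatile part (ms, assumed)

    def decay(self, dt_ms: float) -> None:
        """Volatile component relaxes toward zero between presynaptic events."""
        self.w_short *= math.exp(-dt_ms / self.tau_short)

    def short_term_facilitate(self, amount: float = 0.1) -> None:
        """Transient facilitation caused by a presynaptic spike."""
        self.w_short += amount

    def long_term_update(self, delta: float) -> None:
        """Persistent weight change, e.g. applied during training/consolidation."""
        self.w_long = min(1.0, max(0.0, self.w_long + delta))

    @property
    def effective_weight(self) -> float:
        return self.w_long + self.w_short

syn = MixedPlasticitySynapse()
syn.short_term_facilitate()     # brief boost that will fade
syn.decay(dt_ms=100.0)          # most of the boost has decayed away
syn.long_term_update(+0.05)     # this change persists
print(round(syn.effective_weight, 3))
```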


Although phase-change materials have been studied separately for memory and transistor applications, they have never been combined for neuromorphic computing. The team’s research demonstrates how the devices’ non-volatility enables long-term plasticity while their volatility enables short-term plasticity. Combining the two enables additional mixed-plasticity computations, similar to those in the mammalian brain.

To verify the device’s functionality, the researchers used an adapted version of a sequential-learning algorithm that was created to let spiking networks learn in highly dynamic settings. The idea was then broadened to show how the team emulated different biological processes for meaningful computations. They also demonstrated how the brain’s homeostatic processes could inform the design of effective artificial hardware for complex combinatorial optimization tasks. This was accomplished with stochastic Hopfield neural networks, in which the noise processes at the synaptic junction give the computational algorithms efficiency advantages.
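For readers unfamiliar with the technique, the following is a minimal software sketch of a stochastic Hopfield network, where injected Gaussian noise plays the role the paper attributes to noise at the synaptic junction. The weight matrix and annealing schedule are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of a stochastic Hopfield network for a small quadratic
# optimization problem. Noise is added in software here; in the IBM work the
# randomness would come from the device physics. Weights and schedule assumed.
import numpy as np

rng = np.random.default_rng(0)

# Symmetric weights encoding a small, made-up optimization problem.
W = np.array([[ 0., -1.,  2.,  1.],
              [-1.,  0., -1.,  2.],
              [ 2., -1.,  0., -1.],
              [ 1.,  2., -1.,  0.]])

def energy(s: np.ndarray) -> float:
    """Hopfield energy to be minimized."""
    return -0.5 * s @ W @ s

s = rng.choice([-1.0, 1.0], size=W.shape[0])   # random initial spin state
noise_scale = 2.0
for step in range(200):
    i = rng.integers(W.shape[0])               # pick a random neuron
    local_field = W[i] @ s + rng.normal(0.0, noise_scale)
    s[i] = 1.0 if local_field >= 0 else -1.0   # noisy threshold update
    noise_scale *= 0.98                        # anneal the noise toward zero

print("final state:", s, "energy:", energy(s))
```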

The team’s findings are exploratory rather than system-level demonstrations. While the researchers intend to develop the approach further, they believe the current proof-of-concept results are already of substantial scientific interest to the broader field of neuromorphic engineering, both for computing and for better understanding the brain through more faithful emulations. The devices are straightforward to construct and operate and are based on well-researched technology. The main challenge ahead is at-scale implementation, which involves tying together all the computing primitives and other hardware pieces. The current findings demonstrate the value of mixed-plasticity neural computations in neuromorphic engineering, which could not only make tasks such as visual cognition more human-like but also reduce the cost of expensive training methods.

Paper: https://www.nature.com/articles/s41565-022-01095-3

Reference: https://research.ibm.com/blog/artificial-memtransistive-synapse

