Imperial College London Researchers Develop a New Method That Could Drastically Cut Artificial Intelligence’s (AI’s) Energy Use

Source: https://arxiv.org/pdf/2107.08941.pdf
This article is written as a summary by Marktechpost Staff based on the research paper 'Reconfigurable Training and Reservoir Computing in an Artificial Spin-Vortex Ice via Spin-Wave Fingerprinting'. All credit for this research goes to the researchers of this project. Check out the paper and reference article.


Driven by the growth of the Internet of Things, engineers hope to embed artificial intelligence into all kinds of technology, no matter how small or large. Although current AI development centers on large, complicated models running in massive data centers, the growing demand for ways to run smaller AI applications on remote, power-constrained devices cannot be ignored. For many applications, from wearables to industrial sensors to drones, transmitting data to cloud-based AI systems is impractical because of the inherent latency and privacy concerns. Yet many of these devices are too small to accommodate AI’s high-performance processors, and because they run on batteries or energy harvested from the environment, they cannot meet the high power demands of traditional deep learning methods. A novel “nanomagnetic” computing approach might help. These techniques emerge from research into new hardware and computing technologies that would allow AI to run on small, remote platforms. Much of this work is inspired by the human brain, which performs tremendous processing feats while consuming about as much electricity as a light bulb.

According to a team of Imperial College London scientists, computing with networks of tiny magnets could be a feasible alternative. In their recently published paper, the scientists report that by applying magnetic fields to an array of tiny magnetic elements, a system can analyze complex data and make predictions with a fraction of the power of a traditional computer. Their research centers on a metamaterial, an artificial substance whose precisely constructed internal physical structure gives it properties not found in nature. The team created an “artificial spin system,” a collection of many nanomagnets that interact to produce unusual magnetic behavior. The design consists of a lattice of hundreds of 600-nanometer-long bars of permalloy, a highly magnetic nickel-iron alloy. These bars are arranged in a repeating X pattern whose upper arms are thicker than its lower arms. Typically, an artificial spin system has a single magnetic texture, a characteristic magnetization pattern across its nanomagnets. The Imperial team’s metamaterial, by contrast, has two textures and can switch between them in response to magnetic fields.


The researchers used these qualities to implement reservoir computing, a subtype of AI. Unlike deep learning, where a neural network’s connections are rewired as it learns a task, reservoir computing feeds data into a network with fixed connections and trains only a single output layer to interpret that network’s response. Researchers in the field also aim to replace the fixed network with physical devices such as memristors or oscillators, provided those devices retain specific qualities: a non-linear response to inputs and some memory of past inputs. Because the new artificial spin system meets these requirements, the team employed it as a reservoir for several data-processing tasks. Data is fed in as sequences of magnetic fields, the system’s internal dynamics are allowed to process it, and the final configuration of the nanomagnets is then read out using a technique known as ferromagnetic resonance. Although this readout currently relies on laboratory equipment and is impractical for deployment, the team demonstrated that its device can match leading reservoir computing systems on many prediction tasks involving data that changes over time. Furthermore, the device could also learn quickly from smaller training sets, which is critical in many real-world IoT applications.
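The division of labor described above — a fixed, untrained network that transforms inputs, plus a single trained readout — can be illustrated with a minimal software sketch of an echo state network, a standard form of reservoir computing. This is only an analogy, not the paper’s method: here the reservoir is a random recurrent matrix, whereas in the paper it is the physical dynamics of the nanomagnet array. All sizes, names, and the toy prediction task are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_reservoir = 1, 100
# Fixed, random weights -- these are never trained.
W_in = rng.uniform(-0.5, 0.5, (n_reservoir, n_inputs))
W = rng.uniform(-0.5, 0.5, (n_reservoir, n_reservoir))
# Scale the spectral radius below 1 so past inputs fade gradually
# (the "memory of past inputs" a reservoir must provide).
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the fixed reservoir with input sequence u; collect its states."""
    x = np.zeros(n_reservoir)
    states = []
    for u_t in u:
        # tanh supplies the non-linear response to inputs;
        # the recurrent term W @ x supplies the fading memory.
        x = np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave (time-varying data).
t = np.linspace(0, 20 * np.pi, 2000)
u = np.sin(t)
X = run_reservoir(u[:-1])   # reservoir states for each time step
y = u[1:]                   # target: the next value of the signal

# Train ONLY the linear readout, via closed-form ridge regression.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_reservoir), X.T @ y)

pred = X @ W_out
print("train MSE:", np.mean((pred - y) ** 2))
```

The key point the sketch makes concrete is that training touches only `W_out`; everything inside `run_reservoir` stays fixed, which is what makes a physical system — nanomagnets, memristors, or oscillators — a candidate substitute for the network itself.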

The device is not only compact but also consumes very little power, because it performs computations using magnetic fields rather than electricity. The researchers estimate that, scaled up, it could be 100,000 times more energy-efficient than traditional computing. Although there is still a long way to go before such devices can be used in the real world, the researchers believe their vision of magnet-based computing could be a game-changer for embedding AI in all forms of devices.

