What Else If Not Back-Propagation? This AI Research Brings A New Perspective

Lately, there has been a lot of discussion around the paper “The Forward-Forward Algorithm: Some Preliminary Investigations” by Geoffrey Hinton. In it, he discusses the problems with backpropagation and proposes an alternative training method that requires only two forward passes. He calls it “The Forward-Forward Algorithm.”

There is one intuitive problem with backpropagation: there is little to no evidence of anything like it in biological brains. Our brain learns continuously from an incoming stream of data; it does not need to stop to calculate a loss, propagate gradients backward, and then update its memories and experiences. It is natural to ask whether there is a way to train neural networks that is more consistent with how the biological brain functions.

Hinton suggests a mechanism based on two forward passes. Let’s see how it works.


Martin Gorner explains the workings of the forward-forward algorithm wonderfully in this Twitter thread.

Source: https://twitter.com/martin_gorner/status/1599755684941557761
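In brief, each layer is trained locally with two forward passes: a “positive” pass on real data nudges the weights to increase a layer-wise “goodness” measure (in the paper, for example, the sum of the squared activities), and a “negative” pass on bad data nudges them to decrease it. Below is a minimal PyTorch sketch of one such layer under those assumptions; the class name FFLayer, the threshold value, the optimizer, and the learning rate are our own illustrative choices, not the paper’s reference implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FFLayer(nn.Module):
    """One layer trained with the forward-forward procedure (illustrative sketch)."""

    def __init__(self, in_dim, out_dim, threshold=2.0, lr=0.03):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        self.threshold = threshold  # goodness level separating good from bad data
        self.opt = torch.optim.Adam(self.parameters(), lr=lr)

    def forward(self, x):
        # Normalize the input so a layer cannot "cheat" by simply passing on
        # the length of the previous layer's activity vector.
        x = x / (x.norm(dim=1, keepdim=True) + 1e-8)
        return torch.relu(self.linear(x))

    def train_step(self, x_pos, x_neg):
        # Two forward passes: one on positive (real) data, one on negative (bad) data.
        g_pos = self.forward(x_pos).pow(2).sum(dim=1)  # goodness = sum of squared activities
        g_neg = self.forward(x_neg).pow(2).sum(dim=1)
        # Logistic loss pushing goodness above the threshold for positive data
        # and below it for negative data.
        loss = F.softplus(torch.cat([self.threshold - g_pos,
                                     g_neg - self.threshold])).mean()
        self.opt.zero_grad()
        loss.backward()  # the gradient stays local to this layer
        self.opt.step()
        # Detach the outputs so no gradient ever flows between layers.
        with torch.no_grad():
            return self.forward(x_pos), self.forward(x_neg)
```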

The forward-forward algorithm has certain advantages. Unlike backpropagation, training does not require the entire forward path to be differentiable, so the network can contain non-differentiable components (black boxes; see Figure 1); with backpropagation, we would have to resort to reinforcement learning to train such a network. The forward-forward algorithm also does not necessarily need to derive its negative (“bad”) data from the positive (“good”) data; for example, we can feed non-digit images as negative data when learning about digits. The forward-forward algorithm therefore lends itself naturally to self-supervised learning.

Figure 1. Source: https://twitter.com/martin_gorner/status/1599755684941557761
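To make the black-box point concrete, here is a hedged continuation of the sketch above: because each layer optimizes only its own local objective, a non-differentiable stage can sit between layers without anything needing to backpropagate through it. The quantize function below is a toy stand-in for such a component, and the layer sizes and random stand-in batches are illustrative assumptions, not details from the paper.

```python
def quantize(x, levels=16):
    # Non-differentiable stage: rounding has zero gradient almost everywhere,
    # so backpropagation could not train through it; forward-forward simply
    # passes activities across it.
    return torch.round(x * levels) / levels

layer1 = FFLayer(784, 500)   # MNIST-like input size, assumed for illustration
layer2 = FFLayer(500, 500)

def train_both(x_pos, x_neg):
    h_pos, h_neg = layer1.train_step(x_pos, x_neg)   # local update for layer 1
    h_pos, h_neg = quantize(h_pos), quantize(h_neg)  # black box between layers
    layer2.train_step(h_pos, h_neg)                  # local update for layer 2

# Random stand-in batches; real positive data could be MNIST digits and
# negative data hybrid digits or non-digit images, as discussed above.
x_pos, x_neg = torch.rand(64, 784), torch.rand(64, 784)
train_both(x_pos, x_neg)
```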

Hinton also points out that forward-forward can be implemented easily on power-efficient analog circuits, which would make it far more power-efficient than backpropagation.

So, will it replace backpropagation in neural networks?

The forward-forward algorithm is slower than backpropagation and does not work as well on several of the toy problems studied in the paper, so it is unlikely to replace backpropagation anytime soon. Moreover, today's highly complex architectures, such as UNet, are trained with backpropagation and learn to identify features at different levels of abstraction. In the forward-forward algorithm, the layers are trained independently, so it is not obvious how such a model would learn hierarchical structure, since no global objective distributes the discriminative information across the network. We may also need to rethink these architectures: current networks are designed with backpropagation in mind, just as residual and skip connections were introduced to help gradients flow. Similar modifications may be needed before models trained with the forward-forward algorithm perform well.

Some have also noted its similarity to contrastive learning. Overall, this algorithm can be very useful in certain use cases, specifically when we need to train a model whose forward pass is not fully differentiable. The algorithm is promising, and if the drawbacks mentioned above are addressed, this new paradigm may transform the way we see deep learning today.




Vineet Kumar is a consulting intern at MarktechPost. He is currently pursuing his BS from the Indian Institute of Technology (IIT), Kanpur. He is a Machine Learning enthusiast passionate about research and the latest advancements in Deep Learning, Computer Vision, and related fields.



