Researchers From China Propose A Novel Attention High-Order Deep Network ‘AHoNet’ For Breast Cancer Histopathological Image Classification

Breast cancer is the most common cancer among women and one of the deadliest. It is therefore essential to detect it early to reduce the risk of death. With the emergence of increasingly powerful machine learning techniques, researchers have focused on detecting breast cancer with computer vision methods such as deep learning and convolutional neural networks (CNNs).

Recent research on the classification of breast cancer histopathological images falls into two main approaches. The first uses CNNs to extract features and then performs classification with traditional classifiers such as SVMs. The second uses an end-to-end CNN that performs both feature extraction and classification.

Recently, researchers from a Chinese university in Dalian proposed a new method called AHoNet for the classification of breast cancer histopathological images. AHoNet is an end-to-end CNN built on two main components: a channel attention module and second-order statistics. The first extracts salient local deep features, while the second builds a more robust global feature representation.

The authors chose ResNet18 as the backbone of the network and added two modules: a channel attention module (ECA-Net) and a second-order network (MPN-COV). Concretely, an ECA module is inserted after every residual block of the ResNet to learn each block's channel attention and redistribute its channel weights. ECA-Net is a local cross-channel interactive attention model that avoids dimensionality reduction, preserving the direct correspondence between channels and their weights.

After enhancing the local features with the attention module, MPN-COV is attached to the output of the backbone's last layer, replacing the original average pooling layer in ResNet18. This module computes second-order statistical features, which are more discriminative than first-order ones, and aims to enhance the informative power of the global features extracted by the network. The authors use covariance pooling to perform this operation and introduce a novel covariance normalization acceleration technique to speed up its implementation on GPUs. In the training phase, simple data-augmentation operations such as rotation, flipping, and cropping were applied to mitigate overfitting.
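To make the two modules concrete, here is a minimal, framework-agnostic NumPy sketch of the ideas behind them, not the authors' implementation: ECA-style channel attention (global average pooling followed by a 1-D convolution over the channel descriptor, with no dimensionality reduction) and matrix-power-normalized covariance pooling (the square root of the channel covariance matrix as a global descriptor). The fixed averaging kernel, kernel size, and epsilon value are illustrative assumptions; in the real network the 1-D convolution is learned and the matrix square root is computed with the authors' accelerated normalization.

```python
import numpy as np

def eca_attention(x, k=3):
    """ECA-style channel attention on a (C, H, W) feature map.
    No dimensionality reduction: a size-k 1-D convolution runs directly
    over the C-dimensional channel descriptor, keeping a one-to-one
    correspondence between channels and their attention weights."""
    c, h, w = x.shape
    y = x.mean(axis=(1, 2))                    # global average pool -> (C,)
    pad = k // 2
    yp = np.pad(y, pad, mode="edge")
    kernel = np.full(k, 1.0 / k)               # illustrative fixed kernel (learned in practice)
    z = np.convolve(yp, kernel, mode="valid")  # local cross-channel interaction -> (C,)
    wts = 1.0 / (1.0 + np.exp(-z))             # sigmoid gate, one weight per channel
    return x * wts[:, None, None]              # channel-wise reweighting

def mpn_cov(x, eps=1e-5):
    """Matrix-power-normalized covariance pooling on a (C, H, W) map.
    Computes the channel covariance (second-order statistics) and its
    matrix square root via eigendecomposition; returns the upper
    triangle as a C*(C+1)/2-dimensional global feature vector."""
    c = x.shape[0]
    m = x.reshape(c, -1)
    m = m - m.mean(axis=1, keepdims=True)      # center the spatial samples
    sigma = m @ m.T / m.shape[1]               # covariance matrix (C, C)
    vals, vecs = np.linalg.eigh(sigma + eps * np.eye(c))
    root = vecs @ np.diag(np.sqrt(np.clip(vals, 0.0, None))) @ vecs.T
    iu = np.triu_indices(c)
    return root[iu]
```

In AHoNet the attention step runs after every residual block, while covariance pooling replaces only the final average pooling layer, so the classifier receives the flattened upper triangle of the normalized covariance instead of a C-dimensional mean vector.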

To evaluate the proposed model, the authors conducted an experimental study on two datasets, BreakHis and BACH, using seven evaluation metrics. A comparison with the state of the art showed that AHoNet is very competitive and outperforms most existing methods. An ablation study was also performed to measure the contribution of each module: it showed that the two modules (ECA-Net and MPN-COV) are complementary and together improve the overall network's performance.

We have seen in this article a novel breast cancer histopathological image classification method that enhances the informative power of local and global features through a channel attention module and a second-order pooling module. An extensive experimental study on two large public datasets demonstrated the effectiveness of this approach. The authors plan to improve their method further by exploring other attention techniques and high-order statistical models.

This article is written as a paper summary by Marktechpost Research Staff based on the research paper 'Breast Cancer Histopathological Image Classification Using Attention High-Order Deep Network'. All credit for this research goes to the researchers on this project. Check out the paper.
