
A similarity measure method fusing deep feature for mammogram retrieval





Publisher
IOS Press
Copyright
Copyright © 2020 IOS Press and the authors. All rights reserved.
ISSN
0895-3996
eISSN
1095-9114
DOI
10.3233/XST-190575

Abstract

BACKGROUND: Breast cancer is one of the most common malignant tumors among women and has a serious impact on women’s lives, and mammography is one of the most important methods for breast examination. When diagnosing breast disease, radiologists sometimes consult previously diagnosed cases as a reference. However, there are many previous cases, and finding the similar ones is a time-consuming task. Medical image retrieval can provide objective reference information for doctors to diagnose disease. Fusing deep features can improve retrieval accuracy and alleviates the “semantic gap” problem caused by using only content features and location features.

METHODS: A similarity measure method fusing deep features for mammogram retrieval is proposed in this paper. First, the images are pre-processed to extract the low-level features, including content features and location features. Before the location features are extracted, registration with the standard image is performed. Then, a Convolutional Neural Network (CNN), a Stacked Auto-encoder (SAE) network, and a Deep Belief Network (DBN) are built to extract the deep features, which are regarded as high-level features. Next, content similarity and deep similarity are calculated separately using the Euclidean distance between the query image and the dataset images. The location similarity is obtained by calculating the ratio of intersection to union of the mass regions. Finally, content similarity, location similarity, and deep similarity are fused into an image fusion similarity, and the specified number of most similar images is returned according to this similarity.

RESULTS: In the experiment, 740 MLO mammograms from women in Northeast China are used. The content similarity, location similarity, and deep similarity are fused with different weight coefficients. When only low-level features are considered, the best results are obtained by fusing 60% content feature similarity with 40% lesion location feature similarity. On this basis, CNN deep similarity, DBN deep similarity, and SAE deep similarity are fused separately. The experiments show that fusing 60% DBN deep feature similarity with 40% low-level feature similarity gives a clear advantage: the precision is 0.745, the recall is 0.850, and the comprehensive evaluation index is 0.794.

CONCLUSIONS: We propose a similarity measure method fusing deep features, content features, and location features. The retrieval results show that the precision and recall of this method have a clear advantage over content-based image retrieval and location-based image retrieval.
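The similarity pipeline the abstract describes (Euclidean-distance-based content and deep similarity, intersection-over-union location similarity, and weighted fusion with the reported 60/40 coefficients) can be sketched as below. The function names, the distance-to-similarity mapping 1/(1 + d), and the pixel-set representation of mass regions are illustrative assumptions, not the paper's exact implementation:

```python
import math

def euclidean_similarity(query_feat, dataset_feat):
    """Similarity derived from the Euclidean distance between two
    feature vectors, mapped into (0, 1] (identical vectors give 1.0).
    The 1/(1 + d) mapping is an assumption; the paper only states
    that Euclidean distance is used."""
    dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(query_feat, dataset_feat)))
    return 1.0 / (1.0 + dist)

def iou_similarity(region_query, region_dataset):
    """Location similarity as the ratio of intersection to union of
    two mass regions, each represented here as a set of pixel
    coordinates (the representation is an assumption)."""
    union = len(region_query | region_dataset)
    if union == 0:
        return 0.0
    return len(region_query & region_dataset) / union

def fusion_similarity(content_sim, location_sim, deep_sim,
                      w_content=0.6, w_location=0.4, w_deep=0.6):
    """Two-stage fusion with the coefficients reported in the abstract:
    low-level = 0.6 * content + 0.4 * location, then
    final = 0.6 * deep + 0.4 * low-level."""
    low_level = w_content * content_sim + w_location * location_sim
    return w_deep * deep_sim + (1.0 - w_deep) * low_level

# Toy usage: feature vectors and regions are made up for illustration.
content = euclidean_similarity([0.2, 0.5, 0.1], [0.3, 0.4, 0.1])
location = iou_similarity({(0, 0), (0, 1), (1, 1)}, {(0, 1), (1, 1), (1, 2)})
deep = euclidean_similarity([0.8, 0.1], [0.7, 0.2])
score = fusion_similarity(content, location, deep)
```

Ranking the dataset images by `score` and returning the top-k would then give the retrieval result described in the METHODS section.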

Journal

Journal of X-Ray Science and Technology, IOS Press

Published: Feb 15, 2020
