Breast mass classification in sonography with transfer learning using a deep convolutional neural network and color conversion

Michal Byra*, Michael Galperin, Haydee Ojeda-Fournier, Linda Olson, Mary O'Boyle, Christopher Comstock, Michael Andre

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

222 Scopus citations

Abstract

Purpose: We propose a deep learning-based approach to breast mass classification in sonography and compare it with the assessments of four experienced radiologists employing the Breast Imaging Reporting and Data System (BI-RADS) 4th edition lexicon and assessment protocol.

Methods: Several transfer learning techniques are employed to develop classifiers based on a set of 882 ultrasound images of breast masses. Additionally, we introduce the concept of a matching layer. The aim of this layer is to rescale pixel intensities of the grayscale ultrasound images and convert those images to red, green, blue (RGB) to more efficiently utilize the discriminative power of the convolutional neural network pretrained on the ImageNet dataset. We show how this conversion can be determined during fine-tuning using back-propagation. Next, we compare the performance of the transfer learning techniques with and without the color conversion. To demonstrate the usefulness of our approach, we additionally evaluate it on two publicly available datasets.

Results: Color conversion increased the area under the receiver operating characteristic (ROC) curve for each transfer learning method. For the better-performing approach utilizing fine-tuning and the matching layer, the area under the curve was 0.936 on a test set of 150 cases. The areas under the curves for the radiologists reading the same set of cases ranged from 0.806 to 0.882. On the two separate datasets, the proposed approach achieved areas under the curve of approximately 0.890.

Conclusions: The concept of the matching layer is generalizable and can be used to improve the overall performance of transfer learning techniques based on deep convolutional neural networks. When fully developed as a clinical tool, the methods proposed in this paper have the potential to help radiologists with breast mass classification in ultrasound.
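
As a rough illustration of the matching-layer idea described in the abstract, the sketch below implements the grayscale-to-RGB conversion as a learnable 1x1 convolution placed in front of an ImageNet-pretrained backbone, so that its weights are updated by back-propagation during fine-tuning together with the rest of the network. This is a minimal sketch under stated assumptions, not the paper's implementation: the ResNet-18 backbone, the class names, the 224x224 input size, and the exact form of the conversion are illustrative choices, and the paper may use a different network and parameterization.

```python
import torch
import torch.nn as nn
from torchvision import models


class MatchingLayer(nn.Module):
    """Learnable grayscale-to-RGB conversion (illustrative sketch).

    A 1x1 convolution rescales the single-channel B-mode image and maps it
    to three channels so an ImageNet-pretrained backbone can be reused.
    Its weights are updated by back-propagation during fine-tuning.
    """

    def __init__(self):
        super().__init__()
        self.to_rgb = nn.Conv2d(in_channels=1, out_channels=3, kernel_size=1, bias=True)

    def forward(self, x):            # x: (N, 1, H, W) grayscale ultrasound image
        return self.to_rgb(x)        # (N, 3, H, W) pseudo-RGB tensor


class BreastMassClassifier(nn.Module):
    """Matching layer followed by a pretrained CNN; backbone choice is an assumption."""

    def __init__(self):
        super().__init__()
        self.matching = MatchingLayer()
        self.backbone = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
        # Replace the ImageNet classifier head with a single benign-vs-malignant logit.
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, 1)

    def forward(self, x):
        return self.backbone(self.matching(x))


if __name__ == "__main__":
    model = BreastMassClassifier()
    dummy = torch.randn(2, 1, 224, 224)   # two fake 224x224 grayscale images
    print(model(dummy).shape)              # torch.Size([2, 1])
```

During fine-tuning, the matching layer and the backbone would be optimized jointly (e.g., with a binary cross-entropy loss on the logit), which is how the color conversion can be "determined during fine-tuning using back-propagation" as the abstract describes.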

Original language: English
Pages (from-to): 746-755
Number of pages: 10
Journal: Medical Physics
Volume: 46
Issue number: 2
DOIs
State: Published - Feb 2019
Externally published: Yes

Funding

Funders
Almen Laboratories, Inc.
National Institutes of Health
National Cancer Institute
Gustavus and Louise Pfeiffer Research Foundation

Keywords

• BI-RADS
• breast mass classification
• convolutional neural networks
• transfer learning
• ultrasound imaging
