Breast Cancer

Multi-Modality Breast Cancer Diagnosis

Breast Cancer Diagnosis in Ultrasound Imaging

---
Simple Summary: Predictive and diagnostic systems for cancer are of great interest
to physicians and the oncology community. Computer-aided decision (CAD) systems
are vital for breast cancer diagnosis, supporting earlier, more accurate, and more
reliable diagnoses. To achieve these aims, diverse imaging modalities have been
used, and decision-making has been facilitated by artificial intelligence and
machine learning models. A trained model can deliver high-fidelity automated
detection of breast lesions along with their corresponding radiomic feature
biomarkers. In this study, the potential impact of a machine learning model for
detecting breast lesions and extracting various radiomic biomarkers is examined.
The presented model automatically segments lesions and extracts radiomics,
enabling clinical practice to locate breast lesions and perform diagnosis
concurrently.
---
Four examples of ultrasound images: (a) a normal case with no suspicious lesion; (b) a benign nodule; (c,d) malignant tumors. These images illustrate the challenge of discriminating between groups of lesions with different textural complexities.

According to American Cancer Society and World Health Organization (WHO) reports, breast cancer accounts for roughly 30% of newly diagnosed cancers in women. Despite high survival rates and ongoing advances in the imaging systems used for diagnosis and treatment, breast cancer remains among the most fatal cancers in women. X-ray mammography is the gold standard for breast cancer screening and is often used as a follow-up imaging modality. Other imaging modalities, such as magnetic resonance imaging, are more applicable to high-risk mutation carriers and, because of their cost, are not considered for screening. Ultrasound (US) imaging is another common screening modality, but it is highly dependent on the experience and expertise of its operator.

Mammography and ultrasound have inherent limitations: mammography is a projection imaging modality and ultrasound has a small field of view, which makes it difficult to find microcalcifications deep inside breast lesions. This contributes to a high recall rate, approximately 10%, for mammography and for digital breast tomosynthesis (DBT). In addition, tissue superimposition increases the false-positive rate, with benign solid masses, pseudo-lesions, or calcifications diagnosed as malignant tumors. The prevalence of false-positive findings in breast imaging is the loudest criticism of the field. In the U.S., up to 20% of assessed masses are categorized as Breast Imaging Reporting and Data System (BI-RADS) category 3 (probably benign) and recommended for biopsy or short-interval (6-month) follow-up, while only 9%-11% of biopsies prove to be malignant. On the other hand, without biopsy or frequent surveillance, diagnosis would be delayed, with adverse effects on patients' health. Consequently, next-generation breast imaging systems and screening practices should decrease unnecessary biopsies and false-positive call-backs to reduce invasive procedures, radiation dose, cost, and avoidable anxiety in patients. Computer-aided decision (CAD) systems have become an undeniable help to physicians thanks to recent advances in artificial intelligence (AI), particularly embedded advanced machine learning models such as deep neural networks, which have boosted the capabilities of CAD. Important parameters for the diagnosis of breast cancer relate to tumor morphology, which is typically assessed by physicians, and to baseline characteristic features verified by CAD. Imaging throughputs, or radiomics, decode characteristics that are not visible to the naked or untrained eye and can have significant effects on cancer diagnosis and prognosis.

Preliminary results indicate a promising outcome of automated breast lesion segmentation. Some examples of successful (a.i, c.i, b.iii, c.iii), semi-successful (b.i, a.ii, b.ii, c.ii), and unsuccessful (a.iii) segmentations are presented.

Deep learning improves the imaging throughputs available to CAD, known as deep radiomics, either through transfer learning and the extraction of hidden weights from pre-trained models or by training new models dedicated to deep-radiomic extraction. In both scenarios, high-dimensional features help CAD make image content interpretable for non-imaging experts. Similarly, segmentation of breast lesions in ultrasound images is a challenging task that typically requires physicians with trained eyes. CAD has previously increased the accuracy of tumor detection through multiple methodologies, including deep learning, with reasonably high accuracy. Segmentation and classification of tumors in medical imaging remain challenging because of the multiple training stages involved and the limited data available in the medical field. These problems, along with imbalanced training data and the need for higher segmentation accuracy, have been addressed using different model configurations.
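To make the transfer-learning flavour of deep radiomics concrete, here is a minimal sketch that pools features from a pre-trained CNN; the ResNet-18 backbone, input size, and pooling choice are illustrative assumptions, not the configuration used in this work.

```python
# Sketch: extracting "deep radiomics" from a pre-trained CNN (transfer learning).
# The ResNet-18 backbone and global-average pooling are illustrative choices,
# not necessarily the configuration used in the study.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
feature_extractor = torch.nn.Sequential(*list(backbone.children())[:-1])  # drop the classifier head
feature_extractor.eval()

preprocess = T.Compose([
    T.Resize((224, 224)),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def deep_radiomics(image_path: str) -> torch.Tensor:
    """Return a 512-dimensional deep-radiomic vector for one ultrasound image."""
    img = Image.open(image_path).convert("RGB")  # grayscale US frame replicated to 3 channels
    with torch.no_grad():
        feats = feature_extractor(preprocess(img).unsqueeze(0))
    return feats.flatten()  # shape: (512,)
```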

Hyperparameter tuning for the random forest using leave-one-out cross-validation: blue curves represent deep radiomics and red curves represent conventional radiomics fed to the model.
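The tuning shown in the figure above can be reproduced, in spirit, with scikit-learn; the sketch below assumes placeholder feature matrices (X_deep, X_conv, y) and an illustrative hyperparameter grid rather than the study's exact search space.

```python
# Sketch: random-forest hyperparameter tuning with leave-one-out cross-validation,
# as in the figure above. The grid and feature matrices (X_deep, X_conv, y) are
# placeholders, not the study's exact search space.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, LeaveOneOut

param_grid = {
    "n_estimators": [50, 100, 200],
    "max_depth": [3, 5, 10, None],
    "max_features": ["sqrt", "log2"],
}

def tune_random_forest(X: np.ndarray, y: np.ndarray) -> GridSearchCV:
    """Tune a random forest with LOO-CV and return the fitted search object."""
    search = GridSearchCV(
        RandomForestClassifier(random_state=0),
        param_grid,
        cv=LeaveOneOut(),
        scoring="accuracy",
        n_jobs=-1,
    )
    return search.fit(X, y)

# e.g. tuned_deep = tune_random_forest(X_deep, y)   # deep radiomics (blue curves)
#      tuned_conv = tune_random_forest(X_conv, y)   # conventional radiomics (red curves)
```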

We tackle this challenge by designing a deep convolutional neural network that segments breast lesions and simultaneously extracts deep radiomics used to classify the lesion type. The segmentation and radiomic-extraction tasks are embedded in a single deep neural network, which reduces the amount of data and the training time required to train the model.
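A minimal sketch of such a single network with two outputs, a lesion mask and a low-dimensional radiomic vector, is given below; the layer widths and input size are illustrative assumptions, and only the 4-feature bottleneck follows the summary that comes next.

```python
# Sketch: one encoder-decoder that returns a lesion mask and a low-dimensional
# deep-radiomic vector from the same forward pass. Layer widths are illustrative;
# the 4-feature bottleneck mirrors the "4 deep radiomics" mentioned in the summary.
import torch
import torch.nn as nn

class SegRadiomicsNet(nn.Module):
    def __init__(self, latent_dim: int = 4):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.bottleneck = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, latent_dim)
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 2, stride=2), nn.ReLU(),
            nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 2, stride=2), nn.Sigmoid(),  # lesion probability map
        )

    def forward(self, x):
        z = self.encoder(x)
        mask = self.decoder(z)          # segmentation output
        radiomics = self.bottleneck(z)  # deep-radiomic vector
        return mask, radiomics

# model = SegRadiomicsNet()
# mask, feats = model(torch.randn(1, 1, 128, 128))   # mask: (1,1,128,128), feats: (1,4)
```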

---
This study presented an automated system for breast cancer screening in ultrasound
imaging using a mix of deep and conventional radiomic features. A deep learning
model was proposed for segmentation with simultaneous extraction of
low-dimensional imaging biomarkers. The model was trained to segment breast
lesions while extracting radiomic features in the same pass. We performed
dimensionality reduction linked with segmentation using a deep convolutional
autoencoder to extract low-dimensional deep radiomics (4 features). Similarly, a
spectral mapping was used to reduce the high-dimensional conventional radiomics
from 354 to 12 features, as sketched below. We trained our system on 780
ultrasound images and compared the segmentation and malignancy-detection
performance of each type of radiomic feature. For diagnosis, we trained, tuned,
and tested a random forest model for detecting breast cancer cases. The maximal
(full multivariate) cross-validated model combining the radiomic groups reached
an accuracy of 78.5% (65.1%-84.1%). In future work, we will extend the analysis
to an independent dataset and combine it with other data types to address the
system's generalizability and reliability.
---
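As a rough illustration of the spectral-mapping step referred to in the summary above, the sketch below uses scikit-learn's SpectralEmbedding as a stand-in; the affinity settings and the way the deep and conventional features are concatenated are assumptions, not the study's exact pipeline.

```python
# Sketch: reducing 354 conventional radiomics to 12 components with a spectral
# (manifold) mapping, then concatenating them with the 4 deep radiomics before
# classification. SpectralEmbedding and its settings are illustrative stand-ins
# for the spectral mapping described in the summary above.
import numpy as np
from sklearn.manifold import SpectralEmbedding
from sklearn.preprocessing import StandardScaler

def reduce_conventional_radiomics(X_conv: np.ndarray, n_components: int = 12) -> np.ndarray:
    """X_conv: (n_cases, 354) conventional radiomics -> (n_cases, 12) embedding."""
    X_scaled = StandardScaler().fit_transform(X_conv)
    return SpectralEmbedding(
        n_components=n_components, affinity="nearest_neighbors"
    ).fit_transform(X_scaled)

def combine_features(X_deep: np.ndarray, X_conv: np.ndarray) -> np.ndarray:
    """Concatenate 4 deep radiomics with the 12-D spectral embedding (16 features total)."""
    return np.hstack([X_deep, reduce_conventional_radiomics(X_conv)])
```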

Breast Cancer with Infrared Thermography

---
Keywords: breast cancer; thermography; sparse deep convolutional autoencoder;
 matrix factorization; dimensionality reduction; thermomics
---

Cancer is a leading cause of death worldwide and in the United States. Based on American Cancer Society and World Health Organization (WHO) reports, an estimated 608,570 cancer deaths occurred in the United States in 2021, and breast cancer alone accounted for roughly 30% of newly diagnosed cancers in women. Despite better survival rates and developments in imaging modalities for screening and therapy, breast cancer remains one of the most common and most fatal cancers among women. Early detection of breast cancer plays a crucial role in treatment planning and patients' survival. This study proposes a deep learning-based model for dynamic thermography imaging to be used with a clinical breast exam (CBE) before mammography. We hypothesize that deep learning features can track thermal heterogeneity in breast tissue, which may be associated with the angiogenesis and vasodilation caused by cancer metabolism.

Schematic of angiogenesis (blood vessel formation) and vasodilation surrounding tumor cells in the breast, which can be captured through short-acquisition-interval thermal imaging using dynamic thermography.

Since the 1960s, mammography has been the gold standard method for breast cancer screening. Studies have reported variability in screening results with this modality in the face of clinical factors such as age, breast density, type of malignancy, and family history. Fibrocystic breasts, hormone replacement therapy, and dense breasts undermine the strength of mammography for accurate diagnosis (and screening). Magnetic resonance imaging (MRI) is often used as an alternative, although it is relatively expensive and has lower specificity than mammography. A CBE is performed by clinicians and helps detect breast cancer with reasonable accuracy, but it is not usually employed alone. The proposed system can be used in combination with CBE as a noninvasive, low-cost tool to increase the detection rate of breast cancer. Infrared imaging captures thermal radiation emitted from tissue within an 8–10 μm bandwidth. Skin emissivity is close to that of a black body (0.98), so thermal fluctuations are transferred in a way that can be observed with an infrared camera. The main source of this thermal radiation is blood circulation, which becomes more heterogeneous in abnormal tissue owing to the vasodilation and angiogenesis of tumor-adjacent tissues. Additionally, metabolic activity in malignant tissue, correlating with nitric oxide and estrogen expression, raises temperatures that can be detected through the skin. In malignant neoplasia, tumor growth depends on angiogenic sprouts; without this vascular support, tumors cannot receive vital nutrients. Angiogenesis also promotes breast cancer progression and metastasis through the cessation of C–C chemokine ligand 2 (CCL2) inhibition. Vascular endothelial growth factor (VEGF) is an important angiogenic factor for growing vascularity and providing nutrients to cancer cells. Several studies have tried to detect such metabolic activity with different imaging tools, such as Raman spectroscopy and infrared imaging. In this study, we focus on the application of infrared imaging as an indicator of cancer's presence.
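For a rough sense of scale (using the total-radiation Stefan-Boltzmann simplification rather than the camera's band-limited response), the snippet below estimates how much a roughly 1 °C local temperature rise changes the power emitted by skin with emissivity 0.98; the 33 °C baseline skin temperature is an assumption for illustration.

```python
# Back-of-the-envelope check (total-radiation simplification, not band-limited):
# how much does a 1 degree C local temperature rise change the emitted thermal power?
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
EMISSIVITY = 0.98  # skin emissivity, close to a black body

def radiant_exitance(temp_c: float) -> float:
    """Emitted power per unit area (W/m^2) for a surface at temp_c degrees Celsius."""
    t_kelvin = temp_c + 273.15
    return EMISSIVITY * SIGMA * t_kelvin ** 4

baseline = radiant_exitance(33.0)  # assumed typical breast-skin temperature
warmer = radiant_exitance(34.0)    # ~1 degree C hotter region over vascularized tissue
print(f"{baseline:.1f} W/m^2 -> {warmer:.1f} W/m^2 (+{100 * (warmer / baseline - 1):.2f}%)")
# roughly 488 -> 495 W/m^2, i.e. ~1.3% more emitted power per degree
```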

Scheme of the presented thermographic system to be used prior to mammography for the initial diagnosis of breast cancer along with clinical breast examination (CBE).

Applying infrared thermography to breast cancer screening involves computational methods that extract thermal heterogeneity. Many thermography methods capture variance and reduce the dimensionality of thermal images. In this study, we propose an infrared imaging system equipped with deep learning-based thermomics to assist clinicians in the early diagnosis of breast cancer in combination with CBE. Figure 2 shows the workflow of the proposed method. We designed a sparse deep convolutional autoencoder (SPAER) that both compresses the dimensionality of the data and embeds low-rank matrix approximations of the thermal matrix. The proposed approach significantly aided the early diagnosis of breast cancer, and the system was robust in preserving thermal patterns under additive Gaussian noise increasing from 3% to 20%. Moreover, this study performed a comparative assessment of state-of-the-art matrix factorization techniques for generating multichannel inputs. In the next section, the proposed methodology is presented, with details of the applied matrix factorization approaches and of the SPAER and random forest models used for the early diagnosis of breast cancer.
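A minimal sketch of a SPAER-style sparse convolutional autoencoder is shown below; the layer widths, the 512 × 512 three-channel input, and the L1 weight are illustrative assumptions, with only the 16-dimensional latent code mirroring the biomarker count reported in the concluding paragraph of this section.

```python
# Sketch of a SPAER-style sparse convolutional autoencoder: a 3-channel low-rank
# thermal approximation goes in, a 16-D latent code ("deep thermomics") comes out,
# and an L1 penalty on the code encourages sparsity. Assumes inputs normalized to
# [0, 1] and sized 3 x 512 x 512 (786,432 values per case); sizes are illustrative.
import torch
import torch.nn as nn

class SparseConvAutoencoder(nn.Module):
    def __init__(self, latent_dim: int = 16):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),
            nn.Linear(32 * 4 * 4, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 32 * 4 * 4), nn.Unflatten(1, (32, 4, 4)),
            nn.Upsample(size=(128, 128)),
            nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),
            nn.ConvTranspose2d(16, 3, 2, stride=2), nn.Sigmoid(),
        )

    def forward(self, x):
        code = self.encoder(x)
        return self.decoder(code), code

def loss_fn(recon, x, code, l1_weight: float = 1e-3):
    """Reconstruction error plus an L1 sparsity penalty on the latent code."""
    return nn.functional.mse_loss(recon, x) + l1_weight * code.abs().mean()
```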

Examples of low-rank thermal matrix approximation for six cases (healthy, healthy with symptoms, and symptomatic) using different matrix factorization techniques. (i,ii) Regions of interest (ROIs) of low-rank approximate representations of healthy cases. (iii) A healthy case with pain and changes in the nipple. (iv) A case in which the patient experienced breast pain and the biopsy indicated a noncancerous lesion. (v,vi) Cases representing cancer patients. Columns a-e show the low-rank thermal matrix representation of the targeted ROI extracted with IPCT, PCT, Sparse PCT, NMF, and Sparse NMF, respectively.
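The low-rank approximation step shown above can be sketched with off-the-shelf decompositions; here scikit-learn's PCA and NMF stand in for the PCT/NMF family (IPCT and the sparse variants are not reproduced exactly), and the three-component rank follows the three predominant bases mentioned below.

```python
# Sketch: low-rank approximation of a dynamic thermography sequence. Each frame is
# flattened into a row of the thermal matrix; PCA / NMF then yield a few basis
# images that serve as the multichannel input for SPAER. Frame sizes are illustrative.
import numpy as np
from sklearn.decomposition import PCA, NMF

def low_rank_bases(frames: np.ndarray, rank: int = 3, method: str = "nmf") -> np.ndarray:
    """frames: (n_frames, height, width) thermal sequence -> (rank, height, width) basis images."""
    n_frames, h, w = frames.shape
    X = frames.reshape(n_frames, h * w)           # thermal matrix: one flattened frame per row
    if method == "nmf":
        X = X - X.min() + 1e-6                    # NMF requires non-negative entries
        model = NMF(n_components=rank, init="nndsvda", max_iter=500)
    else:
        model = PCA(n_components=rank)            # PCT-style decomposition
    model.fit(X)
    return model.components_.reshape(rank, h, w)  # basis images

# bases = low_rank_bases(thermal_sequence, rank=3)   # stack as a 3-channel SPAER input
```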

The proposed study tackled one of the major challenges of using deep learning imaging biomarkers by developing a sparse deep convolutional autoencoder, the SPAER model, which generates low-dimensional deep thermomics. It takes the low-rank multichannel thermal matrix approximation as input. The SPAER model provided significant dimensionality reduction without lessening diagnostic performance, going from 786,432 values to 16 imaging biomarkers. Five state-of-the-art matrix factorization techniques were employed to extract and embed the three initial predominant bases, which removed the need for manual basis selection. A total of 208 breast cancer screening cases with dynamic thermography were used to test the proposed model. The best accuracy was obtained by NMF-incorporated SPAER combined with demographics, preserving thermal heterogeneity to classify symptomatic versus asymptomatic cases (78.2% (74.3–82.5%)). In addition, SPAER showed significant robustness against additive Gaussian noise increasing from 3% to 20%, while the highest SNR in generating the multichannel low-rank thermal matrix approximation was obtained by sparse PCT. Future work will integrate other available clinical factors to enhance the assessment of the thermal characteristics of tissues, which could be better evaluated with a larger study cohort and multimodal analysis.
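A hedged sketch of the noise-robustness check described above follows; the percentage-of-dynamic-range noise model and the SNR definition are assumptions that may differ from those used in the study.

```python
# Sketch of the noise-robustness check: add zero-mean Gaussian noise whose standard
# deviation is a percentage of the signal's dynamic range (3%-20%), then measure SNR.
import numpy as np

def add_percent_gaussian_noise(image: np.ndarray, percent: float, rng=None) -> np.ndarray:
    """Add zero-mean Gaussian noise with sigma = percent% of the image's dynamic range."""
    rng = rng or np.random.default_rng(0)
    sigma = (percent / 100.0) * (image.max() - image.min())
    return image + rng.normal(0.0, sigma, size=image.shape)

def snr_db(clean: np.ndarray, noisy: np.ndarray) -> float:
    """Signal-to-noise ratio in dB between a clean image and its noisy version."""
    noise = noisy - clean
    return 10.0 * np.log10(np.sum(clean ** 2) / np.sum(noise ** 2))

# for p in (3, 5, 10, 20):
#     noisy = add_percent_gaussian_noise(thermal_frame, p)
#     print(p, "% ->", round(snr_db(thermal_frame, noisy), 1), "dB")
```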

---
Yousefi, B., Akbari, H., Hershman, M., Kawakita, S., Fernandes, H. C., Ibarra-Castanedo, C., ... & Maldague, X. P. (2021).
SPAER: Sparse deep convolutional autoencoder model to extract low dimensional imaging biomarkers for early detection of breast cancer using dynamic thermography.
Applied Sciences, 11(7), 3248.
---