Worldwide, cancer remains a major public health burden, in large part because many new cases are identified only after the disease produces signs and/or symptoms. In recent years, new artificial intelligence based technologies have been developed to help physicians improve traditional methods of diagnosing and treating cancer. Convolutional neural networks (CNNs) are one such diagnostic method, using large quantities of data, including medical imaging (mammograms, ultrasounds, MRIs, etc.), to identify diseases such as breast cancer. In this paper, we describe the use of convolutional neural networks and explainable artificial intelligence (XAI) to build a new breast cancer risk prediction methodology.
This study uses the Breast Ultrasound Images (BUSI) dataset, in which images are classified as normal, benign, or malignant. Ultrasonography is an imaging modality that is particularly effective for examining dense breast tissue.
Convolutional neural networks are used for feature extraction in the classification of tumor patterns. Although models built on CNN architectures offer a high degree of predictive performance, they have been criticized for their lack of transparency, and the need for a transparent decision-making process has been emphasized. To address this limitation, explainable AI techniques are incorporated into the proposed framework. By applying Grad-CAM, the decision-making process of the models can be explained, together with the regions of the input image that influence each prediction, thereby improving the transparency of the decision-making process.
The experimental results show that the proposed CNN-XAI model achieves an area under the curve (AUC) of 0.9934 and an accuracy of 97.6%, outperforming other CNN-based approaches. These findings suggest that combining deep learning with an explanation component offers tangible advantages.
Introduction
Breast cancer remains one of the most common and deadly diseases worldwide, highlighting the need for more accurate early detection methods. Traditional diagnostic techniques such as mammography and ultrasound often suffer from false positives and false negatives, variability in interpretation, and reduced effectiveness in dense breast tissue.
To address these challenges, this study proposes an AI-based framework using Convolutional Neural Networks (CNNs) combined with Explainable AI (XAI) techniques. CNNs automatically learn complex patterns from medical images and outperform traditional machine learning models like Support Vector Machines (SVM) and Random Forest (RF), which rely on manually extracted features.
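To make the classification pipeline concrete, the following is a minimal sketch of a CNN for the three BUSI classes in PyTorch. The paper does not specify the exact architecture, so the layer sizes, the 224×224 grayscale input, and the class name `BreastUltrasoundCNN` are illustrative assumptions.

```python
# Minimal sketch of a CNN classifier for the three BUSI classes
# (normal / benign / malignant). The paper does not specify the exact
# architecture; layer sizes here are illustrative assumptions.
import torch
import torch.nn as nn

class BreastUltrasoundCNN(nn.Module):
    def __init__(self, num_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1),   # grayscale input
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),                               # 224 -> 112
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),                               # 112 -> 56
            nn.Conv2d(64, 128, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),                       # global pooling
        )
        self.classifier = nn.Linear(128, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = BreastUltrasoundCNN()
logits = model(torch.randn(4, 1, 224, 224))  # batch of 4 dummy images
print(logits.shape)                          # torch.Size([4, 3])
```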
The system uses the BUSI ultrasound dataset and applies preprocessing, data augmentation, and class balancing techniques to improve performance. A comparative analysis shows that the CNN model achieves the highest accuracy (97.6%), followed by RF and SVM.
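A hedged sketch of how the preprocessing, augmentation, and class-balancing steps might be wired together is shown below; the specific transforms, the `WeightedRandomSampler` oversampling strategy, and the `BUSI` folder layout are assumptions, since the paper names the techniques but not their parameters.

```python
# Sketch of preprocessing, augmentation, and class balancing for BUSI.
# The particular transforms and the oversampling strategy are
# illustrative assumptions; the paper only names the techniques.
import torch
from torch.utils.data import DataLoader, WeightedRandomSampler
from torchvision import datasets, transforms

train_tf = transforms.Compose([
    transforms.Grayscale(num_output_channels=1),
    transforms.Resize((224, 224)),
    transforms.RandomHorizontalFlip(),           # augmentation
    transforms.RandomRotation(10),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.5], std=[0.5]),
])

# Assumed folder layout: BUSI/<class>/<image>.png (hypothetical path)
train_ds = datasets.ImageFolder("BUSI", transform=train_tf)

# Oversample minority classes so each class is drawn equally often.
class_counts = torch.bincount(torch.tensor(train_ds.targets))
sample_weights = (1.0 / class_counts.float())[train_ds.targets]
sampler = WeightedRandomSampler(sample_weights, num_samples=len(train_ds))

train_loader = DataLoader(train_ds, batch_size=32, sampler=sampler)
```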
To overcome the “black-box” nature of deep learning, the framework integrates Grad-CAM, an XAI technique that generates heatmaps highlighting important regions in medical images. This improves transparency, helping clinicians understand and trust AI predictions.
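The Grad-CAM heatmaps can be reproduced with a short hook-based implementation such as the sketch below. The choice of target layer and the bilinear upsampling to input resolution are assumptions consistent with the standard Grad-CAM formulation, not details taken from the paper.

```python
# Minimal Grad-CAM sketch: hooks capture the target conv layer's
# activations and gradients, which are combined into a class heatmap.
# The target layer choice is an assumption; adapt it to the actual model.
import torch
import torch.nn.functional as F

def grad_cam(model, image, target_layer, class_idx=None):
    acts, grads = {}, {}
    h1 = target_layer.register_forward_hook(
        lambda m, i, o: acts.update(a=o))
    h2 = target_layer.register_full_backward_hook(
        lambda m, gi, go: grads.update(g=go[0]))

    logits = model(image)                       # forward pass
    if class_idx is None:
        class_idx = logits.argmax(dim=1).item()
    model.zero_grad()
    logits[0, class_idx].backward()             # backprop the class score

    h1.remove(); h2.remove()
    weights = grads["g"].mean(dim=(2, 3), keepdim=True)   # GAP of grads
    cam = F.relu((weights * acts["a"]).sum(dim=1, keepdim=True))
    cam = F.interpolate(cam, size=image.shape[-2:], mode="bilinear",
                        align_corners=False)
    return (cam / cam.max().clamp(min=1e-8)).squeeze()    # 0..1 heatmap
```

For the sketch model shown earlier, `model.features[6]` (its last convolutional layer) would be a natural target layer, since the deepest convolutional features generally give the most class-discriminative heatmaps.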
Conclusion
The current study emphasizes the use of machine learning to support earlier detection and diagnosis of breast cancer. By applying deep learning techniques together with explanation methods, it establishes that an effective prediction model can be developed that is both highly accurate and interpretable by clinicians. The prediction model developed in this paper integrates image data with data obtained from clinics. Its predictions can assist radiologists in their readings and help reduce subjectivity.
The training strategy and model architecture were explicitly designed to address the primary limitations in the field: interpretability, generalizability, and class imbalance. Explainable-AI techniques, including SHAP and Grad-CAM, allow clinicians to see the evidence behind the models' decisions, fostering trust and supporting clinical validation. The considerable predictive performance documented in this study suggests that deep learning strategies, combined with transparent explanation strategies, can lead to a new generation of diagnostic tools for oncology [21].
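As an illustration of the SHAP component, the sketch below follows the pattern in the shap library's documentation for PyTorch image models. The tensors `train_images` and `test_images`, the choice of `GradientExplainer`, and the background size are hypothetical; note also that newer shap versions may return attributions as a single array rather than a per-class list.

```python
# Hedged sketch of SHAP attributions for the image classifier, following
# the shap documentation pattern for PyTorch models. `train_images` and
# `test_images` are assumed (C, H, W) tensor batches; the paper names
# SHAP but gives no configuration.
import shap

background = train_images[:50]                  # reference batch (assumed)
explainer = shap.GradientExplainer(model, background)
shap_values = explainer.shap_values(test_images[:4])

# shap.image_plot expects channel-last arrays.
shap_numpy = [v.transpose(0, 2, 3, 1) for v in shap_values]
test_numpy = test_images[:4].numpy().transpose(0, 2, 3, 1)
shap.image_plot(shap_numpy, test_numpy)
```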
Early and accurate detection of breast cancer has the potential to improve patient care, enabling more patient-centered screening and treatment plans. The suggested system is an open, scalable, and affordable solution that can be deployed in healthcare systems with both high and low resource availability, promoting equitable access to cutting-edge diagnostic technologies [22].
Although the results are strong, the study has limitations. The findings have not yet been validated across institutions, and the dataset is limited in size and scope. Future research should expand the dataset to include genetic and histopathological information and evaluate the model's performance in real clinical settings. Federated learning and data masking will be used to support extensibility while protecting patient privacy.
Incorporating XAI provides an essential reference for improving the consistency and trustworthiness of deep learning-based diagnostic methods. Traditional deep learning-based approaches typically operate in a “black-box” fashion, whereas XAI methods can yield valuable explanations of how a predicted outcome is reached. In this study, Grad-CAM was used to illustrate the specific areas of ultrasound images (e.g., tumor margins and lesion texture) that were most influential in the decision-making of the diagnostic system. This visual representation allows the healthcare provider to ascertain whether the system's decision-making process is congruent with clinically meaningful characteristics. CNN-based diagnostic systems therefore not only increase trust in the decision-making process by providing interpretable diagnostic results, but also enable more effective collaboration between humans and artificial intelligence, ultimately leading to improved results in medical diagnostics. In conclusion, implementing XAI resolves the tension between performance and interpretability in deep learning-based diagnostic systems, enhancing their accessibility and usability and thereby helping to increase the acceptance and overall use of AI-based analytical solutions for breast cancer detection.
Ultimately, this work adds to the growing body of evidence suggesting that machine intelligence may act as a game changer in the fight against breast cancer.
By addressing the shortcomings regarding accuracy, interpretability, and applicability in real-world contexts, the AI framework presented here is a vital step toward diagnostic systems that are highly accurate, clearly understandable, and ultimately capable of saving lives.
References
[1] R. L. Siegel, T. B. Kratzer, N. S. Wagle, H. Sung, and A. Jemal, “Cancer statistics, 2026,” CA. Cancer J. Clin., vol. 76, no. 1, p. e70043, Jan. 2026, doi: 10.3322/caac.70043.
[2] Y. Chen, X. Shao, K. Shi, A. Rominger, and F. Caobelli, “AI in Breast Cancer Imaging: An Update and Future Trends,” Semin. Nucl. Med., vol. 55, no. 3, pp. 358–370, May 2025, doi: 10.1053/j.semnuclmed.2025.01.008.
[3] S. E. Hickman, G. C. Baxter, and F. J. Gilbert, “Adoption of artificial intelligence in breast imaging: evaluation, ethical constraints and limitations,” Br. J. Cancer, vol. 125, no. 1, pp. 15–22, Jul. 2021, doi: 10.1038/s41416-021-01333-w.
[4] D. Saslow et al., “American Cancer Society Guidelines for Breast Screening with MRI as an Adjunct to Mammography,” CA. Cancer J. Clin., vol. 57, no. 2, pp. 75–89, Mar. 2007, doi: 10.3322/canjclin.57.2.75.
[5] J. G. Elmore et al., “Diagnostic Concordance Among Pathologists Interpreting Breast Biopsy Specimens,” JAMA, vol. 313, no. 11, p. 1122, Mar. 2015, doi: 10.1001/jama.2015.1405.
[6] W. Samek, G. Montavon, S. Lapuschkin, C. J. Anders, and K.-R. Muller, “Explaining Deep Neural Networks and Beyond: A Review of Methods and Applications,” Proc. IEEE, vol. 109, no. 3, pp. 247–278, Mar. 2021, doi: 10.1109/JPROC.2021.3060483.
[7] S. H. Kim et al., “Interpretive Performance and Inter-Observer Agreement on Digital Mammography Test Sets,” Korean J. Radiol., vol. 20, no. 2, p. 218, 2019, doi: 10.3348/kjr.2018.0193.
[8] A. N. Giaquinto et al., “Breast Cancer Statistics, 2022,” CA. Cancer J. Clin., vol. 72, no. 6, pp. 524–541, Nov. 2022, doi: 10.3322/caac.21754.
[9] P. Rajpurkar, E. Chen, O. Banerjee, and E. J. Topol, “AI in health and medicine,” Nat. Med., vol. 28, no. 1, pp. 31–38, Jan. 2022, doi: 10.1038/s41591-021-01614-0.
[10] A. Ferro et al., “Clinical applications of radiomics and deep learning in breast and lung cancer: A narrative literature review on current evidence and future perspectives,” Crit. Rev. Oncol. Hematol., vol. 203, p. 104479, Nov. 2024, doi: 10.1016/j.critrevonc.2024.104479.
[11] A. Carriero, L. Groenhoff, E. Vologina, P. Basile, and M. Albera, “Deep Learning in Breast Cancer Imaging: State of the Art and Recent Advancements in Early 2024,” Diagnostics, vol. 14, no. 8, p. 848, Apr. 2024, doi: 10.3390/diagnostics14080848.
[12] T. Li et al., “Deep learning in multi-modal breast cancer data fusion: a literature review,” Quant. Imaging Med. Surg., vol. 15, no. 11, pp. 11578–11610, Nov. 2025, doi: 10.21037/qims-2024-2903.
[13] B. Acs et al., “Variability in Breast Cancer Biomarker Assessment and the Effect on Oncological Treatment Decisions: A Nationwide 5-Year Population-Based Study,” Cancers, vol. 13, no. 5, p. 1166, Mar. 2021, doi: 10.3390/cancers13051166.
[14] W. Al-Dhabyani, M. Gomaa, H. Khaled, and A. Fahmy, “Dataset of breast ultrasound images,” Data Brief, vol. 28, p. 104863, Feb. 2020, doi: 10.1016/j.dib.2019.104863.
[15] M. Alotaibi et al., “Breast cancer classification based on convolutional neural network and image fusion approaches using ultrasound images,” Heliyon, vol. 9, no. 11, p. e22406, Nov. 2023, doi: 10.1016/j.heliyon.2023.e22406.
[16] B. Abunasser, M. R. AL-Hiealy, I. Zaqout, and S. Abu-Naser, “Convolution Neural Network for Breast Cancer Detection and Classification Using Deep Learning,” Asian Pac. J. Cancer Prev., vol. 24, no. 2, pp. 531–544, Feb. 2023, doi: 10.31557/APJCP.2023.24.2.531.
[17] A. Bilal, A. Imran, T. I. Baig, X. Liu, E. Abouel Nasr, and H. Long, “Breast cancer diagnosis using support vector machine optimized by improved quantum inspired grey wolf optimization,” Sci. Rep., vol. 14, no. 1, p. 10714, May 2024, doi: 10.1038/s41598-024-61322-w.
[18] J. Holm et al., “Associations of Breast Cancer Risk Prediction Tools With Tumor Characteristics and Metastasis,” J. Clin. Oncol., vol. 34, no. 3, pp. 251–258, Jan. 2016, doi: 10.1200/JCO.2015.63.0624.
[19] J. Maurer et al., “Random forest algorithm identifies miRNA signatures for breast cancer detection and classification from patient urine samples,” Ther. Adv. Med. Oncol., vol. 16, p. 17588359241299563, Jan. 2024, doi: 10.1177/17588359241299563.
[20] S. Hussain, Y. Lafarga-Osuna, M. Ali, U. Naseem, M. Ahmed, and J. G. Tamez-Peña, “Deep learning, radiomics and radiogenomics applications in the digital breast tomosynthesis: a systematic review,” BMC Bioinformatics, vol. 24, no. 1, p. 401, Oct. 2023, doi: 10.1186/s12859-023-05515-6.
[21] A. Akgündoğdu and İ. Çelikbaş, “Explainable deep learning framework for brain tumor detection: Integrating LIME, Grad-CAM, and SHAP for enhanced accuracy,” Med. Eng. Phys., vol. 144, no. 1, p. 104405, Oct. 2025, doi: 10.1016/j.medengphy.2025.104405.
[22] A. Yala et al., “Toward robust mammography-based models for breast cancer risk,” Sci. Transl. Med., vol. 13, no. 578, p. eaba4373, Jan. 2021, doi: 10.1126/scitranslmed.aba4373.