This study presents a dual-stage deep learning framework for classifying fetal ultrasound images. The first stage identifies the fetal anatomical plane, while the second classifies the fetal condition as normal, abnormal, or uncertain.
The methodology includes image preprocessing using OpenCV and implementation of both traditional CNNs and Separable CNNs using TensorFlow and Keras. Experimental results demonstrate high classification accuracy and reduced inference time, with Separable CNNs outperforming the traditional models. This approach offers promising support for automated and real-time prenatal diagnostics.
Introduction
This paper presents a deep learning-based dual-stage classification framework for prenatal ultrasound imaging to automate and improve the evaluation of fetal development and anomaly detection. The system uses two convolutional neural network (CNN) architectures—traditional CNN and Separable CNN—to first identify six anatomical planes (Abdomen, Brain, Femur, Thorax, Spine, Profile) and then classify the fetal condition as Normal, Abnormal, or Uncertain.
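To make the two-stage design concrete, the following is a minimal Keras sketch of a depthwise-separable CNN classifier reused for both stages. The layer widths, input size, dropout rate, and optimizer settings are assumptions for illustration, not the paper's exact configuration.

```python
# Minimal sketch of a Separable CNN classifier for the dual-stage framework.
# Input size (224x224 grayscale) and hyperparameters are assumed, not taken
# from the paper.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_separable_cnn(input_shape=(224, 224, 1), num_classes=6):
    """Depthwise-separable convolution blocks followed by a softmax head."""
    inputs = layers.Input(shape=input_shape)
    x = layers.Conv2D(32, 3, padding="same", activation="relu")(inputs)
    for filters in (64, 128, 256):
        x = layers.SeparableConv2D(filters, 3, padding="same", activation="relu")(x)
        x = layers.BatchNormalization()(x)
        x = layers.MaxPooling2D(2)(x)
    x = layers.GlobalAveragePooling2D()(x)
    x = layers.Dropout(0.3)(x)
    outputs = layers.Dense(num_classes, activation="softmax")(x)
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Stage 1: six anatomical planes; stage 2: three condition labels.
plane_model = build_separable_cnn(num_classes=6)
condition_model = build_separable_cnn(num_classes=3)
```

Separable convolutions factor each standard convolution into a depthwise and a pointwise step, which is what reduces parameter count and inference time relative to the traditional CNN baseline.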
The methodology involves preprocessing ultrasound images (grayscale conversion, denoising, contrast enhancement, resizing), training on an annotated dataset with data augmentation, and implementing the models using TensorFlow and Keras.
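A possible OpenCV realization of that preprocessing chain is sketched below. The specific denoiser (non-local means), CLAHE parameters, and 224x224 target size are assumptions; the paper names only the four steps.

```python
# Illustrative preprocessing pipeline: grayscale conversion, denoising,
# contrast enhancement, and resizing. Filter choices and target size are assumed.
import cv2
import numpy as np

def preprocess_ultrasound(path, size=(224, 224)):
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)              # grayscale conversion
    img = cv2.fastNlMeansDenoising(img, h=10)                 # speckle/noise reduction
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    img = clahe.apply(img)                                    # contrast enhancement
    img = cv2.resize(img, size, interpolation=cv2.INTER_AREA) # resize for the CNN
    return np.expand_dims(img.astype("float32") / 255.0, axis=-1)  # HxWx1 in [0, 1]
```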
Experimental results demonstrate that the Separable CNN model outperforms the traditional CNN with higher accuracy (94.3% vs. 92.5% for plane detection; 91.5% vs. 89.7% for condition classification), better F1-scores, and faster inference times, making it well-suited for real-time clinical use. The study also includes an Xception model and an ensemble model, with the ensemble achieving the highest overall accuracy (94.8%).
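The paper does not state how the ensemble combines the three models; one common choice, shown here purely as an assumption, is to average their softmax probabilities and report the argmax with its confidence.

```python
# Sketch of a softmax-averaging ensemble over the CNN, Separable CNN, and
# Xception models. Equal weighting is an assumption, not the paper's rule.
import numpy as np

def ensemble_predict(models, batch):
    """Average per-class probabilities across models; return labels and confidences."""
    probs = np.mean([m.predict(batch, verbose=0) for m in models], axis=0)
    return probs.argmax(axis=1), probs.max(axis=1)
```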
A user-friendly Streamlit web interface allows clinicians to upload ultrasound images and receive diagnostic feedback with confidence scores, enhancing practical usability.
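A minimal sketch of such a Streamlit front end follows. The saved-model file names, the inline preprocessing, and the confidence display are illustrative assumptions.

```python
# Sketch of a Streamlit interface for the dual-stage classifier.
# Model paths and preprocessing details are hypothetical.
import cv2
import numpy as np
import streamlit as st
import tensorflow as tf

PLANES = ["Abdomen", "Brain", "Femur", "Thorax", "Spine", "Profile"]
CONDITIONS = ["Normal", "Abnormal", "Uncertain"]

@st.cache_resource
def load_models():
    # Hypothetical saved-model paths for the two stages.
    return (tf.keras.models.load_model("plane_model.keras"),
            tf.keras.models.load_model("condition_model.keras"))

st.title("Fetal Ultrasound Classification")
uploaded = st.file_uploader("Upload an ultrasound image", type=["png", "jpg", "jpeg"])
if uploaded is not None:
    raw = np.frombuffer(uploaded.read(), dtype=np.uint8)
    img = cv2.imdecode(raw, cv2.IMREAD_GRAYSCALE)
    img = cv2.resize(img, (224, 224)).astype("float32") / 255.0
    batch = img[np.newaxis, ..., np.newaxis]          # shape (1, 224, 224, 1)
    plane_model, condition_model = load_models()
    plane_p = plane_model.predict(batch, verbose=0)[0]
    cond_p = condition_model.predict(batch, verbose=0)[0]
    st.image(img, caption="Preprocessed input", clamp=True)
    st.write(f"Anatomical plane: {PLANES[plane_p.argmax()]} ({plane_p.max():.1%})")
    st.write(f"Fetal condition: {CONDITIONS[cond_p.argmax()]} ({cond_p.max():.1%})")
```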
Conclusion
This paper presents an efficient dual-stage deep learning model for fetal ultrasound classification. The use of Separable CNNs enhances both performance and speed, highlighting the potential of AI-assisted prenatal diagnostics. Future research will focus on integrating the system with electronic health records and extending support for 3D ultrasound and multi-modal data.