Acne vulgaris is a common dermatological condition that affects individuals across all age groups and can lead to physical scarring and psychological distress. Traditional diagnosis relies on manual visual examination, which is subjective and dependent on specialist availability. This paper presents Acnelytix, an AI-powered system for automatic acne type detection and severity classification using lightweight deep learning models. The system employs efficient Convolutional Neural Networks (CNNs) such as MobileNetV2 and EfficientNet-Lite for feature extraction and classification. Facial images collected from publicly available Kaggle datasets are preprocessed, augmented, and used to train the models. The system is implemented as an offline desktop application that uses ONNX Runtime and ML.NET for real-time inference. Experimental evaluation uses standard classification metrics, including accuracy, precision, recall, and F1-score. The proposed system is designed to provide accurate, fast, and scalable diagnostic support, making it suitable for deployment in resource-limited healthcare environments.
Introduction
This paper presents Acnelytix, an AI-powered system designed to detect acne and classify its severity from facial images. It shows how Artificial Intelligence and Deep Learning, particularly lightweight Convolutional Neural Networks (CNNs), can transform dermatological assessment by enabling fast, reliable, and accessible diagnostic support. Traditional acne diagnosis relies on subjective visual inspection by dermatologists and is often limited by specialist availability, especially in rural or resource-constrained areas.
To address these challenges, the project proposes an offline-capable, computationally efficient system using lightweight CNN architectures such as MobileNetV2 and EfficientNet-Lite. The system applies image preprocessing, data augmentation, and transfer learning to accurately identify acne types (comedonal, inflammatory, cystic) and classify severity levels (mild, moderate, severe). It emphasizes real-time performance, low hardware requirements, and practical deployment.
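The training procedure can be made concrete with a short sketch. The following TensorFlow/Keras snippet fine-tunes a small classification head on a frozen MobileNetV2 backbone; the data/train folder layout, image size, and hyperparameters are illustrative assumptions rather than the exact configuration used here, and the same head would sit unchanged on an EfficientNet-Lite backbone.

# Minimal transfer-learning sketch (TensorFlow/Keras). The frozen MobileNetV2
# backbone extracts features; only the small classification head is trained.
# Paths and hyperparameters are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 3  # e.g., comedonal / inflammatory / cystic (or mild / moderate / severe)

base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False  # transfer learning: reuse ImageNet features

model = models.Sequential([
    # MobileNetV2 expects inputs scaled to [-1, 1].
    layers.Rescaling(1.0 / 127.5, offset=-1, input_shape=(224, 224, 3)),
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.2),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Hypothetical folder layout: one sub-directory per class.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/train", image_size=(224, 224), batch_size=32)
model.fit(train_ds, epochs=10)

Freezing the backbone keeps training cheap enough for modest hardware; unfreezing the top backbone layers for a second, low-learning-rate pass is a common refinement.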
The methodology integrates data acquisition from public datasets, model training and evaluation, acne detection and severity classification, and implementation as a standalone desktop application using ML.NET, ONNX Runtime, and OpenCV. The system supports offline operation, role-based access, and patient record management.
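Deployment-side inference can be sketched in the same way. The desktop application performs this step through ML.NET's ONNX integration in .NET; the equivalent call is shown below using ONNX Runtime's Python API with OpenCV preprocessing. The model filename acnelytix.onnx and the 224x224 input are assumptions carried over from the training sketch above.

# Inference sketch with ONNX Runtime and OpenCV. One-time export from the
# Keras SavedModel could use tf2onnx, e.g.:
#   python -m tf2onnx.convert --saved-model saved_model --output acnelytix.onnx
import cv2
import numpy as np
import onnxruntime as ort

def preprocess(path):
    # Resize and convert BGR -> RGB; scaling to [-1, 1] is already baked into
    # the exported model by its Rescaling layer, so raw 0-255 floats go in.
    bgr = cv2.imread(path)
    rgb = cv2.cvtColor(cv2.resize(bgr, (224, 224)), cv2.COLOR_BGR2RGB)
    return rgb.astype(np.float32)[None, ...]  # add batch dimension

sess = ort.InferenceSession("acnelytix.onnx",
                            providers=["CPUExecutionProvider"])
input_name = sess.get_inputs()[0].name
probs = sess.run(None, {input_name: preprocess("face.jpg")})[0]
print("predicted class index:", int(probs.argmax()))

Restricting the session to the CPU execution provider mirrors the paper's goal of real-time inference without dedicated GPU hardware.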
Experimental evaluation is based on standard classification metrics such as accuracy, precision, recall, and F1-score, with results expected to show high accuracy, robust generalization, and fast inference on CPU-only systems. Overall, the work aims to bridge the gap between academic research and real-world healthcare applications by providing a lightweight, scalable, and accessible AI-based acne analysis system.
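For concreteness, these metrics can be computed directly with scikit-learn; the label arrays in the sketch below are placeholders, not results from this work.

# The evaluation metrics named above, computed with scikit-learn.
from sklearn.metrics import (accuracy_score, confusion_matrix, f1_score,
                             precision_score, recall_score)

y_true = [0, 1, 2, 1, 0, 2]  # hypothetical ground-truth severity labels
y_pred = [0, 1, 1, 1, 0, 2]  # hypothetical model predictions

print("accuracy :", accuracy_score(y_true, y_pred))
# Macro averaging weights all classes equally, the safer choice when
# severity classes are imbalanced.
print("precision:", precision_score(y_true, y_pred, average="macro"))
print("recall   :", recall_score(y_true, y_pred, average="macro"))
print("F1-score :", f1_score(y_true, y_pred, average="macro"))
print(confusion_matrix(y_true, y_pred))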
Conclusion
This project presented Acnelytix, an AI-powered system designed for automatic acne type detection and severity classification using deep learning and computer vision techniques. The system addresses the limitations of traditional dermatological assessment, which can be subjective and dependent on specialist availability. By leveraging lightweight Convolutional Neural Network (CNN) architectures such as MobileNetV2 and EfficientNet-Lite, the proposed solution achieves a balance between accuracy and computational efficiency.
The system integrates image preprocessing, feature extraction, model training, and desktop application deployment into a unified framework. It is capable of identifying acne types such as comedonal, inflammatory, and cystic, while also classifying severity levels into mild, moderate, and severe categories. The use of transfer learning and data augmentation enhances the robustness of the model under varying lighting conditions, skin tones, and image qualities.
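An augmentation pipeline in this spirit might look as follows. The snippet uses Keras preprocessing layers, and the specific transforms and ranges are illustrative assumptions chosen to mimic the lighting and pose variation discussed above, not the exact configuration used here.

# Illustrative augmentation pipeline (Keras preprocessing layers).
import tensorflow as tf
from tensorflow.keras import layers

augment = tf.keras.Sequential([
    layers.RandomFlip("horizontal"),   # faces are roughly left-right symmetric
    layers.RandomRotation(0.05),       # small head-pose variation
    layers.RandomZoom(0.1),            # distance-to-camera variation
    layers.RandomBrightness(0.2),      # varying lighting conditions
    layers.RandomContrast(0.2),
])

# Applied to the training split only, e.g.:
# train_ds = train_ds.map(lambda x, y: (augment(x, training=True), y))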
One of the key strengths of the system is its lightweight and offline-capable design, making it suitable for deployment in clinics, hospitals, and remote healthcare centers without requiring high-end hardware or continuous internet access. The user-friendly interface ensures accessibility for both medical professionals and general users.
Although the system shows promising performance, further improvements can be made by expanding the dataset, incorporating more diverse skin conditions, and enhancing explainability features such as heatmaps for lesion visualization. Future enhancements may include cloud integration, mobile application support, and extension to other dermatological diseases.
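One established technique for such heatmaps is Grad-CAM, sketched below as an illustration of this future direction rather than a feature of the current system. The sketch assumes the model's final convolutional layer is reachable by name; for MobileNetV2 that layer is named "Conv_1", and with a nested Sequential model it must be looked up on the backbone sub-model.

# Grad-CAM sketch (illustrative future work, not part of the current system).
import tensorflow as tf

def grad_cam(model, image, conv_layer_name="Conv_1"):
    """Return a [0, 1] heatmap of the regions driving the predicted class."""
    grad_model = tf.keras.Model(
        model.inputs,
        [model.get_layer(conv_layer_name).output, model.output])
    with tf.GradientTape() as tape:
        conv_out, preds = grad_model(image[None, ...])
        class_score = preds[:, int(tf.argmax(preds[0]))]
    grads = tape.gradient(class_score, conv_out)         # d(score)/d(feature maps)
    weights = tf.reduce_mean(grads, axis=(0, 1, 2))      # per-channel importance
    cam = tf.reduce_sum(conv_out[0] * weights, axis=-1)  # weighted sum of maps
    cam = tf.nn.relu(cam)                                # keep positive evidence
    return (cam / (tf.reduce_max(cam) + 1e-8)).numpy()

The resulting heatmap would then be upsampled to the input resolution and blended over the face image to highlight lesion regions.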
In conclusion, the proposed system illustrates how artificial intelligence can be applied in healthcare to provide fast, reliable, and scalable diagnostic support, contributing to improved accessibility and earlier assessment of skin conditions.