Coral reefs are among the most biologically diverse ecosystems on Earth and play a crucial role in maintaining marine ecological balance. However, climate change–induced ocean warming, acidification, and human activities have accelerated coral bleaching and reef degradation. Continuous and accurate monitoring of coral health is therefore essential, yet manual assessment methods are labor-intensive, time-consuming, and prone to subjectivity. This paper presents an IEEE-style research study derived strictly from the project report titled Deep Diving into YOLOv8 CNN Model Driven Coral Health Monitoring for Sustainable Reef Conservation. The proposed work introduces an automated deep learning–based framework using the YOLOv8 convolutional neural network for real-time coral health detection and classification. A dataset consisting of 923 labeled underwater coral images representing healthy, partially bleached, and fully bleached corals is utilized. The system employs image preprocessing and augmentation techniques to handle underwater distortions, followed by transfer learning–based fine-tuning of the YOLOv8 model. The trained model is evaluated using standard performance metrics including precision, recall, mean Average Precision (mAP), and confusion matrix analysis. Experimental results demonstrate that the proposed approach achieves reliable detection accuracy and robust generalization, validating its suitability for scalable and real-time reef monitoring applications. The framework provides a practical and intelligent solution to support sustainable coral reef conservation efforts.
Introduction
Coral reefs are vital ecosystems that support high marine biodiversity and provide coastal protection, fisheries, and tourism benefits, but they are increasingly threatened by climate change, pollution, and human activities. Coral bleaching—caused by the loss of symbiotic algae due to stress—is a major indicator of reef degradation. Traditional coral monitoring relies on manual surveys by experts, which are costly, limited in scale, and prone to human error. Recent advances in artificial intelligence, particularly deep learning, offer scalable and automated alternatives.
This study focuses on using the YOLOv8 deep learning architecture for automated coral health monitoring from underwater images. Building on prior research that demonstrates the superiority of CNN-based methods over traditional image processing, the work leverages transfer learning and real-time object detection to classify coral health into three categories: healthy, partially bleached, and fully bleached.
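The underlying report does not include code, but a minimal sketch of such a transfer-learning setup with the Ultralytics YOLOv8 API might look as follows; the dataset configuration file name and the training hyperparameters are illustrative assumptions rather than values taken from the study.

# Minimal sketch: fine-tuning a pretrained YOLOv8 model on a three-class coral dataset.
# "coral.yaml" and the hyperparameters below are assumptions for illustration only.
from ultralytics import YOLO

# Start from COCO-pretrained weights so generic low-level features transfer to
# underwater imagery; the detection head adapts to the three coral health classes.
model = YOLO("yolov8n.pt")

# coral.yaml (assumed name) would list the train/val image folders and the class
# names: healthy, partially_bleached, fully_bleached.
model.train(data="coral.yaml", epochs=100, imgsz=640, batch=16)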
A dataset of 923 labeled underwater images is preprocessed and augmented to address underwater imaging challenges and improve model generalization. The proposed methodology employs a structured pipeline involving preprocessing, data augmentation, dataset splitting, and fine-tuning of YOLOv8 using a composite loss function. Model performance is evaluated using precision, recall, mean Average Precision (mAP), and confusion matrix analysis.
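As an illustration of the dataset-splitting step, the following sketch partitions the labeled images into train/validation/test folders in the directory layout the Ultralytics trainer expects; the 70/20/10 ratio and folder names are assumptions made for demonstration, not the split reported in the study.

# Illustrative dataset split for YOLO-format data (images plus one .txt label per image).
import random
import shutil
from pathlib import Path

random.seed(0)
images = sorted(Path("dataset/images").glob("*.jpg"))
random.shuffle(images)

n = len(images)
splits = {
    "train": images[: int(0.7 * n)],
    "val": images[int(0.7 * n): int(0.9 * n)],
    "test": images[int(0.9 * n):],
}

for split, files in splits.items():
    for img in files:
        label = Path("dataset/labels") / (img.stem + ".txt")  # YOLO-format annotation
        for src, sub in ((img, "images"), (label, "labels")):
            dst = Path("dataset") / split / sub
            dst.mkdir(parents=True, exist_ok=True)
            shutil.copy(src, dst / src.name)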
Results show steady convergence of training and validation losses, indicating good generalization and minimal overfitting. High precision, recall, and mAP values across all classes demonstrate that YOLOv8 effectively detects and classifies coral health states in complex underwater environments. Overall, the proposed system provides an efficient, accurate, and scalable solution for real-time coral reef health assessment, supporting long-term reef conservation and management.
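A hedged sketch of how such an evaluation could be reproduced with the Ultralytics validation API is shown below; the weight and data paths are assumptions, and the actual numbers depend on the trained model.

# Evaluation sketch: precision, recall, and mAP on the validation split.
from ultralytics import YOLO

model = YOLO("runs/detect/train/weights/best.pt")  # assumed path to fine-tuned weights
metrics = model.val(data="coral.yaml", split="val")

print("mAP@0.5      :", metrics.box.map50)
print("mAP@0.5:0.95 :", metrics.box.map)
print("precision    :", metrics.box.mp)  # mean precision over the three classes
print("recall       :", metrics.box.mr)  # mean recall over the three classes
# Ultralytics also saves a confusion matrix plot in the validation run directory.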
Conclusion
This paper presented an IEEE journal–style study derived strictly from the project report titled Deep Diving into YOLOv8 CNN Model Driven Coral Health Monitoring for Sustainable Reef Conservation. The primary objective of the work was to design and evaluate an automated, deep learning–based framework capable of accurately detecting and classifying coral health conditions from underwater imagery. By leveraging the YOLOv8 convolutional neural network and transfer learning techniques, the proposed system successfully addresses the limitations of traditional manual coral monitoring methods, such as high labor cost, limited scalability, and subjective interpretation. The experimental analysis demonstrates that YOLOv8 is highly effective for coral health monitoring due to its single-stage detection architecture, fast inference speed, and strong feature extraction capability. The integration of image preprocessing and data augmentation techniques significantly improved model robustness against underwater challenges such as low illumination, color distortion, and noise. Performance evaluation using metrics such as precision, recall, mean Average Precision (mAP), and confusion matrix analysis confirms that the model can reliably distinguish between healthy, partially bleached, and fully bleached corals.
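The report does not specify the exact preprocessing operations, but a typical recipe for the underwater challenges named above (low illumination, color distortion, noise) combines contrast-limited adaptive histogram equalization with mild denoising; the following OpenCV sketch shows one such assumed pipeline.

# Assumed underwater preprocessing sketch, not the study's exact pipeline.
import cv2

def preprocess_underwater(path: str):
    img = cv2.imread(path)
    # CLAHE on the lightness channel compensates for low illumination.
    lab = cv2.cvtColor(img, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    img = cv2.cvtColor(cv2.merge((clahe.apply(l), a, b)), cv2.COLOR_LAB2BGR)
    # Mild denoising suppresses sensor noise and backscatter without erasing coral texture.
    return cv2.fastNlMeansDenoisingColored(img, None, 5, 5, 7, 21)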
These results validate the suitability of the proposed framework for real-time and large-scale reef monitoring applications. From a conservation perspective, the proposed system offers a practical and scalable solution that can assist marine biologists, environmental researchers, and conservation agencies in continuous reef health assessment.
The ability to automatically analyze large volumes of underwater images enables early detection of coral bleaching events, thereby supporting timely intervention and informed decision-making. Moreover, the lightweight nature of the YOLOv8 architecture makes the system adaptable for deployment on edge devices, underwater drones, and autonomous monitoring platforms. Expanding the dataset with more diverse reef imagery and deploying the system in real-time underwater environments are also planned. Overall, this study demonstrates that deep learning–driven object detection frameworks like YOLOv8 hold significant potential for advancing intelligent, automated coral reef conservation systems and contributing to the long-term sustainability of marine ecosystems.
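As one possible route to such edge deployment, the trained detector could be exported to a portable runtime format with the Ultralytics export API; the weight path and target format below are assumptions for illustration.

# Export sketch for edge deployment of the fine-tuned detector.
from ultralytics import YOLO

model = YOLO("runs/detect/train/weights/best.pt")  # assumed path to fine-tuned weights
model.export(format="onnx", imgsz=640)  # writes an ONNX model next to the weights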