Understanding how customers feel is one of the most valuable insights a business can gain. Traditionally, companies depend on surveys, ratings, or written feedback to collect reviews, but these methods are often time-consuming, biased, or ignored by users. With the rapid advancement of Artificial Intelligence (AI) and Machine Learning (ML), it is now possible to automatically detect and interpret human emotions using technologies such as facial recognition, video analysis, and Natural Language Processing (NLP). This paper presents a comprehensive survey of recent research efforts aimed at improving customer feedback systems with AI-powered emotion recognition. Many researchers identify emotions such as happiness, sadness, anger, neutrality, and surprise by studying facial expressions in images or video captured from cameras or webcams. Popular techniques include Convolutional Neural Networks (CNNs) for image-based emotion classification and Support Vector Machines (SVMs) for fast categorization.
Keywords: Emotion Recognition, Facial Expression Recognition, Machine Learning, CNN, Viola-Jones Algorithm
Introduction
In modern retail and e-commerce, understanding customer emotions is crucial for enhancing shopping experiences and making informed business decisions. Traditional feedback methods like surveys and ratings often fail to capture subtle emotional cues. To address this, Facial Expression Recognition (FER) combined with Machine Learning (ML) is used to automatically detect and analyze emotions such as happiness, sadness, anger, surprise, and neutrality in real time.
The proposed system works in several steps: facial detection using the Viola-Jones (Haar Cascade) algorithm, feature extraction, and emotion classification via a Convolutional Neural Network (CNN). Captured emotions are linked to product interactions and stored in a database, with results displayed through a Tkinter-based graphical interface for easy visualization. The system supports real-time monitoring, allowing businesses to track emotional responses continuously during shopping. Multimodal data handling, including product details and interaction timing, improves the reliability of insights.
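As a rough illustration of this pipeline, the sketch below pairs OpenCV's Haar cascade (Viola-Jones) face detector with a generic Keras CNN classifier over webcam frames. The model file emotion_cnn.h5, the 48x48 grayscale input size, and the label order are illustrative assumptions rather than details taken from the described system.

```python
# Sketch of the detection-and-classification stage: Haar cascade face
# detection followed by CNN-based emotion classification on each face.
import cv2
import numpy as np
from tensorflow.keras.models import load_model

EMOTIONS = ["anger", "happiness", "neutral", "sadness", "surprise"]  # assumed label order

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
emotion_model = load_model("emotion_cnn.h5")  # hypothetical pre-trained CNN

def classify_frame(frame):
    """Detect faces in a BGR frame and return (box, emotion label) pairs."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    results = []
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5):
        roi = cv2.resize(gray[y:y + h, x:x + w], (48, 48)).astype("float32") / 255.0
        probs = emotion_model.predict(roi.reshape(1, 48, 48, 1), verbose=0)[0]
        results.append(((x, y, w, h), EMOTIONS[int(np.argmax(probs))]))
    return results

# Real-time loop over webcam frames
cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    for (x, y, w, h), label in classify_frame(frame):
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, label, (x, y - 10), cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    cv2.imshow("Emotion analysis", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

In a full system, the per-frame labels produced by such a loop would be linked to the product currently being viewed and written to the database before being surfaced in the Tkinter interface.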
Experimental results show that the system accurately classifies emotions under varying lighting and facial orientations, providing meaningful, unbiased feedback beyond traditional methods. Visual analytics help identify trends in customer satisfaction, aiding decision-making and product strategy.
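The storage and trend-analysis step can be approximated with a small data layer. The sketch below uses SQLite to link detected emotions to product interactions and to summarise the emotion distribution per product; the schema, table and column names, and the SKU identifiers are illustrative assumptions, not the paper's actual design.

```python
# Sketch of emotion logging and per-product aggregation for trend analysis.
import sqlite3
from collections import Counter
from datetime import datetime

conn = sqlite3.connect("emotions.db")
conn.execute("""CREATE TABLE IF NOT EXISTS interactions (
                    product_id TEXT,
                    emotion    TEXT,
                    timestamp  TEXT)""")

def log_emotion(product_id, emotion):
    """Store one detected emotion together with the product being viewed."""
    conn.execute("INSERT INTO interactions VALUES (?, ?, ?)",
                 (product_id, emotion, datetime.now().isoformat()))
    conn.commit()

def emotion_summary(product_id):
    """Return the distribution of emotions recorded for one product."""
    rows = conn.execute("SELECT emotion FROM interactions WHERE product_id = ?",
                        (product_id,)).fetchall()
    return Counter(e for (e,) in rows)

# Example: aggregate feedback for one product and surface the dominant emotion
log_emotion("SKU-1001", "happiness")
log_emotion("SKU-1001", "neutral")
print(emotion_summary("SKU-1001").most_common(1))
```

Summaries of this kind are what the visual analytics layer would plot over time to expose trends in customer satisfaction per product or product category.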
Advantages:
Non-intrusive, real-time emotion analysis
Reduces reliance on surveys and ratings
Accurate ML-based classification with minimal human intervention
Supports database storage and trend visualization
Enhances decision-making and customer experience
Limitations & Future Work:
Performance may be affected by lighting, camera quality, facial occlusion, and head pose variations
Limited set of emotion categories
Future enhancements could include multimodal inputs (voice, text, behavior), larger datasets, and advanced deep learning models to improve robustness and applicability across diverse retail environments.
Conclusion
The paper presented an intelligent emotion analysis system designed to understand customer shopping experiences using machine learning techniques. By analyzing facial expressions in real time, the proposed approach captures natural and unbiased emotional responses without relying on traditional feedback mechanisms. The integration of face detection, deep learning–based emotion classification, data storage, and visual analytics demonstrates the system’s effectiveness in providing meaningful insights into customer behavior. The results indicate that such AI-driven emotion analysis can support better decision-making, improve customer satisfaction, and enhance overall shopping experiences, highlighting its potential for practical deployment in modern retail environments.