Understanding how customers feel is one of the most valuable insights a business can gain. Traditionally, companies depend on surveys, ratings, or written feedback to collect reviews, but these methods are often time-consuming, biased, or ignored by users. With the rapid advancement of Artificial Intelligence (AI) and Machine Learning (ML), it is now possible to automatically detect and interpret human emotions using technologies such as facial recognition, video analysis, and Natural Language Processing (NLP). This paper presents a comprehensive survey of recent research efforts aimed at improving customer feedback systems using AI-powered emotion recognition. Many researchers infer emotions such as happiness, sadness, anger, or curiosity from people's facial expressions, often using images or videos captured by cameras or webcams. Popular techniques include Convolutional Neural Networks (CNNs) for image-based emotion classification, Support Vector Machines (SVMs) for quick categorization, and Recurrent Neural Networks (RNNs) such as GRU and LSTM for analyzing sequential and textual feedback.
1. Introduction
In modern retail environments, understanding customer emotions is critical for enhancing user experience and optimizing services. Traditional feedback methods (surveys, ratings) are limited and subjective. To address this, Facial Expression Recognition (FER) combined with Machine Learning (ML) offers a non-intrusive, real-time alternative to assess customer satisfaction.
2. Core System Overview
The proposed AI-based FER system includes:
Face Detection using the Viola-Jones algorithm
Feature Extraction of facial landmarks
Emotion Classification using Convolutional Neural Networks (CNNs)
Emotions like happiness, sadness, anger, fear, surprise, etc., are automatically recognized to reflect customer sentiment. This system is deployable in malls, supermarkets, service desks, and more, providing insights without requiring verbal feedback.
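To make this pipeline concrete, the following minimal sketch pairs OpenCV's pre-trained Haar cascade (a Viola-Jones style detector) with a hypothetical pre-trained emotion CNN; the model file name, the 48x48 grayscale input size, and the seven-class label order are illustrative assumptions rather than details from any single surveyed system.

```python
# Minimal sketch of the surveyed FER pipeline: Viola-Jones (Haar cascade) face
# detection followed by CNN-based emotion classification. The model file
# "emotion_cnn.h5", the 48x48 grayscale input, and the label order are assumptions.
import cv2
import numpy as np
from tensorflow.keras.models import load_model

EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]  # assumed order

# OpenCV ships a pre-trained Viola-Jones style frontal-face cascade.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
model = load_model("emotion_cnn.h5")  # hypothetical pre-trained emotion CNN

def classify_emotions(image_path):
    """Detect faces in an image and return one predicted emotion per face."""
    img = cv2.imread(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    results = []
    for (x, y, w, h) in faces:
        roi = cv2.resize(gray[y:y + h, x:x + w], (48, 48))        # crop and resize face region
        roi = roi.astype("float32")[np.newaxis, :, :, np.newaxis] / 255.0
        probs = model.predict(roi, verbose=0)[0]                   # CNN emotion probabilities
        results.append((EMOTIONS[int(np.argmax(probs))], float(probs.max())))
    return results

print(classify_emotions("customer.jpg"))
```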
3. Literature Survey Highlights
The review covers more than 20 studies, each exploring different FER techniques. Key contributions include:
A. Face Detection & Emotion Classification Techniques
Viola-Jones, Haar Cascade, and LBPH used for detecting facial regions
CNN, SVM, KNN, MLP, GRU, and LSTM models used for emotion classification
Hybrid approaches (e.g., CNN+SVM, CNN+Softmax) boost accuracy and speed
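As an illustration of the hybrid pattern above, the sketch below uses a small CNN purely as a feature extractor and an SVM as the final emotion classifier; the untrained network and the random placeholder data are assumptions standing in for a trained backbone and a labelled FER dataset.

```python
# Hedged sketch of the hybrid "CNN + SVM" pattern: a CNN supplies feature vectors
# and a Support Vector Machine performs the final emotion classification.
import numpy as np
from tensorflow.keras import layers, models
from sklearn.svm import SVC

NUM_EMOTIONS = 7

# Small CNN backbone; the penultimate dense layer is used as the feature vector.
cnn = models.Sequential([
    layers.Input(shape=(48, 48, 1)),
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.GlobalAveragePooling2D(),
    layers.Dense(64, activation="relu", name="features"),
    layers.Dense(NUM_EMOTIONS, activation="softmax"),
])
feature_extractor = models.Model(cnn.input, cnn.get_layer("features").output)

# Placeholder data: 100 random 48x48 grayscale "faces" with random emotion labels.
faces = np.random.rand(100, 48, 48, 1).astype("float32")
labels = np.random.randint(0, NUM_EMOTIONS, size=100)

features = feature_extractor.predict(faces, verbose=0)   # CNN features
svm = SVC(kernel="rbf").fit(features, labels)            # SVM classifier head
print(svm.predict(features[:5]))
```

One practical appeal of this split is that the SVM head can be refit on new data without retraining the CNN backbone.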
B. Real-Time Emotion Detection
Many systems offer live emotion analysis using cameras in public areas or kiosks (a minimal camera loop is sketched after this list)
Some integrate age and gender recognition for deeper insights
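A minimal live-camera loop in this spirit is sketched below; the classify_face helper is a placeholder for a trained emotion CNN (such as the one in the pipeline sketch above), and frame smoothing, age, and gender models are omitted.

```python
# Rough sketch of a kiosk-style live loop (press "q" to stop).
import cv2

def classify_face(face_roi):
    # Placeholder for a trained emotion CNN (assumption, see the pipeline sketch above).
    return "happy"

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)  # default webcam

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.1, 5):
        label = classify_face(gray[y:y + h, x:x + w])
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, label, (x, y - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    cv2.imshow("live emotion feedback", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```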
C. Feature Engineering & Analysis
Systems analyze facial landmarks, mouth curvature, eye movements, or geometric distances between key points (see the sketch after this list)
“Curious ratio” used to gauge product interest from facial movements
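The sketch below illustrates the geometric-feature idea with plain landmark coordinates; the named landmark points and the specific ratios are hypothetical stand-ins for the hand-crafted features, including curiosity-style ratios, reported across the surveyed papers.

```python
# Illustration of geometric facial features: distances between landmarks
# normalized by face width. Landmark names and ratios are hypothetical.
import numpy as np

def dist(a, b):
    return float(np.linalg.norm(np.asarray(a) - np.asarray(b)))

def geometric_features(lm):
    """lm: dict of named 2-D landmark coordinates from any landmark detector."""
    face_width = dist(lm["jaw_left"], lm["jaw_right"])  # normalizer
    return {
        "mouth_open_ratio": dist(lm["lip_top"], lm["lip_bottom"]) / face_width,
        "smile_width_ratio": dist(lm["mouth_left"], lm["mouth_right"]) / face_width,
        "brow_raise_ratio": dist(lm["brow_center"], lm["eye_center"]) / face_width,
    }

# Toy landmark positions for a single frame (pixel coordinates).
landmarks = {
    "jaw_left": (40, 200), "jaw_right": (220, 200),
    "lip_top": (130, 230), "lip_bottom": (130, 250),
    "mouth_left": (100, 240), "mouth_right": (160, 240),
    "brow_center": (130, 90), "eye_center": (130, 120),
}
print(geometric_features(landmarks))
```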
D. Explainability & Accuracy
High accuracy levels reported:
CNN models: 76–89%
Advanced models (with MLP or hybrid architectures): up to 96%
Techniques like Grad-CAM and SHAP are also used for interpretability in some cases
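As a hedged example of such interpretability, the Grad-CAM sketch below highlights which regions of a face crop drive a CNN's predicted emotion; the tiny untrained network and random input are placeholders for a trained FER model and a real image.

```python
# Grad-CAM sketch: gradient-weighted activation map over the last conv layer.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

# Tiny stand-in CNN; a real study would load its trained FER model instead.
cnn = models.Sequential([
    layers.Input(shape=(48, 48, 1)),
    layers.Conv2D(16, 3, activation="relu"),
    layers.Conv2D(32, 3, activation="relu", name="last_conv"),
    layers.GlobalAveragePooling2D(),
    layers.Dense(7, activation="softmax"),
])

def grad_cam(model, image, conv_layer="last_conv"):
    """Return a heatmap over the last conv feature map for one face image."""
    grad_model = models.Model(model.input,
                              [model.get_layer(conv_layer).output, model.output])
    with tf.GradientTape() as tape:
        conv_out, preds = grad_model(image[np.newaxis, ...])
        idx = int(tf.argmax(preds[0]))           # predicted emotion class
        class_score = preds[:, idx]
    grads = tape.gradient(class_score, conv_out)             # d(score)/d(feature map)
    weights = tf.reduce_mean(grads, axis=(1, 2))             # channel importance
    cam = tf.reduce_sum(conv_out[0] * weights[0], axis=-1)   # weighted channel sum
    cam = tf.nn.relu(cam)
    return (cam / (tf.reduce_max(cam) + 1e-8)).numpy()

heatmap = grad_cam(cnn, np.random.rand(48, 48, 1).astype("float32"))
print(heatmap.shape)  # spatial size of the last conv layer's feature map
```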
E. Multimodal & Future Approaches
Combination of text + facial expression analysis for emotion detection
NLP techniques (TF-IDF, BERT, GPT) used to generate human-like reviews
Systems that work even under low light, background noise, or with head pose variations
Emphasis on multimodal emotion fusion (combining audio, visual, and physiological signals)
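One simple way to realize such fusion is late fusion of per-modality probabilities, sketched below with a TF-IDF text classifier; the toy review corpus, the three-way sentiment labels, and the equal fusion weights are all assumptions for illustration.

```python
# Late-fusion sketch: average TF-IDF text-sentiment probabilities with
# facial-expression probabilities (both label sets assumed to align).
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

LABELS = ["negative", "neutral", "positive"]

# Toy textual feedback standing in for a real annotated review corpus.
texts = ["terrible service", "it was okay", "love this product",
         "very disappointing", "nothing special", "great experience"]
y = [0, 1, 2, 0, 1, 2]

text_model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
text_model.fit(texts, y)

def fuse(review_text, face_probs, w_text=0.5):
    """Late fusion: weighted average of text and facial sentiment probabilities."""
    text_probs = text_model.predict_proba([review_text])[0]
    fused = w_text * text_probs + (1 - w_text) * np.asarray(face_probs)
    return LABELS[int(np.argmax(fused))], fused

# Facial model leaned "positive" (hypothetical probabilities); text is mixed.
print(fuse("it was okay", face_probs=[0.1, 0.2, 0.7]))
```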
4. Applications & Use Cases
Retail: Real-time product evaluation, store layout optimization
Customer Service: Emotion-based service adjustment
Smart Devices: Integration with HRI (Human-Robot Interaction) for emotional responsiveness
Healthcare & Education: Emotion tracking for feedback and engagement
Product Design: Emotional mapping to inform product development
5. Key Challenges Identified
Dataset limitations: Many models struggle with diversity, lighting, and facial occlusion
Emotion complexity: Most systems still focus on basic emotions, missing nuanced or mixed expressions
Scalability: Real-world deployment demands lightweight, cost-effective, and robust systems
6. Conclusion
In today’s experience-driven market, simply offering quality products is no longer enough; understanding how customers truly feel has become just as important.
This paper explored various research efforts that use artificial intelligence and facial expression recognition to decode human emotions in retail settings. From analyzing subtle facial muscle movements to interpreting textual feedback, these systems provide a deeper, more natural understanding of customer satisfaction.
Through our study and literature review, it is evident that real-time emotion recognition powered by Convolutional Neural Networks (CNNs), Viola-Jones face detection, and other machine learning methods offers a powerful alternative to traditional feedback systems. These technologies are capable of capturing honest, unbiased emotional responses without requiring verbal input from the customer. They reduce the reliance on manual surveys and help businesses gain faster, data-driven insights into customer preferences and reactions.
Ultimately, this paper supports the growing shift toward affective computing—where machines don’t just process data, but understand feelings. By integrating such emotion-aware systems into customer experience strategies, businesses can not only improve satisfaction but also build stronger, more empathetic relationships with their users.
References
[1] Shail Kumari Shah, "A Survey of Facial Expression Recognition Methods", IOSR Journal of Engineering (IOSRJEN), 2022.
[2] Chirag Bera, Prathmesh Adhav, Shridhar Amati, "Product Review Based on Facial Expression Detection", ITM Web of Conferences 44, ICACC-2022, 2022.
[3] Rifat Hasan, S M Nahid Hasan, Anika Tasnim Islam, Fauzia Yasmeen, "Customer Review Generation in a Shopping Mall Using Sentiment Analysis and Computer Vision", Journal of Fareast International University, 6 (1), pp. 39-48, 2023.
[4] DNVSLS Indira, L Sumalatha, Babu Rao Markapudi, "Multi Facial Expression Recognition (MFER) for Identifying Customer Satisfaction on Products using Deep CNN and Haar Cascade Classifier", IOP Conference Series: Materials Science and Engineering, 2021.
[5] Kitti Koonsanit, Nobuyuki Nishiuchi, "Classification of User Satisfaction on Products using Facial Recognition and Machine Learning", IEEE Region 10 Conference (TENCON), Osaka, Japan, 2020.
[6] Deepak Gupta, Pabitra Lenka, Harsimran Bedi, Asif Ekbal, Pushpak Bhattacharyya, "Auto Analysis of Customer Feedback using CNN and GRU Network", International Institute of Information Technology Bhubaneshwar, India, 2020.
[7] Abdelalim Saliq, Moulay Smail Bouzakraoui, Abdessamad Youssfi Alaoui, "Appreciation of Customer Satisfaction Through Analysis Facial Expression and Emotions Recognition", IEEE, 2019.
[8] Zolidah Kasiran, Saadiah Yahya, "Facial Expression as an Implicit Customers' Feedback and the Challenges", Advances in Human Computer Interaction, InTech, 2008.
[9] A. S. Sebyakin, A. V. Zolotaryuk, "Tracking Emotional State of a Person with Artificial Intelligence Methods and Its Applications to Customer Services", IEEE, 2022.
[10] Preeti Thakre, Pankaj Agarkar, "Customer Emotions Recognition Using Facial And Textual Review", International Journal of Advance Scientific Research and Engineering Trends, 2020.
[11] Himanshu Sharma, Devang Sharma, Krutarth Bhatt, Bhavya Shah, "Facial Emotion Based Review Accumulation System", IEEE India Council International Conference (INDICON), 2020.
[12] Mariem Slim, Rostom Kachouri, Ahmed Ben Atitallah, "Customer Satisfaction Measuring Based on the Most Significant Facial Emotion", 15th International Multi-Conference on Systems, Signals and Devices, 2019.
[13] Vikrant Chaugule, Abhishek D, Aadheeshwar Vijayakumar, Pravin Bhaskar Ramteke, Shashidhar G. Koolagudi, "Product Review Based on Optimized Facial Expression Detection", ITM Web of Conferences, Volume 44, ICACC-2022, 2022.
[14] Golam Morshed, Hamimah Ujir, Irwandi Hipiny, "Customers' Spontaneous Facial Expression Recognition", Indonesian Journal of Electrical Engineering and Computer Science, 2021.
[15] Maria Grazia Violante, Federica Marcolin, Enrico Vezzetti, Luca Ulrich, Gianluca Billia, Luca Di Grazia, "3D Facial Expression Recognition for Defining Users' Inner Requirements – An Emotional Design Case Study", Applied Sciences, 2019.
[16] Wafa Mellouk, Wahida Handouzi, "Facial Emotion Recognition Using Deep Learning: Review and Insights", The Second International Workshop on the Future of Internet of Everything, Leuven, Belgium, August 9-12, 2020.
[17] Sivakumar Depuru, Anjana Nandam, P. A. Ramesh, M. Saktivel, K. Amla, Sivanantham, "Human Emotion Recognition System Using Deep Learning Technique", Journal of Pharmaceutical Negative Results, 2022.
[18] Amit Pandey, Aman Gupta, Radhey Shyam, "Facial Emotion Detection and Recognition", International Journal of Engineering Applied Sciences and Technology, Vol. 7, 2022.
[19] Zi-Yu Huang, Chia-Chin Chiang, Jian-Hao Chen, Yi-Chian Chen, Hsin-Lung Chung, Yu-Ping Cai, Hsiu-Chuan Hsu, "A Study on Computer Vision for Facial Emotion Recognition", Scientific Reports, 2023.
[20] Paweł Tarnowski, Marcin Kołodziej, Andrzej Majkowski, Remigiusz Rak, "Emotion Recognition Using Facial Expressions", Procedia Computer Science, December 2017.