Authors: Mr. Soham Jagtap, Mr. Aditya Marne, Mr. Adnan Sheikh, Mr. Vivek Potdar, Prof. Parinita Chate
In all forms of learning, students are expected to make the most of the time allotted to them during instruction. When they pay close attention to the material being taught, their performance rises. To provide a framework for future work, we attempt to ascertain the feelings and behaviors of students participating in an online course and to examine their distracted behavior. This study of students' tendency toward distraction can help teachers monitor these students and assist those who wish to improve their performance. Investigating the distracted behavior of students during virtual classes requires knowledge of image processing, machine learning, and deep learning techniques. The COVID-19 pandemic had a drastic effect, forcing the shutdown of traditional classrooms and a shift in teaching strategies to an online format. To create an environment as engaging as a traditional offline classroom, it is crucial to ensure that students are appropriately engaged in online learning sessions. This study proposes using facial emotion to gauge online learners' interest in real time, accomplished by classifying students' moods during the online learning session through their facial expressions.
E-learning has become a revolutionary force in the ever-changing field of education, providing students of all ages and backgrounds with a level of flexibility and accessibility never before possible. Thanks to the widespread availability of online courses, virtual classrooms, and digital resources, students increasingly participate in educational experiences that extend beyond traditional classroom walls. But in this digital age, the emotional health and experiences of students have become even more important than technological skill and content delivery.
A. Current Scenario in Online Education
Due to the pandemic currently affecting several nations, students are advised to attend classes virtually using any available device. When teaching online, teachers need to keep an eye on their students' attentiveness. Observation indicates that teachers fall into several types:
1. Teachers who, in addition to instructing, carefully watch how their students behave and call on them repeatedly to ensure that they are learning and staying alert.
2. Teachers who start a lesson, point out students who are becoming side-tracked, give them a warning, then resume.
3. Instructors who never mind whether their students are distracted and simply keep teaching the lessons.
4. Teachers who pique students' interest by consistently inspiring zeal, provoking curiosity, and letting students take part in the activities.
B. Our Study of Student Distraction
Our study of student distraction allowed us to continue our investigation into facial landmark identification and the computation of change, rotation, and other facial landmark movements in video. Excellent research is being conducted in the areas of emotion detection, facial landmark recognition, driver fatigue, and facial recognition.
We found in our study that there are a number of reasons why online learners become distracted. Looking left or right, turning away from the class, or turning away from the devices can all be considered forms of distraction. When a student looks away from the device and moves their neck and eyes left or right for an extended period of time, it indicates that they are not engaged in the material and are not paying attention.
C. Problem Statement
In online education, understanding how students feel is important. This project aims to use technology to recognize students’ emotions through their facial expressions, making e-learning more responsive and engaging.
The objective is to enhance the quality and effectiveness of e-learning experiences by systematically detecting, understanding, and addressing the emotional states of students, thereby promoting engagement, satisfaction, and improved learning outcomes.
II. LITERATURE SURVEY
III. ANALYSIS OF PROPOSED MODEL
While implementing an online education system with emotion detection can offer various advantages, it also comes with several challenges. Addressing these challenges is crucial to ensure the ethical use of technology and the creation of a supportive learning environment. Here are some challenges associated with this model:
The integration of emotion detection in online education systems has various applications that can enhance the learning experience, improve student engagement, and provide valuable insights to educators. Here are some applications:
IV. SOFTWARE REQUIREMENTS SPECIFICATION
A. Functional Requirements
B. Non-functional Requirements
1. Performance Requirements
a. Real-time Emotion Recognition: The system must handle audio and visual input for emotion recognition in real time with minimal delay.
b. Scalability: The platform must accommodate growth in users and courses without a noticeable drop in performance, and it should support peak loads during registration periods or live events.
c. Response Time: The system should respond quickly to user interactions, such as loading course materials, submitting assignments, or using real-time communication features.
d. Content Delivery: Even for users with weaker internet connections, the platform should provide high-quality video streaming and content delivery without lag or delays.
2. Safety Requirements
a. Data Privacy: To ensure that student emotional data is gathered and stored safely, the system must adhere to data privacy regulations. Personally identifiable information (PII) must be safeguarded.
b. User Authentication: To stop unwanted access to the system, robust user authentication and authorization procedures must be in place.
c. Secure Communication: To avoid data breaches and eavesdropping, all data transported within the platform should be encrypted.
3. Security Requirements
a. Vulnerability Assessment: To find and fix system vulnerabilities, regular penetration tests and security assessments should be carried out.
b. User Data Protection: Put in place defences against typical security risks like SQL injection and cross-site scripting (XSS) to prevent data breaches.
c. Security Incident Response: Create an incident response strategy to deal with security lapses in a timely and efficient manner.
d. Access Control: Make sure users have the right amount of access according to their jobs by implementing access control to stop unauthorized modifications to settings or content.
e. User Account Security: To prevent unwanted access to user accounts, put in place password policies, account lockout procedures, and secure password storage.
f. Monitoring and Logging: To identify and address security incidents, employ thorough logging and ongoing monitoring of system activity.
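The secure password storage named in the requirements above can be illustrated with Python's standard library alone. This is a minimal sketch, not the system's actual implementation: it assumes PBKDF2-HMAC-SHA256 with a per-user random salt and a constant-time comparison on verification; the function names and iteration count are illustrative choices.

```python
import hashlib
import hmac
import os

def hash_password(password: str, *, iterations: int = 600_000) -> tuple[bytes, bytes]:
    """Derive a storable hash from a password using PBKDF2-HMAC-SHA256."""
    salt = os.urandom(16)  # unique per-user salt defeats precomputed rainbow tables
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify_password(password: str, salt: bytes, stored: bytes,
                    *, iterations: int = 600_000) -> bool:
    """Recompute the hash for a login attempt and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, stored)

salt, digest = hash_password("s3cret-passphrase")
print(verify_password("s3cret-passphrase", salt, digest))  # True
print(verify_password("wrong-guess", salt, digest))        # False
```

Only the salt and digest are stored; the plaintext password never is, which addresses the "secure password storage" requirement directly.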
C. System Requirements
1. Software Requirements
a. Emotion Detection Algorithms: Put into practice or make use of already-developed emotion detection algorithms. These algorithms may be based on natural language processing, computer vision, or both.
b. Machine Learning Libraries: To train and implement emotion detection models, use machine learning frameworks such as scikit-learn, PyTorch, or TensorFlow.
c. Data Collection and Storage: Software that ensures data security and privacy while gathering, storing, and managing emotional data from students.
d. Database Management: Database management software, such as MySQL, PostgreSQL, or NoSQL databases, to manage student and emotional data.
e. Real-time Processing: To handle data as it is generated in live sessions, real-time data processing solutions such as Apache Kafka or Apache Flink are utilized.
f. User Interfaces: Create user interfaces that let educators and students see emotional responses and insights.
g. Tools for Privacy and Security: To protect data privacy and security, use encryption, access controls, and compliance tools.
h. Emotion Labelling Tools: Create or utilize tools to label emotional data so that models can be trained and validated.
i. Video and Audio Processing: Applications that handle audio and video data from students' microphones and webcams; these applications may use computer vision libraries such as OpenCV.
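The emotion detection algorithms listed above typically end in a classification layer whose raw scores are turned into probabilities over the emotion classes. The sketch below shows only that final step, in plain Python with an assumed six-class label set and made-up logit values; a real model built with TensorFlow or PyTorch would produce the logits.

```python
import math

EMOTIONS = ["Neutral", "Angry", "Sad", "Happy", "Surprised", "Fear"]

def softmax(logits):
    """Convert raw classifier scores into probabilities that sum to 1."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical raw scores from a classifier's final layer for one face image.
logits = [2.0, 0.1, 0.3, 1.5, 0.2, 0.1]
probs = softmax(logits)
prediction = EMOTIONS[probs.index(max(probs))]
print(prediction)  # Neutral (the class with the highest logit)
```

The resulting probability vector is exactly the per-emotion EP score used later when computing engagement.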
2. Hardware Requirements
a. A well-made megapixel camera module.
b. Cable for Power Supply
c. Class 10 Micro SD Card, 16GB
D. External Interface Requirements
1. User Interfaces
a. Student Interface: Students should be able to access courses, see content, communicate with instructors, and get immediate emotional state feedback through an easy-to-use and intuitive user interface.
b. Instructor Interface: For the purpose of creating, managing, and tracking students' emotional states in real time, instructors ought to have their own interface. This interface ought to offer guidance and intervention resources.
c. Administrator Interface: Tools for controlling the complete platform, including user accounts, content, and system configurations, are necessary for administrators and support personnel to have access to.
d. Parent/Guardian Portal (K–12): A separate portal or interface for K–12 online learning that allows parents and guardians to monitor their children's academic performance, attendance, and emotional health.
2. Hardware Interfaces:
a. Camera and Microphone: To collect audio and video data for emotion recognition, the system should communicate with the user's camera and microphone.
b. Mobile Devices: Verify that the user interfaces work well and are appropriate for a range of mobile devices, such as tablets and smartphones.
3. Software Interfaces
a. Learning Management System (LMS) Integration: To facilitate easy sharing of content and administration of courses, the platform ought to interface with widely used LMS systems.
b. Video Conferencing Integration: Real-time collaboration and online learning via integration with video conferencing platforms such as Zoom or Microsoft Teams.
c. Third-Party EdTech Tools: Facilitate content distribution and assessment through integration with third-party educational technology tools and APIs.
d. Data Analytics and Reporting Tools: Utilize these resources to obtain information and insights regarding emotional trends and student involvement.
4. Communication Interfaces:
a. Real-Time Chat and Messaging: Implement real-time chat and messaging for student-instructor interaction, group discussions, and student support.
b. Discussion Forums: Provide discussion forums for asynchronous communication and collaboration among students and instructors.
c. Email Notifications: Send email notifications to users for important updates, reminders, and system alerts.
d. APIs for Data Exchange: Develop APIs to allow data exchange between the platform and external systems, such as learning analytics or assessment tools.
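A data-exchange API like the one described above would move emotion observations to external analytics tools as structured payloads. The following is a minimal sketch using only the standard library; the field names and the `emotion_record` helper are illustrative, not part of any specified schema.

```python
import json
from datetime import datetime, timezone

def emotion_record(student_id: str, emotion: str, probability: float) -> str:
    """Serialize one emotion observation as a JSON payload for an external API."""
    payload = {
        "student_id": student_id,
        "emotion": emotion,
        "probability": round(probability, 3),
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(payload)

record = emotion_record("stu-042", "Happy", 0.8712)
print(record)
```

Keeping the payload flat and timestamped in UTC makes it straightforward for a learning-analytics consumer to aggregate emotional trends per student or per session.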
V. SOFTWARE QUALITY ATTRIBUTES
VI. METHODOLOGY OF PROPOSED MODEL
A. Automatic Frame Selection
In face recognition for e-learning, automatic frame selection refers to selecting the most appropriate face image from a video to symbolize a user's identity. This guarantees that the right individual is taking online courses or taking tests, and it also helps to increase the accuracy of authentication.
The webcam's video feed serves as the input for the proposed method, which is designed to help learners who study through internet videos.
Frame-based processing is used to extract discriminative features from the video stream. However, not every frame aids in face detection, so frame selection is performed to obtain the optimal frames. The proposed method pulls an image from the video stream every twenty seconds, or after another predetermined interval.
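The sampling step described above, taking one frame per fixed interval, reduces to choosing frame indices from the stream. A minimal sketch, assuming a known frame rate and the paper's twenty-second interval:

```python
def sample_frame_indices(total_frames: int, fps: float, interval_s: float = 20.0):
    """Return the frame indices to pull from a stream, one every `interval_s` seconds."""
    step = max(1, int(round(fps * interval_s)))  # frames between consecutive samples
    return list(range(0, total_frames, step))

# A 2-minute stream at 30 fps, sampled every 20 seconds, yields 6 candidate frames.
indices = sample_frame_indices(total_frames=3600, fps=30, interval_s=20)
print(indices)  # [0, 600, 1200, 1800, 2400, 3000]
```

In a live system these indices would index into frames grabbed from the webcam (e.g. via OpenCV), and each sampled frame would then be checked for a detectable face before being passed to the emotion classifier.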
B. Automatic Emotion Recognition using Deep Learning Models
C. An Engagement detection system
An engagement detection system keeps track of a student's attentiveness in online classes. It is frequently combined with face recognition technology in e-learning. It assesses students' degree of participation by examining their facial expressions and eye movements, which enables teachers to modify their lesson plans and enhance the learning process as a whole. In this instance, facial emotions are classified into six classes, which are further divided into engaged and disengaged classes.
Engagement Score = Σ (WEi × EPi), summed over the six emotion classes,
where WE = Weight of the related Emotion and EP = Emotion Probability (Emotion ∈ {Neutral, Angry, Sad, Happy, Surprised, Fear}). A deep CNN classifier generates both the corresponding emotion weight and the EP (emotion probability) score. Emotion Weight characterizes the emotional state that represents a learner's engagement at that particular moment in time. These emotions are used to compute the engagement percentage.
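The weighted sum of emotion probabilities can be sketched as follows. The per-emotion weights here are assumed values for illustration only; the paper states that the weights come from the trained model, not from a fixed table.

```python
# Illustrative per-emotion engagement weights (assumed values, not from the paper):
# near-neutral and positive states weigh toward "engaged", negative states toward
# "disengaged".
WEIGHTS = {"Neutral": 0.9, "Happy": 0.6, "Surprised": 0.6,
           "Sad": 0.3, "Angry": 0.25, "Fear": 0.3}

def engagement_percentage(emotion_probs: dict) -> float:
    """Weighted sum of emotion probabilities, scaled to a 0-100 engagement score."""
    score = sum(WEIGHTS[e] * p for e, p in emotion_probs.items())
    return 100 * score

probs = {"Neutral": 0.5, "Happy": 0.2, "Surprised": 0.1,
         "Sad": 0.1, "Angry": 0.05, "Fear": 0.05}
print(engagement_percentage(probs))  # close to 68.75
```

A score like this, computed per sampled frame, lets the system label each moment as engaged or disengaged by thresholding, matching the two-way split of the six emotion classes described above.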
VII. SYSTEM ARCHITECTURE
A. User Interface Layer
B. Application Layer
C. Business Logic Layer
VIII. FUTURE SCOPE
Looking ahead, there is a lot of promise for improving E-learning and comprehending students' emotions in the future. As technology develops further, we should anticipate the release of more advanced instruments that are intended to track and react to students' emotional states more accurately, opening the door to a more customized and compassionate learning environment. Furthermore, as online and remote learning continue to expand in popularity, there will likely be an even greater need for emotional support and involvement in virtual classrooms. This changing environment offers exciting opportunities for the creation of cutting-edge teaching strategies that skilfully incorporate emotional intelligence into the classroom, fostering a holistic learning environment where students feel truly appreciated, understood, and inspired in addition to learning new material. This might therefore lead to a major improvement in their general well-being and academic performance, ushering in a new era of education that is both intellectually and emotionally stimulating.
Exposing students' feelings and improving the state of e-learning is a journey that goes beyond technology and has the power to completely change the nature of education. With the advent of digital learning, we now know that although material can be displayed on screens, the real link that connects students to their online classes is emotional involvement. As we embark on this path, it's critical to keep in mind that learning about students' experiences involves more than just data collection. It all comes down to empathy and comprehension. It entails identifying when a pupil is having a difficult period, acknowledging their accomplishments, and fostering an environment that is emotionally supportive. This is a personalized approach for every student, understanding that their emotions are as unique as their personalities; it's not a one-size-fits-all solution. By taking on this task, we're opening the door for an education system that is more understanding, inclusive, and productive. We're giving teachers the tools they need to adapt and satisfy the emotional needs of their pupils. We're making sure that in an increasingly digital world, students are acknowledged, heard, and valued. In summary, releasing students' emotions and enhancing e-learning is not merely a means to an end but rather the beginning of a deeper educational journey in which emotions play a critical role in the process of learning. It's a commitment to developing our students' hearts and souls in addition to their minds. Through this endeavour, we're transforming lives in addition to education.
Copyright © 2024 Mr. Soham Jagtap, Mr. Aditya Marne, Mr. Adnan Sheikh, Mr. Vivek Potdar, Prof. Parinita Chate. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.