Authors: Kamal Naina Soni, Kushagra Agrawal, Navni Pandya, Nupur Agrawal
DOI Link: https://doi.org/10.22214/ijraset.2021.39143
Human expressions play an important role in the extraction of an individual's emotional state. Facial features such as the eyes, cheeks, forehead, and even the curve of a smile help determine a person's current state and mood. A survey confirmed that people use music as a form of expression and often relate a particular piece of music to their emotions. Considering how music affects parts of the human brain and body, our project extracts the user's facial expressions and features to determine their current mood. Once the emotion is detected, a playlist of songs suited to that mood is presented to the user. This can help alleviate or calm the user's mood, and it also delivers songs matched to the mood more quickly, saving the time spent looking up different songs. In parallel, we are developing software that can be used anywhere, providing the functionality of playing music according to the detected emotion.
I. INTRODUCTION
A survey spanning 2019 and 2020 found that 68% of people aged 18 to 34 listen to music every day, and that a significant weekly listening time is 16 hours and 14 minutes. This clearly shows that music acts as an escape for a few moments and provides relaxation. With the advancement of technology, several music players with features such as fast forward, pause, shuffle, and repeat have been developed, but applications that provide facial recognition have not been used on a regular basis. Our project can therefore play an important role in this scenario, as this music player works on the emotions and behavior of the user. It recognizes the user's facial emotion and plays songs according to that emotion. The emotions are recognized using a machine learning method, the EMO algorithm. The webcam captures an image of the user, or the user can instead select an emoji for their expression. The application then extracts the user's facial features from the captured image. The foremost concept of this project is to automatically play songs based on the emotions of the user, providing user-preferred music that matches the detected emotion. According to the emotion, the music is played from predefined directories.
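The capture-classify-play pipeline described above can be sketched in a few lines. The paper does not specify the internals of the EMO algorithm, so the classifier below is a stub, and the directory layout and function names are assumptions for illustration only.

```python
# Sketch of the pipeline: detect emotion -> pick a predefined directory.
# classify_emotion is a placeholder for the paper's EMO algorithm; the
# directory names under "songs/" are assumed, not taken from the paper.
from pathlib import Path

# Predefined song directories, one per detectable emotion (assumed layout).
PLAYLIST_DIRS = {
    "happy": Path("songs/happy"),
    "sad": Path("songs/sad"),
    "angry": Path("songs/angry"),
    "neutral": Path("songs/neutral"),
}

def classify_emotion(face_image):
    """Placeholder for the EMO classifier: facial features -> emotion label."""
    raise NotImplementedError("model-specific; not described in the paper")

def playlist_for(emotion: str) -> Path:
    """Return the predefined directory for a detected emotion,
    falling back to the neutral directory for unknown labels."""
    return PLAYLIST_DIRS.get(emotion.lower(), PLAYLIST_DIRS["neutral"])
```

Keeping the emotion-to-directory mapping in one dictionary makes it easy to add new moods without touching the detection code.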
II. LITERATURE REVIEW
There are several applications that provide facilities and services for generating a music playlist or playing a particular song, but in all of them this work is done manually. Various techniques and approaches have been proposed and developed to classify the human emotional state. The proposed approaches have focused on only some of the basic emotions, using complex techniques such as the Viola–Jones detector. Several research papers giving a brief overview of the idea are listed in the references at the end of this paper.
III. METHODOLOGY
We propose an Emotion-based music player which will play songs according to the emotion of the user. It aims to provide user-preferred music with emotional awareness. It is based on the idea of automating much of the interaction between the music player and its user.
Emotion-Based Music Player is installed on a personal computer where the user can access their customized playlists and play songs based on their emotions.
Emotion Based Music Player is a useful application for music listeners with a PC and an internet connection. The Application is accessible by anyone who creates a profile on the system.
We have also provided our users with the option of using emojis to generate the playlist. The user may not wish to take a snapshot of their mood, or may be unable to for various reasons: the lighting may be extremely high or low, the camera may not be working properly, or a low-resolution camera may fail to capture a picture clear enough for the mood to be detected. In any such case, the user can click the “Use Emoji” button and select the emoji that represents their current mood, or the mood they want their playlist to be generated for.
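The emoji fallback above amounts to preferring the snapshot-derived emotion and mapping a selected emoji to an emotion label otherwise. A minimal sketch, assuming a small emoji set and label names not specified in the paper:

```python
# "Use Emoji" fallback: when no usable snapshot exists (poor lighting,
# broken or low-resolution camera), map the selected emoji directly to
# an emotion label. The emoji set and labels here are assumptions.
EMOJI_TO_EMOTION = {
    "😊": "happy",
    "😢": "sad",
    "😠": "angry",
    "😐": "neutral",
}

def resolve_emotion(snapshot_emotion=None, emoji=None):
    """Prefer the emotion detected from the snapshot; otherwise fall
    back to the user's emoji selection."""
    if snapshot_emotion is not None:
        return snapshot_emotion
    if emoji in EMOJI_TO_EMOTION:
        return EMOJI_TO_EMOTION[emoji]
    raise ValueError("no snapshot and no recognized emoji selected")
```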
This flow chart provides an overview of the application and simply explains the functionality of the application.
IV. RESULT DISCUSSIONS
As every person has unique facial features, it is difficult to detect human emotion or mood with complete accuracy, but with clear facial expressions it can be detected to a reasonable extent. The camera of the device should have a high resolution. Under these conditions the application runs successfully and matches the expected outcome as closely as possible.
For example, for the “angry”, “fear”, “disgust”, and “surprise” moods, devotional, motivational, and patriotic songs are suggested to the user. Hence, the user is also provided with mood improvement.
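The mood-improvement rule above can be sketched as a simple lookup: negative moods are redirected to uplifting genres instead of being matched directly. The function name is an assumption; the mood and genre lists come from the example in the text.

```python
# Mood-improvement rule: for negative moods, suggest uplifting genres
# (devotional, motivational, patriotic) rather than mood-matched songs.
UPLIFT_GENRES = ["devotional", "motivational", "patriotic"]
NEGATIVE_MOODS = {"angry", "fear", "disgust", "surprise"}

def suggested_genres(mood: str) -> list:
    """Return uplifting genres for negative moods; otherwise suggest
    songs matching the mood itself."""
    if mood.lower() in NEGATIVE_MOODS:
        return list(UPLIFT_GENRES)
    return [mood.lower()]
```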
Instructions explained to the user: in this scenario, users were given instructions on how to perform the prediction of the expressed emotion, which produced the following results. In cases where the inner emotion was sad but the facial expression was happy, the prediction failed.
Thus, the accuracy may vary as follows:

USER | EMOTION | EXPRESSION | ACCURACY
1    | Happy   | Happy      | 100%
2    | Sad     | Happy      | 0%
3    | Sad     | Sad        | 100%
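The accuracies in the table above follow from a simple rule: the prediction is driven by the facial expression, so it is correct only when the expression matches the inner emotion. A minimal sketch (the function name is an assumption):

```python
# Expression-driven prediction: correct (100%) only when the facial
# expression matches the user's inner emotion, as in the table above.
def prediction_accuracy(emotion: str, expression: str) -> str:
    return "100%" if emotion.lower() == expression.lower() else "0%"

rows = [("Happy", "Happy"), ("Sad", "Happy"), ("Sad", "Sad")]
print([prediction_accuracy(e, x) for e, x in rows])  # → ['100%', '0%', '100%']
```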
V. EXISTING SYSTEM
S.No. | Name      | Problem Identified                                          | Advantages                                                            | Ref. Link
1.    | Spotify   | 1. No facial extraction 2. Manual browsing through playlist | 1. Easily accessible 2. High availability 3. Abundant song collection |
2.    | Gaana     | 1. No facial extraction 2. Manual browsing through playlist | 1. Easily accessible 2. High availability 3. Abundant song collection |
3.    | JioSaavan | 1. No facial extraction 2. Manual browsing through playlist | 1. Easily accessible 2. High availability 3. Abundant song collection |
4.    | Hungama   | 1. No facial extraction 2. Manual browsing through playlist | 1. Easily accessible 2. High availability 3. Abundant song collection |
VI. ACKNOWLEDGMENT
We, the authors, would like to extend a special vote of thanks to the reviewers of this paper for their valuable suggestions to improve it. This work is supported by Acropolis Institute of Technology and Research, Indore (M.P.).
VII. CONCLUSION
The Emotion-Based Music Player automates the music player and gives a better experience to the end-user. The application meets the basic needs of music listeners without troubling them as existing applications do, and it increases the interaction between the system and the user in many ways. It eases the work of the end-user by capturing an image with the camera, determining their emotion, and suggesting a customized playlist through a more advanced and interactive system. The user is also notified of songs that are never played, to help them free up storage space. Our main aim is to save users' time and to satisfy them. We have designed this application in such a manner that it can run on mobile [Android] as well as desktop [Windows].
[1] Hafeez Kabani, Sharik Khan, Omar Khan, Shabana Tadvi, “Emotion Based Music Player”, International Journal of Engineering Research and General Science, Volume 3, Issue 1, January–February 2015
[2] Shlok Gikla, Husain Zafar, Chintan Soni, Kshitija Waghurdekar, “SMART MUSIC INTEGRATING AND MUSIC MOOD RECOMMENDATION”, 2017 International Conference on Wireless Communications, Signal Processing and Networking (WiSPNET)
[3] Srushti Sawant, Shraddha Patil, Shubhangi Biradar, “EMOTION BASED MUSIC SYSTEM”, International Journal of Innovations & Advancement in Computer Science (IJIACS), ISSN 2347-8616, Volume 7, Issue 3, March 2018
Copyright © 2022 Kamal Naina Soni, Kushagra Agrawal, Navni Pandya, Nupur Agrawal. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Paper Id : IJRASET39143
Publish Date : 2021-11-28
ISSN : 2321-9653
Publisher Name : IJRASET