AI Based Music Player
• Team Members:
Roll No.   Name of Student     Contact No.    Email-ID                     Sign
01         Rajeshri M Bhole    8378053084     [email protected]
03         Aprajeeta Singh     8521813718     [email protected]
23         Kunal Gawai         7218564358     [email protected]
71         Khushibharti Sah    9604379042     [email protected]
• Sponsored/In-house:
• Problem Definition: Human facial expression plays a vital role in determining the current
state and mood of an individual; it helps in extracting and understanding the emotion an individual
feels, based on facial features such as the eyes, cheeks, forehead, or even the curve of a smile.
Music is an art form that soothes and calms the human brain and body. A recommendation system
built on detected emotion can assist the user in deciding which music to listen to, helping them
reduce their stress levels.
• List of modules/Functionalities
⚫ User:
• Registration
• Login
Music plays a primary role in elevating an individual's life, as it is an important medium
of entertainment for music lovers and listeners. In today's world, with increasing advancements
in the field of multimedia and technology, various music players have been developed with features
such as fast forward, reverse, variable playback speed, genre classification, streaming playback with
multicast streams, volume modulation, etc.
These features may satisfy the user's basic requirements, but the user still faces the
task of manually browsing the playlist and choosing songs suited to their current mood and
behaviour.
• Scope of the project
There are several applications that provide facilities and services for music playlist generation
or for playing a particular song, but the process involves entirely manual work. Various techniques
and approaches have been proposed and developed to classify the emotional state of human
behaviour; however, the proposed approaches have focused on only some of the basic emotions and
rely on complex techniques.
1. Login/Sign-up phase: Users have to create a profile in order to store personal data. If the user
already has an account, they can log in to access their customized playlists as well as songs. Once a
user logs in, their profile stays saved in the application until they manually log out. When the user
adds songs, their input (i.e. category and interest level) is recorded by the system (a hedged sign-up
sketch is given after this list).
2. Emotion Capture phase: As soon as the authentication phase is done, the application asks for the
user's permission to access media and photos, and then uses the camera to capture the user's image
(see the capture-to-playlist sketch after this list).
3. Affectiva Affdex API: After the image is captured, the application sends the captured image to the
Affdex SDK. There, the image is processed and the resulting feedback is sent back to the
application.
4. Emo-phase: In this phase, the application receives the image information and recognizes the
emotion based on the defined threshold. This emotion is sent to the database to fetch the
corresponding emotion playlist.
5. Display phase: Here, the songs are organized based on the EMO-algorithm, and the user can
play any song from the displayed list. The user has the option to add, remove, or modify songs, and
can also change the category and interest level of a song at any time in the application. The
application also has a recommendation tab where the system notifies the user of songs that are
rarely played (a sketch of this ordering and recommendation logic follows the list).
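As a hedged illustration of the Login/Sign-up phase, the Python sketch below creates an account
through the public Firebase Authentication REST endpoint. The API key, email, and password shown
are placeholders; the project's real authentication flow (error handling, profile storage, session
management) is not specified in this document.

import requests

# Placeholder values; the project's real Firebase configuration is not shown here.
FIREBASE_API_KEY = "YOUR_FIREBASE_WEB_API_KEY"
SIGNUP_URL = "https://identitytoolkit.googleapis.com/v1/accounts:signUp?key=" + FIREBASE_API_KEY

def sign_up(email, password):
    """Create a Firebase account and return the auth payload (idToken, localId, ...)."""
    response = requests.post(
        SIGNUP_URL,
        json={"email": email, "password": password, "returnSecureToken": True},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    user = sign_up("[email protected]", "a-strong-password")
    print("Created user:", user["localId"])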
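Phases 2-4 can be pictured end to end with the sketch below: it captures one frame with OpenCV,
posts it to an emotion-recognition HTTP endpoint standing in for the Affdex SDK (whose actual
interface is not documented here), applies a simple confidence threshold, and reads the matching
playlist from a Firebase Realtime Database path. The endpoint URLs, response format, database
layout, and threshold value are all assumptions made for illustration.

import cv2
import requests

EMOTION_API_URL = "https://example.com/emotion"              # placeholder for the Affdex-style service
FIREBASE_DB_URL = "https://example-project.firebaseio.com"   # placeholder Realtime Database URL
THRESHOLD = 0.6                                              # assumed confidence threshold

def capture_image(path="capture.jpg"):
    """Grab a single frame from the default camera and save it to disk."""
    cam = cv2.VideoCapture(0)
    ok, frame = cam.read()
    cam.release()
    if not ok:
        raise RuntimeError("Could not read a frame from the camera")
    cv2.imwrite(path, frame)
    return path

def detect_emotion(image_path):
    """Send the image to the emotion service and pick the dominant emotion above the threshold."""
    with open(image_path, "rb") as f:
        resp = requests.post(EMOTION_API_URL, files={"image": f}, timeout=15)
    resp.raise_for_status()
    scores = resp.json()  # assumed response shape: {"happy": 0.8, "sad": 0.1, ...}
    emotion, score = max(scores.items(), key=lambda kv: kv[1])
    return emotion if score >= THRESHOLD else "neutral"

def fetch_playlist(emotion):
    """Read the playlist stored under /playlists/<emotion> in the Realtime Database."""
    resp = requests.get(f"{FIREBASE_DB_URL}/playlists/{emotion}.json", timeout=10)
    resp.raise_for_status()
    return resp.json() or []

if __name__ == "__main__":
    image = capture_image()
    emotion = detect_emotion(image)
    print(emotion, fetch_playlist(emotion))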
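For the Display phase, the ordering and "rarely played" recommendation could look like the
following sketch. The song fields (category, interest level, play count), the sort rule, and the
play-count cutoff are assumptions; the actual EMO-algorithm is not described in this document.

from dataclasses import dataclass

@dataclass
class Song:
    title: str
    category: str        # user-supplied category
    interest_level: int  # user-supplied interest level (higher = more interested)
    play_count: int = 0  # incremented each time the song is played

def organize_playlist(songs):
    """Order songs by interest level first, then by how often they are played."""
    return sorted(songs, key=lambda s: (-s.interest_level, -s.play_count))

def rarely_played(songs, max_plays=2):
    """Songs played at most max_plays times, for the recommendation / clean-up tab."""
    return [s for s in songs if s.play_count <= max_plays]

if __name__ == "__main__":
    library = [
        Song("Track A", "happy", interest_level=5, play_count=12),
        Song("Track B", "happy", interest_level=3, play_count=1),
        Song("Track C", "calm", interest_level=4, play_count=0),
    ]
    print([s.title for s in organize_playlist(library)])  # Track A, Track C, Track B
    print([s.title for s in rarely_played(library)])      # Track B, Track C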
• Literature Survey (List of References only):
• https://ieeexplore.ieee.org/document/8802550
• https://ieeexplore.ieee.org/document/8388671
• https://www.programmableweb.com/api/affectiva-affdex
• https://www.geeksforgeeks.org/emotion-based-music-player-python-project/
• Requirements:
1. Hardware requirements
● Processor: 2 GHz
● RAM: 1 GB
● Storage: 5 GB
● Camera
2. Browser
● Chrome 51 or higher
● Opera 37
3. Database
● Firebase
● NoSQL
4. API: Affectiva (Affdex) Emotion Recognition API
5. IDE: Visual Studio Code
6. Music database
• Outcomes:
The Emotion-Based Music Player is intended to automate and improve the music player
experience for the end user. The application meets the basic needs of music listeners without
troubling them the way existing applications do, and it increases the interaction between the system
and the user in many ways.
It eases the work of the end user by capturing their image with a camera, determining their
emotion, and suggesting a customized playlist through a more advanced and interactive system.
The user will also be notified of songs that are not being played, to help them free up storage space.
The application can be improved by modifying and adding a few functionalities: playing songs
automatically, and optimizing the EMO-algorithm by including additional features that help the
system categorize the user based on other factors such as location, suggesting that the user travel
to that location and playing songs accordingly.