Indian Researcher's AI Model Can 'Feel' Lyrics, Suggest the Perfect Song for Your Mood
Indian researcher Yudhik Agarwal has presented a music emotion recognition (MER) technique that, he claims, may one day be used in therapies for mental health conditions, among other applications.

Music streaming apps already let users pick playlists by mood, and they have become fairly good at recognising that a mellow track is the right choice when you’re stuck in traffic on a Monday evening. However, Indian researcher Yudhik Agarwal says such technology is not especially sophisticated, and that these systems could be significantly improved by analysing song lyrics, which is also considerably harder to do. To that end, Agarwal presented his own music mood classification technique at the European Conference on Information Retrieval, stating that one day such technology may even inform therapies for various mental health conditions.

To be specific, Agarwal worked on Music Emotion Recognition (MER), a subset of Music Information Retrieval (MIR) used by every service that curates tracks into mood-based playlists. However, a post by the International Institute of Information Technology, Hyderabad (IIIT-H) on Agarwal’s work notes that lyrics have often been ignored in MER because natural language processing (NLP) is still in its early days: the contextual interpretation of the written word is something AI and sequential deep learning models have so far struggled to manage.

It is this gap that Agarwal addressed by deploying XLNet, a transformer-based deep learning technique for natural language processing. This allowed Agarwal and his team to plot songs on a four-quadrant graph using the Valence-Arousal model, in which valence measures happiness and arousal measures energy. Each track’s lyrics are scored on both axes, and the quadrant a song falls into indicates the mood its lyrics represent.
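The article does not include implementation details, but the following is a minimal sketch of how such a lyric classifier might look, assuming the HuggingFace transformers library and PyTorch. The model name, quadrant labels and helper function are illustrative assumptions, not Agarwal’s actual code, and the model would need fine-tuning on mood-labelled lyrics before its predictions mean anything.

```python
# A minimal, hypothetical sketch of XLNet-based lyric mood classification.
# Assumes the HuggingFace "transformers" library and PyTorch; the model
# name, quadrant labels and training details are illustrative, not taken
# from Agarwal's research.
import torch
from transformers import XLNetForSequenceClassification, XLNetTokenizer

# The four valence-arousal quadrants (valence = happiness, arousal = energy),
# with conventional mood names assumed for each quadrant.
QUADRANTS = {
    0: "happy (high valence, high arousal)",
    1: "angry (low valence, high arousal)",
    2: "sad (low valence, low arousal)",
    3: "relaxed (high valence, low arousal)",
}

tokenizer = XLNetTokenizer.from_pretrained("xlnet-base-cased")
model = XLNetForSequenceClassification.from_pretrained(
    "xlnet-base-cased", num_labels=len(QUADRANTS)
)  # untrained classification head: fine-tune on mood-labelled lyrics first
model.eval()

def classify_lyrics(lyrics: str) -> str:
    """Predict the valence-arousal quadrant for one song's lyrics."""
    inputs = tokenizer(lyrics, return_tensors="pt",
                       truncation=True, max_length=512)
    with torch.no_grad():
        logits = model(**inputs).logits  # shape: (1, 4)
    return QUADRANTS[int(logits.argmax(dim=-1))]

print(classify_lyrics("I'm walking on sunshine, and don't it feel good"))
```

An equally plausible design would predict valence and arousal as two continuous scores and derive the quadrant from their signs; the article’s description is compatible with either approach.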

Prof. Vinoo Alluri, who mentored Agarwal’s research, said, “This study has important implications in improving applications involved in playlist generation of music based on emotions. For the first time, the transformer approach is being used for lyrics and is giving notable results. In the field of Psychology of Music, this research will additionally help us in understanding the relationship between individual differences like cognitive styles, empathic and personality traits and preferences for certain kinds of emotionally-laden lyrics.”

Going forward, Agarwal is slated to present a follow-up to his research at the upcoming International Conference on Music Perception and Cognition. The follow-up will use the working XLNet lyric classification model to map listeners’ personality traits to the emotions they prefer in lyrics, with the aim of building better music recommendation algorithms, something that may have far-reaching implications in advanced scientific fields such as cognitive therapy.
