FEELS: Framework for Emotion-Enhanced Language Systems

A framework integrating wearable-device data with language models to enhance emotional intelligence in chatbots, with applications in mental health, among others.

Category: Technology

Image Source: Miha Creative, 521250682, www.stock.adobe.com

Background 

Sentiment analysis methods that infer emotions from text are inherently subjective: they rely heavily on input prompts, and prompts vary from person to person. This introduces human bias, since the analysis is directly influenced by an individual's prompt, leading to inconsistent interpretations of emotions. These methods also typically produce only qualitative or discrete emotion classes, which fail to capture the full complexity and intensity of human emotional states. Furthermore, many existing emotion prediction models are black boxes, lacking transparency and exacerbating bias issues, and they often overlook essential dimensions of human interaction, such as the cognitive, emotional, and behavioral. Reaching human-like emotional intelligence requires integrating these additional dimensions into large language models (LLMs).

 

Description 

Northeastern University researchers have developed the Framework for Emotion-Enhanced Language Systems (FEELS), designed to augment the emotional intelligence (EI) of the large language models (LLMs) behind chatbots. FEELS functions by integrating data from low-cost wearable devices that measure psychological, physiological, cognitive, and behavioral states. The framework operates in two stages. In Stage One, it quantifies users' emotional states by analyzing physiological and behavioral responses. In Stage Two, these quantified emotion scores are combined with traditional text inputs to refine the chatbot's responses, producing more empathetic and responsive interactions. Advanced learning techniques, including feature extraction, feature selection, and meta-learning, ensure accurate and efficient emotion recognition, evaluated against established emotional intelligence benchmarks.
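The two-stage flow described above can be sketched in a few lines of Python. This is a minimal illustration only: the class and function names (`WearableReading`, `score_emotions`, `augment_prompt`), the sensor fields, and the linear scoring rules are all assumptions for demonstration, not the FEELS implementation, which uses feature extraction, selection, and meta-learning.

```python
from dataclasses import dataclass

@dataclass
class WearableReading:
    """Illustrative sample from a low-cost wearable (hypothetical fields)."""
    heart_rate_bpm: float
    skin_conductance_us: float  # electrodermal activity, microsiemens
    typing_speed_cps: float     # behavioral signal: characters per second

def score_emotions(reading: WearableReading) -> dict:
    """Stage One (toy stand-in): map physiological and behavioral signals
    to continuous emotion-intensity scores in [0, 1]."""
    # Crude linear normalizations, purely for illustration.
    clamp = lambda x: min(1.0, max(0.0, x))
    return {
        "arousal": round(clamp((reading.heart_rate_bpm - 60) / 60), 2),
        "stress": round(clamp(reading.skin_conductance_us / 20), 2),
        "agitation": round(clamp((reading.typing_speed_cps - 3) / 7), 2),
    }

def augment_prompt(user_text: str, scores: dict) -> str:
    """Stage Two: combine quantified emotion scores with the traditional
    text input so the LLM can condition its response on them."""
    tags = ", ".join(f"{k}={v}" for k, v in scores.items())
    return f"[user emotional state: {tags}]\n{user_text}"

reading = WearableReading(heart_rate_bpm=96, skin_conductance_us=12,
                          typing_speed_cps=8)
scores = score_emotions(reading)
prompt = augment_prompt("I can't get this form to submit.", scores)
```

The key design point the sketch captures is that Stage One outputs continuous intensity scores rather than discrete emotion labels, and Stage Two passes those scores alongside the user's text rather than replacing it.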

 

Benefits:  

 

  • Integrates psychological, physiological, cognitive, and behavioral measurements. 

  • Quantifies the depth and intensity of emotions, addressing limitations of existing sentiment-analysis-based methods. 

  • Employs advanced learning techniques, including feature extraction, selection, and meta-learning, for accurate emotion recognition. 

  • Cost-effective solution for emotional and cognitive assessment using low-cost wearables. 

  • Facilitates the development of more empathetic and responsive chatbots, improving human-LLM interactions. 

 

Applications: 

 

  • Customer service chatbots 

  • Mental healthcare management 

  • Interactive entertainment platforms 

  • Employee well-being assessment 

 

Opportunity  

Seeking licensee/industry partner/funding 

 

Patent Status:  

Provisional filed 

 

Seeking:

  • Development Partner 

  • Commercial Partner 

  • Licensing 

  • Investment


 
