Emotion Classification Using BERT: A Comprehensive Study

Satti Praveena

Abstract

Emotion detection is a significant task in natural language processing, with applications ranging from customer sentiment prediction to mental health analysis. This work examines transformer-based models for multi-label emotion classification, in particular BERT (Bidirectional Encoder Representations from Transformers). The GoEmotions dataset is preprocessed for classification of text into 28 emotion categories, and model performance is measured with accuracy, recall, and F1-score. The BERT model was fine-tuned, a multi-label classification framework was set up, and visualizations were produced to better interpret the results. The proposed method performed well, particularly on emotions such as love, amusement, and admiration. These results advance transformer-based approaches to interpreting emotional text and provide a basis for future research and applications in affective computing.
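The multi-label setup described above differs from ordinary single-label classification: each text may carry several of the 28 emotions, so the classifier's 28 logits are passed through independent sigmoids and thresholded rather than through a softmax. The sketch below illustrates that decision rule and a micro-averaged F1 metric in plain Python; the threshold value of 0.5 and the helper names are illustrative assumptions, not details taken from the paper.

```python
import math

NUM_LABELS = 28      # GoEmotions taxonomy size (27 emotions + neutral)
THRESHOLD = 0.5      # assumed decision threshold; papers often tune this per label

def sigmoid(x):
    """Map one logit to an independent per-label probability."""
    return 1.0 / (1.0 + math.exp(-x))

def predict_labels(logits, threshold=THRESHOLD):
    """Binarize per-label logits: a label is predicted when its probability
    clears the threshold. Several labels (or none) may fire per text."""
    return [1 if sigmoid(z) >= threshold else 0 for z in logits]

def micro_f1(y_true, y_pred):
    """Micro-averaged F1: pool true/false positives and false negatives
    over every (example, label) pair before computing precision/recall."""
    tp = fp = fn = 0
    for t_row, p_row in zip(y_true, y_pred):
        for t, p in zip(t_row, p_row):
            if p == 1 and t == 1:
                tp += 1
            elif p == 1 and t == 0:
                fp += 1
            elif p == 0 and t == 1:
                fn += 1
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return (2 * precision * recall / (precision + recall)
            if precision + recall else 0.0)
```

In a full pipeline these helpers would sit on top of a fine-tuned BERT encoder (e.g. trained with a per-label binary cross-entropy loss); they are shown standalone here only to make the multi-label decision and evaluation logic concrete.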
