Emotion Classification from Covid-19 Pandemic Tweets using RoBERTa


Sanjeet Kumar, Jameel Ahmad

Abstract

This study proposes a hybrid sentiment analysis model that combines RoBERTa (Robustly Optimized BERT Pretraining Approach), a state-of-the-art pre-trained transformer, with a Support Vector Machine (SVM) for improved sentiment prediction. The model's performance is evaluated on several key metrics, including accuracy, precision, recall, and F1 score, and it outperforms SVM, RoBERTa, and BERT-BiLSTM (Bidirectional Encoder Representations from Transformers combined with a Bidirectional Long Short-Term Memory network) baselines. The proposed RoBERTa-SVM model achieves the highest accuracy (0.92), recall (0.88), and F1 score (0.82), demonstrating its robustness and effectiveness in sentiment classification tasks. While RoBERTa alone provides strong precision and recall, it trails the hybrid approach in accuracy and F1 score, and the SVM model performs the weakest, especially in terms of F1 score. These results suggest that combining RoBERTa's language representation capabilities with SVM's classification power significantly improves sentiment analysis. Future work will focus on further optimizing the RoBERTa-SVM model through techniques such as hyperparameter tuning and ensembling, and on testing its performance on more diverse datasets.
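A rough sketch of the hybrid pipeline the abstract describes (not the authors' implementation): RoBERTa acts as a frozen feature extractor, and an SVM is trained on the resulting tweet embeddings. To stay self-contained, the sketch below substitutes random 768-dimensional vectors for the RoBERTa embeddings; in practice each tweet would be encoded with a pre-trained model such as `roberta-base` from the `transformers` library, as indicated in the comments. The RBF kernel and synthetic labels are assumptions for illustration only.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, f1_score

# In the real pipeline, each tweet would be encoded with RoBERTa, e.g.:
#   from transformers import AutoTokenizer, AutoModel
#   tok = AutoTokenizer.from_pretrained("roberta-base")
#   model = AutoModel.from_pretrained("roberta-base")
#   emb = model(**tok(text, return_tensors="pt")).last_hidden_state[:, 0]
# Here, random 768-dimensional vectors stand in for those embeddings.
rng = np.random.default_rng(0)
n_tweets, dim = 200, 768
X = rng.normal(size=(n_tweets, dim))
y = rng.integers(0, 2, size=n_tweets)  # synthetic binary sentiment labels

# Shift positive-class vectors so the classes are separable, mimicking
# the class structure that real RoBERTa embeddings would carry.
X[y == 1] += 0.5

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# SVM classification head on top of the frozen embeddings.
# The paper does not specify the kernel, so RBF is an assumption.
clf = SVC(kernel="rbf", C=1.0)
clf.fit(X_tr, y_tr)
pred = clf.predict(X_te)
print(f"accuracy={accuracy_score(y_te, pred):.2f} "
      f"f1={f1_score(y_te, pred):.2f}")
```

Keeping the transformer frozen and training only the SVM head is what makes this hybrid cheap to fit compared with end-to-end fine-tuning, at the cost of not adapting the representations to the task.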
