Fine-tuning Multilingual Transformers for Hausa-English Sentiment Analysis

Yusuf, A. and Sarlan, A. and Danyaro, K.U. and Rahman, A.S.B.A. (2023) Fine-tuning Multilingual Transformers for Hausa-English Sentiment Analysis. In: UNSPECIFIED.

Full text not available from this repository.
Official URL: https://www.scopus.com/inward/record.uri?eid=2-s2....

Abstract

Accurate sentiment analysis is greatly hindered by code-switching, especially in low-resource language settings such as Hausa. However, the majority of previous studies on Hausa sentiment analysis have largely ignored this problem. This study explores transformer fine-tuning techniques for Hausa sentiment classification using three pre-trained multilingual language models: RoBERTa, XLM-R, and mBERT. A multi-class sentiment classification was conducted in the Python programming language with the TensorFlow library, using a GPU hardware accelerator on Google Colaboratory. The Twitter dataset used in this study contains 16,849 labelled training samples, 2,677 unlabelled development samples, and 5,303 unlabelled test samples of tweets, with each training sample labelled positive, negative, or neutral. The findings demonstrate that the mBERT-base-cased model achieves the highest accuracy and F1-score, 0.73 and 0.73 respectively, outperforming the other two pre-trained models. The training and validation accuracy curves of the mBERT model show improvement over time. The study underscores the importance of tailoring the implementation code to specific requirements and the significance of fine-tuning pre-trained models for optimal performance. © 2023 IEEE.
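
The paper's full text is not available here, so no implementation details beyond the tools named in the abstract are known. As a rough illustration only, the sketch below shows how the mBERT-base-cased checkpoint can be fine-tuned for three-class sentiment classification with TensorFlow and the Hugging Face transformers library. The checkpoint name, label mapping, hyperparameters (learning rate, batch size, epochs, sequence length), the toy code-switched tweets, and the use of macro F1 are all assumptions, not details taken from the paper.

    # A minimal sketch of mBERT fine-tuning for 3-class sentiment analysis
    # (positive / negative / neutral). Hyperparameters, label mapping, and
    # example data are illustrative assumptions; the paper does not give them.
    import numpy as np
    import tensorflow as tf
    from sklearn.metrics import f1_score
    from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

    MODEL_NAME = "bert-base-multilingual-cased"  # mBERT-base-cased checkpoint

    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    model = TFAutoModelForSequenceClassification.from_pretrained(
        MODEL_NAME,
        num_labels=3,  # assumed mapping: 0 = negative, 1 = neutral, 2 = positive
    )

    # Hypothetical code-switched Hausa-English tweets with integer labels.
    texts = [
        "Wannan app din yana da kyau, I love it!",      # positive
        "Ba na son wannan update, it keeps crashing.",  # negative
    ]
    labels = [2, 0]

    # Tokenise and wrap into a tf.data pipeline.
    encodings = tokenizer(texts, truncation=True, padding=True,
                          max_length=128, return_tensors="tf")
    dataset = (tf.data.Dataset
               .from_tensor_slices((dict(encodings), labels))
               .shuffle(len(texts))
               .batch(16))

    # A small learning rate is typical for transformer fine-tuning; the model
    # outputs raw logits, hence from_logits=True.
    model.compile(
        optimizer=tf.keras.optimizers.Adam(learning_rate=2e-5),
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
        metrics=["accuracy"],
    )

    # Fine-tune; on Google Colaboratory this runs on the GPU if one is attached.
    model.fit(dataset, epochs=3)

    # Score predictions with macro F1, one plausible reading of the reported
    # F1-score (the averaging scheme is not stated in the abstract).
    logits = model(dict(encodings), training=False).logits
    preds = np.argmax(logits.numpy(), axis=-1)
    print("macro F1:", f1_score(labels, preds, average="macro"))

In practice the labelled training split would be divided into train and validation portions so that curves like the paper's train/validation accuracy graph can be plotted, with the unlabelled dev and test splits used only for prediction.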

Item Type: Conference or Workshop Item (UNSPECIFIED)
Citation Count: cited by 0
Uncontrolled Keywords: Optimal systems; Statistical tests; Code-switching; Fine tuning; Hausa; Low resource languages; Low-resource; Pre-trained; Sentiment analysis; Sentiment classification; Switching phenomenon; Transformer
Depositing User: Mr Ahmad Suhairi Mohamed Lazim
Date Deposited: 30 Oct 2023 02:04
Last Modified: 30 Oct 2023 02:04
URI: http://scholars.utp.edu.my/id/eprint/37724
