Title: | Towards the effectiveness of attention-free language models for e- commerce platform sentiment analysis |
Author(s): | Nguyễn Quốc Việt |
Advisor(s): | Đặng Ngọc Hoàng Thành |
Abstract: | While the Transformer [104] has gained popularity in the modern deep learning stack, it has high computational complexity due to the attention operation. Although language models that do not use an attention component have been proposed, whether they perform effectively in sentiment analysis tasks for e-commerce platform feedback remains understudied. Furthermore, while there have been many surveys and comparative studies on sentiment analysis methods, the majority focus on machine learning and traditional recurrent models such as RNN and LSTM without considering current attention-free language models. In this work, we evaluate the performance of such attention-free models, namely BiLSTM, TextCNN, gMLP, and Hyena. We collected a dataset of e-commerce platform feedback and release it under the name UEH-Ecom. We then implement the aforementioned models and provide a comprehensive analysis and comparison. Our findings show that Bidirectional LSTM, TextCNN, HyenaDNA, and gMLP achieve accuracy comparable to RoBERTa with significantly fewer parameters. In addition, among the attention-free models considered, Bidirectional LSTM obtained the highest accuracy, though only marginally higher than gMLP; gMLP achieved the highest F1 score within this attention-free model family. |
Issue Date: | 2024 |
Publisher: | University of Economics Ho Chi Minh City |
Series/Report no.: | Giải thưởng Nhà nghiên cứu trẻ UEH 2024 |
URI: | https://digital.lib.ueh.edu.vn/handle/UEH/71653 |
Appears in Collections: | Nhà nghiên cứu trẻ UEH |