Please use this identifier to cite or link to this item:
https://digital.lib.ueh.edu.vn/handle/UEH/71653
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Đặng Ngọc Hoàng Thành | en_US |
dc.contributor.author | Nguyễn Quốc Việt | en_US |
dc.contributor.other | Nguyễn Nhật Quang | en_US |
dc.contributor.other | Nguyễn King | en_US |
dc.date.accessioned | 2024-08-20T02:15:51Z | - |
dc.date.available | 2024-08-20T02:15:51Z | - |
dc.date.issued | 2024 | - |
dc.identifier.uri | https://digital.lib.ueh.edu.vn/handle/UEH/71653 | - |
dc.description.abstract | While Transformers [104] have gained popularity in the modern deep learning stack, they incur high computational complexity due to the attention operation. Although language models without an attention component have been proposed, whether they perform effectively on sentiment analysis of e-commerce platform feedback remains understudied. Furthermore, while there have been many surveys and comparative studies on sentiment analysis methods, the majority focus on machine learning and traditional recurrent models such as RNNs and LSTMs without taking current attention-free language models into consideration. In this work, we evaluate the performance of such attention-free models, namely BiLSTM, TextCNN, gMLP, and Hyena. We collected a dataset of e-commerce platform feedback and release it under the name UEH-Ecom. We then implement the aforementioned models and provide a comprehensive analysis and comparison. Our findings show that Bidirectional LSTM, TextCNN, HyenaDNA, and gMLP achieve accuracy comparable to RoBERTa with significantly fewer parameters. In addition, among the considered attention-free models, although Bidirectional LSTM obtained the highest accuracy, its margin over gMLP is tiny; gMLP also achieved the highest F1 score within the considered attention-free model family. | en_US |
dc.format.medium | 80 p. | en_US |
dc.language.iso | en | en_US |
dc.publisher | University of Economics Ho Chi Minh City | en_US |
dc.relation.ispartofseries | Giải thưởng Nhà nghiên cứu trẻ UEH 2024 | en_US |
dc.title | Towards the effectiveness of attention-free language models for e- commerce platform sentiment analysis | en_US |
dc.type | Research Paper | en_US |
ueh.speciality | Information Technology (Công nghệ thông tin) | en_US |
ueh.award | Prize A (Giải A) | en_US |
item.openairecristype | http://purl.org/coar/resource_type/c_18cf | - |
item.grantfulltext | reserved | - |
item.cerifentitytype | Publications | - |
item.fulltext | Full texts | - |
item.openairetype | Research Paper | - |
item.languageiso639-1 | en | - |
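The abstract compares attention-free classifiers, with Bidirectional LSTM reaching the highest accuracy among them. As an illustration only (the thesis's actual architecture, hyperparameters, and training setup are not given in this record), a minimal BiLSTM sentiment classifier of the kind evaluated could be sketched in PyTorch as follows; all names and dimensions here are hypothetical:

```python
import torch
import torch.nn as nn

class BiLSTMClassifier(nn.Module):
    """Minimal attention-free text classifier: embedding -> BiLSTM -> linear head."""

    def __init__(self, vocab_size, embed_dim, hidden_dim, num_classes):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        # Forward and backward final states are concatenated: 2 * hidden_dim.
        self.fc = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids):
        embedded = self.embedding(token_ids)       # (batch, seq_len, embed_dim)
        _, (h_n, _) = self.lstm(embedded)          # h_n: (2, batch, hidden_dim)
        # Concatenate the last forward and backward hidden states.
        features = torch.cat([h_n[0], h_n[1]], dim=-1)  # (batch, 2 * hidden_dim)
        return self.fc(features)                   # (batch, num_classes) logits

# Toy forward pass: batch of 4 sequences of 12 token ids, 3 sentiment classes.
model = BiLSTMClassifier(vocab_size=1000, embed_dim=32,
                         hidden_dim=64, num_classes=3)
logits = model(torch.randint(0, 1000, (4, 12)))
print(logits.shape)  # torch.Size([4, 3])
```

Unlike attention, the LSTM's per-step recurrence scales linearly with sequence length, which is the complexity advantage the abstract alludes to.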
Appears in Collections: | Nhà nghiên cứu trẻ UEH |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.