Please use this identifier to cite or link to this item: https://digital.lib.ueh.edu.vn/handle/UEH/71653
Full metadata record

DC Field: Value (Language)
dc.contributor.advisor: Đặng Ngọc Hoàng Thành (en_US)
dc.contributor.author: Nguyễn Quốc Việt (en_US)
dc.contributor.other: Nguyễn Nhật Quang (en_US)
dc.contributor.other: Nguyễn King (en_US)
dc.date.accessioned: 2024-08-20T02:15:51Z
dc.date.available: 2024-08-20T02:15:51Z
dc.date.issued: 2024
dc.identifier.uri: https://digital.lib.ueh.edu.vn/handle/UEH/71653
dc.description.abstract: While the Transformer [104] has gained popularity in the modern deep learning stack, it has high computational complexity due to the attention operation. Although language models that do not use the attention component have been proposed, whether they perform effectively on sentiment analysis of e-commerce platform feedback remains understudied. Furthermore, while there have been many surveys and comparative studies on sentiment analysis methods, the majority focus on machine learning and traditional recurrent models such as RNNs and LSTMs, without taking current attention-free language models into consideration. In this work, we evaluate the performance of such attention-free models, namely BiLSTM, TextCNN, gMLP, and Hyena. We collected a dataset of e-commerce platform feedback and release it under the name UEH-Ecom. We then implement the aforementioned models and provide a comprehensive analysis and comparison. Our findings show that Bidirectional LSTM, TextCNN, HyenaDNA, and gMLP achieve accuracy comparable to RoBERTa with significantly fewer parameters. In addition, among the considered attention-free models, although Bidirectional LSTM obtained the highest accuracy, the difference from gMLP is tiny. Moreover, gMLP achieved the highest F1 score within the considered attention-free model family. (en_US)
dc.format.medium: 80 p. (en_US)
dc.language.iso: en (en_US)
dc.publisher: University of Economics Ho Chi Minh City (en_US)
dc.relation.ispartofseries: Giải thưởng Nhà nghiên cứu trẻ UEH 2024 (UEH Young Researcher Award 2024) (en_US)
dc.title: Towards the effectiveness of attention-free language models for e-commerce platform sentiment analysis (en_US)
dc.type: Research Paper (en_US)
ueh.speciality: Công nghệ thông tin (Information Technology) (en_US)
ueh.award: Giải A (Prize A) (en_US)
item.openairecristype: http://purl.org/coar/resource_type/c_18cf
item.grantfulltext: reserved
item.cerifentitytype: Publications
item.fulltext: Full texts
item.openairetype: Research Paper
item.languageiso639-1: en
Appears in Collections: Nhà nghiên cứu trẻ UEH
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.