Please use this identifier to cite or link to this item:
https://elib.bsu.by/handle/123456789/337597

| Title: | Research on sentiment analysis of hotel review text based on BERT-TCN-BiLSTM-attention model |
| Authors: | Chi, D.; Huang, T.; Jia, Z.; Zhang, S. |
| Keywords: | BSU Digital Library::SOCIAL SCIENCES::Linguistics |
| Issue Date: | 2025 |
| Publisher: | Elsevier |
| Citation: | Array. 2025; 25: 100378 |
| Abstract: | To address the high semantic flexibility of Chinese text, the difficulty of word segmentation, and the problem of polysemy (one word with multiple meanings), a sentiment analysis model is proposed that combines BERT dynamic semantic encoding with a temporal convolutional network (TCN), a bidirectional long short-term memory network (BiLSTM), and a self-attention mechanism. The model uses BERT pre-training to generate word vectors as input, applies the causal and dilated convolution structures of the TCN to extract higher-level sequential features, passes these to the BiLSTM layer to fully capture contextual sentiment features, and finally uses self-attention to weight the importance of sentiment features within sentences, thereby improving the accuracy of sentiment classification. The proposed model demonstrates superior performance across multiple datasets, achieving accuracy rates of 89.4% and 91.2% on the hotel review datasets C1 and C2, with corresponding F1 scores of 0.898 and 0.904. These results, which surpass those of the comparative models, validate the model's effectiveness across different datasets and highlight its robustness and generalizability in sentiment analysis. They also show that BERT-based encoding improves the model's performance more than Word2Vec embeddings do. |
| URI: | https://elib.bsu.by/handle/123456789/337597 |
| DOI: | 10.1016/j.array.2025.100378 |
| Scopus: | 85218266230 |
| Licence: | info:eu-repo/semantics/openAccess |
| Appears in Collections: | Department of Oriental Studies (publications issued outside the Faculty of International Relations) |
Files in This Item:
| File | Description | Size | Format | |
|---|---|---|---|---|
| 1-s2.0-S2590005625000050-main.pdf |  | 5.35 MB | Adobe PDF | View/Open |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
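The abstract's TCN component relies on causal and dilated convolutions, in which the output at time step t depends only on inputs at t, t−d, t−2d, and so on, where d is the dilation factor. A minimal pure-Python sketch of a single causal dilated 1-D convolution (illustrative only, not the authors' implementation; the function name and signature are hypothetical) is:

```python
def causal_dilated_conv1d(x, w, dilation):
    """Causal dilated 1-D convolution over sequence x with kernel w.

    Output y[t] = sum_j w[j] * x[t - j*dilation]; positions before the
    start of the sequence are treated as zero (implicit left padding),
    so no future time step ever influences the output at t.
    """
    y = []
    for t in range(len(x)):
        acc = 0.0
        for j, wj in enumerate(w):
            i = t - j * dilation  # reach back j*dilation steps
            if i >= 0:            # skip positions before the sequence start
                acc += wj * x[i]
        y.append(acc)
    return y

# With kernel [1, 1], the output is x[t] + x[t - dilation]:
print(causal_dilated_conv1d([1.0, 2.0, 3.0, 4.0], [1.0, 1.0], 1))
print(causal_dilated_conv1d([1.0, 2.0, 3.0, 4.0], [1.0, 1.0], 2))
```

Stacking such layers with exponentially increasing dilation (1, 2, 4, ...) is what lets a TCN cover a long receptive field with few layers, which is the property the abstract leverages to obtain higher-level sequential features before the BiLSTM stage.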

