Targeted sentiment classification with multi-attention network

Publisher
Inderscience Publishers
Copyright
Copyright © Inderscience Enterprises Ltd
ISSN
1741-1084
eISSN
1741-1092
DOI
10.1504/ijwmc.2022.127585
Abstract

Targeted sentiment classification aims at recognising the sentiment polarity of specific targets. However, existing methods depend mainly on a crude attention mechanism and neglect the mutual effects between target and context. To address this problem, this paper introduces a Multi-Attention Network (MAN) for aspect-level sentiment classification. We jointly model intra-level and inter-level attention components to capture the interaction between target and context. The former attention mechanism attends to relations within the context, whereas the latter identifies the important parts of a sentence. Experiments conducted on the laptop, restaurant and Twitter datasets indicate that our model surpasses the baseline models.
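The abstract's core idea, using a target representation to weight the context words, can be illustrated with a minimal sketch. This is not the paper's MAN architecture (whose details are not given here); it is a generic dot-product target-context attention step, with all function names, dimensions and pooling choices being illustrative assumptions:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def target_context_attention(context, target):
    """Score each context word against a pooled target embedding,
    then pool the context into a single feature vector.

    context: (n_words, d) embeddings of the sentence
    target:  (n_target_words, d) embeddings of the target phrase
    """
    query = target.mean(axis=0)        # summarise the target phrase
    scores = context @ query           # dot-product relevance per word
    weights = softmax(scores)          # attention distribution over words
    return weights @ context, weights  # weighted context summary

rng = np.random.default_rng(0)
ctx = rng.normal(size=(6, 8))  # 6 context words, 8-dim embeddings
tgt = rng.normal(size=(2, 8))  # 2-word target phrase
feature, weights = target_context_attention(ctx, tgt)
```

The resulting `feature` vector would typically feed a classifier over sentiment polarities; stacking a second attention pass in the opposite direction (context attending to target words) is one plausible way to model the "mutual effects" the abstract refers to.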

Journal

International Journal of Wireless and Mobile Computing, Inderscience Publishers

Published: Jan 1, 2022
