Publication | Closed Access
Convolutional multi-head self-attention on memory for aspect sentiment classification
Citations: 83
References: 18
Year: 2020
Topics: Engineering; Memory Network; Multimodal Sentiment Analysis; Recurrent Neural Network; Sentiment Analysis; Social Sciences; Text Mining; Word Embeddings; Natural Language Processing; Data Science; Self-supervised Learning; Computational Linguistics; Affective Computing; Memory; Machine Translation; Cognitive Science; Sequence Modelling; NLP Task; Deep Learning; Multi-head Self-attention; Convolutional Multi-head Self-attention; Recurrent Unit; Linguistics; POS Tagging
This paper presents a method for aspect-based sentiment classification, named the convolutional multi-head self-attention memory network (CMA-MemNet). It is an improved model based on memory networks that can extract richer, more complex semantic information from sequences and aspects. To address the memory network's inability to capture context-related information at the word level, we propose using convolution to capture n-gram grammatical information. We use multi-head self-attention to compensate for the memory network's neglect of the semantic information of the sequence itself. Meanwhile, unlike most recurrent neural network (RNN) variants such as long short-term memory (LSTM) and the gated recurrent unit (GRU), our network remains parallelizable. We experiment on the open datasets SemEval-2014 Task 4 and SemEval-2016 Task 6, where our model compares favorably with several popular baseline methods.
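The two building blocks the abstract names — convolution over word embeddings to capture n-gram features, followed by multi-head self-attention over the result — can be sketched as below. This is a minimal NumPy illustration under simplifying assumptions (ReLU activation, identity query/key/value projections, single example, no aspect memory), not the paper's actual implementation; the function names are hypothetical.

```python
import numpy as np

def conv1d_ngram(x, W, b):
    """Same-padded 1-D convolution over a sequence of embeddings.

    x: (seq_len, d_in) word embeddings
    W: (k, d_in, d_out) filter bank; k is the n-gram window size
    b: (d_out,) bias
    Returns (seq_len, d_out) n-gram feature vectors (ReLU-activated).
    """
    k = W.shape[0]
    pad = k // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))  # zero-pad so output keeps seq_len
    out = np.stack([
        np.tensordot(xp[i:i + k], W, axes=([0, 1], [0, 1])) + b
        for i in range(x.shape[0])
    ])
    return np.maximum(out, 0.0)

def multi_head_self_attention(x, num_heads):
    """Scaled dot-product self-attention with num_heads parallel heads.

    Simplification: Q, K, V are slices of x itself (identity projections);
    a real model would learn separate projection matrices per head.
    """
    seq_len, d_model = x.shape
    d_head = d_model // num_heads
    heads = []
    for h in range(num_heads):
        q = k = v = x[:, h * d_head:(h + 1) * d_head]
        scores = q @ k.T / np.sqrt(d_head)          # (seq_len, seq_len)
        scores -= scores.max(axis=-1, keepdims=True)  # numerically stable softmax
        weights = np.exp(scores)
        weights /= weights.sum(axis=-1, keepdims=True)
        heads.append(weights @ v)
    return np.concatenate(heads, axis=-1)           # (seq_len, d_model)

# Toy usage: 5 words, 8-dim embeddings, trigram filters, 2 attention heads.
rng = np.random.default_rng(0)
emb = rng.normal(size=(5, 8))
W = rng.normal(size=(3, 8, 8)) * 0.1
ngram_feats = conv1d_ngram(emb, W, np.zeros(8))
attended = multi_head_self_attention(ngram_feats, num_heads=2)
```

Because every position's convolution window and every attention head is independent, both steps are fully parallel across the sequence — the property the abstract contrasts with sequential RNN/LSTM/GRU models.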