EmoChannel-SA: exploring emotional dependency towards classification task with self-attention mechanism

Zongxi LI, Xinhong CHEN, Haoran XIE, Qing LI, Xiaohui TAO, Gary CHENG*

*Corresponding author for this work

Research output: Journal Publications › Journal Article (refereed) › peer-review

5 Citations (Scopus)

Abstract

Exploiting hand-crafted lexicon knowledge to enhance emotional or sentimental features at the word level has become a widely adopted method in emotion-relevant classification studies. However, few attempts have been made to explore emotion construction in the classification task, which provides insights into how a sentence’s emotion is constructed. The major challenge of exploring emotion construction is that current studies treat the dataset labels as relatively independent emotions, which overlooks the connections among different emotions. This work aims to understand coarse-grained emotion construction and the dependencies among emotions by incorporating fine-grained emotions from domain knowledge. Incorporating domain knowledge and dimensional sentiment lexicons, our previous work proposed a novel method named EmoChannel to capture the intensity variation of a particular emotion over time. We utilize the resultant knowledge of 151 available fine-grained emotions to compose a representation of sentence-level emotion construction. Furthermore, this work explicitly employs a self-attention module to extract the dependency relationships among all emotions and proposes the EmoChannel-SA Network to enhance emotion classification performance. We conducted experiments demonstrating that the proposed method produces competitive performance against state-of-the-art baselines on both multi-class emotion datasets and sentiment analysis datasets.
Original language: English
Article number: 6
Pages (from-to): 2049-2070
Number of pages: 22
Journal: World Wide Web
Volume: 24
Issue number: 6
Early online date: 6 Oct 2021
DOIs
Publication status: Published - Nov 2021

Bibliographical note

Publisher Copyright:
© 2021, The Author(s).

Funding

This journal paper is an extension of a conference paper, entitled “Emochannelattn: Exploring emotional construction towards multi-class emotion classification”, which was published in the 2020 IEEE/WIC/ACM International Joint Conference on Web Intelligence and Intelligent Agent Technology (WI-IAT) []. For concise expression and consistency with the original paper, we reuse part of the content from the conference version. We have significantly revised the conference version by proposing a novel model, adding extensive experiments, and discussing parameter issues. Compared to the conference version, at least 70% of the content in this extended journal version is new. Some notations, equations, algorithms, and so on are reused for smooth presentation. The research has been supported by the One-off Special Fund from Central and Faculty Fund in Support of Research from 2019/20 to 2021/22 (MIT02/19-20), the Research Cluster Fund (RG 78/2019-2020R), and the Interdisciplinary Research Scheme of the Dean’s Research Fund 2019-20 (FLASS/DRF/IDS-2) of The Education University of Hong Kong, the Hong Kong Research Grants Council under the Collaborative Research Fund (project number: C1031-18G), and the Direct Grant (DR21A5) and the Faculty Research Grant (DB21A9) of Lingnan University, Hong Kong.

Keywords

  • Emochannel
  • Emotion classification
  • Emotion lexicon
  • Sentiment analysis
