Multi-Label Classification of Chinese Humor Texts Using Hypergraph Attention Networks

Hao Chuan Kao, Man Chen Hung, Lung Hao Lee, Yuen Hsien Tseng

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

We use Hypergraph Attention Networks (HyperGAT) to recognize multiple labels of Chinese humor texts. We first represent a joke as a hypergraph, using sequential and semantic hyperedge structures to construct the hyperedges. Attention mechanisms are then adopted to aggregate the context information embedded in nodes and hyperedges. Finally, we use the trained HyperGAT to complete the multi-label classification task. Experimental results on the Chinese humor multi-label dataset show that the HyperGAT model outperforms previous sequence-based (CNN, BiLSTM, FastText) and graph-based (Graph-CNN, TextGCN, Text Level GNN) deep learning models.
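To illustrate the first step the abstract describes (representing a joke as a hypergraph), here is a minimal sketch of building a node-hyperedge incidence matrix, where nodes are unique words and each sentence forms one sequential hyperedge. This is an illustration only, not the authors' implementation; the tokenization, the semantic hyperedges (e.g. words sharing a latent topic), and the attention layers of the actual HyperGAT model are not shown.

```python
def build_hypergraph(sentences):
    """Construct a node-hyperedge incidence matrix for a tokenized document.

    Nodes are the document's unique words; each sentence becomes one
    sequential hyperedge linking every word it contains. Semantic
    hyperedges could be appended as additional columns the same way.
    """
    vocab = sorted({w for sent in sentences for w in sent})
    index = {w: i for i, w in enumerate(vocab)}
    # incidence[i][j] == 1 when word i occurs in hyperedge (sentence) j
    incidence = [[0] * len(sentences) for _ in vocab]
    for j, sent in enumerate(sentences):
        for w in set(sent):
            incidence[index[w]][j] = 1
    return vocab, incidence

# Toy two-sentence "joke" (hypothetical example data)
joke = [["the", "cat", "sat"], ["the", "dog", "laughed"]]
vocab, H = build_hypergraph(joke)
# "the" appears in both sentences, so its row connects to both hyperedges
```

In the full model, an attention mechanism would replace the binary incidences with learned weights when aggregating node features into hyperedge representations and back.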

Original language: English
Title of host publication: ROCLING 2021 - Proceedings of the 33rd Conference on Computational Linguistics and Speech Processing
Editors: Lung-Hao Lee, Chia-Hui Chang, Kuan-Yu Chen
Publisher: The Association for Computational Linguistics and Chinese Language Processing (ACLCLP)
Pages: 257-264
Number of pages: 8
ISBN (Electronic): 9789869576949
Publication status: Published - 2021
Event: 33rd Conference on Computational Linguistics and Speech Processing, ROCLING 2021 - Taoyuan, Taiwan
Duration: 2021 Oct 15 - 2021 Oct 16

Publication series

Name: ROCLING 2021 - Proceedings of the 33rd Conference on Computational Linguistics and Speech Processing

Conference

Conference: 33rd Conference on Computational Linguistics and Speech Processing, ROCLING 2021
Country/Territory: Taiwan
City: Taoyuan
Period: 2021/10/15 - 2021/10/16

Keywords

  • Humor recognition
  • Hypergraph neural networks
  • Multi-label classification

ASJC Scopus subject areas

  • Language and Linguistics
  • Linguistics and Language
  • Speech and Hearing
