Neural Mixed Counting Models for Dispersed Topic Discovery

Jiemin WU, Yanghui RAO*, Zusheng ZHANG, Haoran XIE, Qing LI, Fu Lee WANG, Ziye CHEN

*Corresponding author for this work

Research output: Book Chapters | Papers in Conference Proceedings › Conference paper (refereed) › Research › peer-review

15 Citations (Scopus)

Abstract

Mixed counting models that use the negative binomial distribution as the prior can effectively model over-dispersed and hierarchically dependent random variables; they have thus attracted much attention in mining dispersed document topics. However, existing parameter inference methods such as Monte Carlo sampling are quite time-consuming. In this paper, we propose two efficient neural mixed counting models, i.e., the Negative Binomial-Neural Topic Model (NB-NTM) and the Gamma Negative Binomial-Neural Topic Model (GNB-NTM), for dispersed topic discovery. Neural variational inference algorithms are developed to infer model parameters by using the reparameterization of the Gamma distribution and the Gaussian approximation of the Poisson distribution. Experiments on real-world datasets indicate that our models outperform state-of-the-art baseline models in terms of perplexity and topic coherence. The results also validate that both NB-NTM and GNB-NTM can produce explainable intermediate variables by generating dispersed proportions of document topics.
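The building block the abstract describes is a Gamma-mixed Poisson (i.e., negative binomial) latent variable made differentiable through Gamma reparameterization and a Gaussian approximation of the Poisson. The following is a minimal illustrative sketch, assuming PyTorch; the variable names and toy values are hypothetical, and the full NB-NTM/GNB-NTM encoder-decoder architectures and variational objectives from the paper are not reproduced here.

```python
import torch

def sample_gamma(shape, rate):
    # Reparameterized Gamma sample; PyTorch's rsample() provides
    # implicit reparameterization gradients for the Gamma distribution.
    return torch.distributions.Gamma(shape, rate).rsample()

def sample_poisson_gaussian(lam):
    # Gaussian approximation of the Poisson distribution:
    # Poisson(lam) ~= Normal(lam, sqrt(lam)) for sufficiently large lam,
    # which keeps the sample differentiable with respect to lam.
    eps = torch.randn_like(lam)
    return torch.clamp(lam + torch.sqrt(lam) * eps, min=0.0)

# Toy usage (hypothetical values): a Gamma-mixed Poisson draw, i.e. a
# relaxed negative binomial, with gradients flowing back to the shape.
shape = torch.tensor([2.0, 5.0], requires_grad=True)    # topic-level shape parameters
rate = torch.tensor([1.0, 1.0])
topic_intensity = sample_gamma(shape, rate)              # dispersed topic proportions
word_counts = sample_poisson_gaussian(topic_intensity)   # approximate Poisson counts
word_counts.sum().backward()                             # gradients reach `shape`
print(topic_intensity, word_counts, shape.grad)
```

In a neural topic model, the shape and rate would be produced by an inference network from the document's bag-of-words representation, and the resulting dispersed intensities decoded into word probabilities.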
Original language: English
Title of host publication: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Editors: Dan JURAFSKY, Joyce CHAI, Natalie SCHLUTER, Joel TETREAULT
Publisher: Association for Computational Linguistics (ACL)
Pages: 6159-6169
Number of pages: 11
ISBN (Electronic): 9781952148255
Publication status: Published - Jul 2020
Event: 58th Annual Meeting of the Association for Computational Linguistics, ACL 2020 - Virtual, Online, United States
Duration: 5 Jul 2020 - 10 Jul 2020

Publication series

Name: Proceedings of the Annual Meeting of the Association for Computational Linguistics
ISSN (Print): 0736-587X

Conference

Conference: 58th Annual Meeting of the Association for Computational Linguistics, ACL 2020
Country/Territory: United States
City: Virtual, Online
Period: 5/07/20 - 10/07/20

Bibliographical note

The first two authors contributed equally to this work, which was completed while Jiemin Wu was a final-year undergraduate student.
