Tree-Structured Topic Modeling with Nonparametric Neural Variational Inference

Ziye CHEN, Cheng DING, Zusheng ZHANG, Yanghui RAO, Haoran XIE

Research output: Papers in Conference Proceedings > Conference paper (refereed) > Research > peer-review


Topic modeling has been widely used for discovering the latent semantic structure of documents, but most existing methods learn topics with a flat structure. Although probabilistic models can generate topic hierarchies by introducing nonparametric priors such as the Chinese restaurant process, such methods have data scalability issues. In this study, we develop a tree-structured topic model by leveraging nonparametric neural variational inference. Specifically, the latent components of the stick-breaking process are first learned for each document, and then the affiliations of latent components are modeled by the dependency matrices between network layers. Utilizing this network structure, we can efficiently extract a tree-structured topic hierarchy with a reasonable structure, low redundancy, and adaptable widths. Experiments on real-world datasets validate the effectiveness of our method.
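As background to the abstract, the stick-breaking process it mentions constructs a countably infinite set of mixture weights by repeatedly breaking off Beta-distributed fractions of a unit-length stick. The sketch below is a minimal, truncated illustration of that construction in NumPy; it is not the authors' neural parameterization (the paper infers the fractions with a variational network), and the function name and concentration value are assumptions for the example.

```python
import numpy as np

def stick_breaking(alpha, num_sticks, seed=None):
    """Truncated stick-breaking draw of mixture weights.

    Each fraction v_k ~ Beta(1, alpha); the k-th weight is
    v_k * prod_{j<k} (1 - v_j), i.e. a fraction of the stick
    remaining after the first k-1 breaks. The truncated weights
    sum to less than 1; the remainder is the unbroken stick.
    """
    rng = np.random.default_rng(seed)
    v = rng.beta(1.0, alpha, size=num_sticks)
    # Length of stick remaining before each break: 1, (1-v_1), (1-v_1)(1-v_2), ...
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
    return v * remaining

weights = stick_breaking(alpha=3.0, num_sticks=10, seed=0)
print(weights)
```

A smaller concentration `alpha` puts more mass on the first few components, which is what lets the model adapt the effective number of topics (and hence the width of each tree layer) to the data.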
Original language: English
Title of host publication: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing
Number of pages: 11
Publication status: Published - Aug 2021

Bibliographical note

We are grateful to the reviewers for their constructive comments and suggestions on this study. This work has been supported by the National Natural Science Foundation of China (61972426) and Guangdong Basic and Applied Basic Research Foundation (2020A1515010536).

