CobwebTM: Probabilistic Concept Formation for Lifelong and Hierarchical Topic Modeling

Karthik Singaravadivelan, Anant Gupta, Zekun Wang, Christopher J. MacLellan

Findings of the Association for Computational Linguistics

2026

Abstract

Topic modeling seeks to uncover latent semantic structure in text corpora with minimal supervision. Neural approaches achieve strong performance but require extensive tuning and struggle with lifelong learning due to catastrophic forgetting and fixed capacity, while classical probabilistic models lack flexibility and adaptability to streaming data. We introduce CobwebTM, a low-parameter lifelong hierarchical topic model based on incremental probabilistic concept formation. By adapting the Cobweb algorithm to continuous document embeddings, CobwebTM constructs semantic hierarchies online, enabling unsupervised topic discovery, dynamic topic creation, and hierarchical organization without predefining the number of topics. Across diverse datasets, CobwebTM achieves strong topic coherence, stable topics over time, and high-quality hierarchies, demonstrating that incremental symbolic concept formation combined with pretrained representations is an efficient approach to topic modeling.

Topics: Concept Learning, Topic Modeling
Projects: Cobweb

BibTeX

@inproceedings{singaravadivelan-acl-2026,
  title     = {CobwebTM: Probabilistic Concept Formation for Lifelong and Hierarchical Topic Modeling},
  author    = {Singaravadivelan, Karthik and Gupta, Anant and Wang, Zekun and MacLellan, Christopher J.},
  booktitle = {Findings of the Association for Computational Linguistics},
  year      = {2026},
}
