Abstract
Most nonparametric topic models, such as the Hierarchical Dirichlet Process (an infinite-dimensional extension of Latent Dirichlet Allocation), rely on the bag-of-words assumption. They therefore discard the ordering of words in the text, even though this ordering carries semantic information that a computational model can exploit. We present a new nonparametric topic model that not only maintains word order during topic discovery, but also generates topical n-grams, yielding more interpretable latent topics within the family of nonparametric topic models. Our experimental results show improved performance over current state-of-the-art topic models in document modeling and in generating n-gram words in topics.
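As a minimal illustration of the bag-of-words limitation described above (not the authors' model), two sentences with different meanings collapse to the same representation once word order is discarded, while their bigrams remain distinct:

```python
from collections import Counter

def bag_of_words(text):
    """Order-free multiset of tokens."""
    return Counter(text.lower().split())

def bigrams(text):
    """Ordered pairs of adjacent tokens, which preserve local word order."""
    tokens = text.lower().split()
    return list(zip(tokens, tokens[1:]))

s1 = "the dog bit the man"
s2 = "the man bit the dog"

# Identical bags of words: a bag-of-words model cannot tell these apart.
print(bag_of_words(s1) == bag_of_words(s2))  # True

# Different bigram sequences: an n-gram-aware model can distinguish them.
print(bigrams(s1) == bigrams(s2))  # False
```

This is only a toy sketch of why retaining n-gram structure matters; the paper's model handles this within a nonparametric Bayesian framework rather than by explicit bigram counting.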
The work described in this paper is substantially supported by grants from the Research Grants Council of the Hong Kong Special Administrative Region, China (Project Code: CUHK413510) and the Direct Grant of the Faculty of Engineering, CUHK (Project Code: 2050522). This work is also affiliated with the CUHK MoE-Microsoft Key Laboratory of Human-centric Computing and Interface Technologies.
© 2013 Springer-Verlag Berlin Heidelberg
Jameel, S., Lam, W. (2013). A Nonparametric N-Gram Topic Model with Interpretable Latent Topics. In: Banchs, R.E., Silvestri, F., Liu, TY., Zhang, M., Gao, S., Lang, J. (eds) Information Retrieval Technology. AIRS 2013. Lecture Notes in Computer Science, vol 8281. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-45068-6_7
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-45067-9
Online ISBN: 978-3-642-45068-6