Topic model
In statistics and natural language processing, a topic model is a type of statistical model for discovering the abstract "topics" that occur in a collection of documents. Topic modeling is a frequently used text-mining tool for discovery of hidden semantic structures in a text body. Intuitively, given that a document is about a particular topic, one would expect particular words to appear in the document more or less frequently: "dog" and "bone" will appear more often in documents about dogs, "cat" and "meow" will appear in documents about cats, and "the" and "is" will appear approximately equally in both. A document typically concerns multiple topics in different proportions; thus, in a document that is 10% about cats and 90% about dogs, there would probably be about 9 times more dog words than cat words. The "topics" produced by topic modeling techniques are clusters of similar words. A topic model captures this intuition in a mathematical framework, which allows examining a set of documents and discovering, based on the statistics of the words in each, what the topics might be and what each document's balance of topics is.
Topic models are also referred to as probabilistic topic models, which refers to statistical algorithms for discovering the latent semantic structures of an extensive text body. In the age of information, the amount of written material we encounter each day is simply beyond our processing capacity. Topic models can help to organize large collections of unstructured text and offer insights into them. Originally developed as a text-mining tool, topic models have been used to detect instructive structures in data such as genetic information, images, and networks. They also have applications in other fields such as bioinformatics[1] and computer vision.[2]
History
An early topic model was described by Papadimitriou, Raghavan, Tamaki and Vempala in 1998.[3] Another one, called probabilistic latent semantic analysis (PLSA), was created by Thomas Hofmann in 1999.[4] Latent Dirichlet allocation (LDA), perhaps the most common topic model currently in use, is a generalization of PLSA. Developed by David Blei, Andrew Ng, and Michael I. Jordan in 2002, LDA introduces sparse Dirichlet prior distributions over document-topic and topic-word distributions, encoding the intuition that documents cover a small number of topics and that topics often use a small number of words.[5] Other topic models are generally extensions of LDA, such as Pachinko allocation, which improves on LDA by modeling correlations between topics in addition to the word correlations which constitute topics. Hierarchical latent tree analysis (HLTA) is an alternative to LDA that models word co-occurrence using a tree of latent variables; the states of the latent variables, which correspond to soft clusters of documents, are interpreted as topics.
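The generative assumptions behind LDA can be made concrete with a short sketch. The example below, assuming the Gensim library (see the software entries under See also) and an invented toy corpus, fits an LDA model whose alpha and eta hyperparameters play the role of the Dirichlet priors over document-topic and topic-word distributions described above.

```python
# Minimal LDA sketch using Gensim; the toy corpus and hyperparameter
# values are illustrative assumptions, not a prescribed setup.
from gensim import corpora, models

documents = [
    ["dog", "bone", "bark", "dog"],
    ["cat", "meow", "purr", "cat"],
    ["dog", "cat", "pet", "bone"],
]

dictionary = corpora.Dictionary(documents)                # word <-> id mapping
corpus = [dictionary.doc2bow(doc) for doc in documents]   # bag-of-words counts

# alpha and eta are the Dirichlet priors over the document-topic and
# topic-word distributions; small values encourage sparse mixtures.
lda = models.LdaModel(
    corpus,
    id2word=dictionary,
    num_topics=2,
    alpha=0.1,
    eta=0.01,
    passes=10,
    random_state=0,
)

for topic_id, words in lda.print_topics(num_words=4):
    print(topic_id, words)

# Per-document topic mixture (each document's "balance of topics").
print(lda.get_document_topics(corpus[0]))
```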
Topic models for context information
Approaches for temporal information include Block and Newman's determination of the temporal dynamics of topics in the Pennsylvania Gazette during 1728–1800. Griffiths & Steyvers used topic modeling on abstracts from the journal PNAS to identify topics that rose or fell in popularity from 1991 to 2001, whereas Lamba & Madhusudhan[6] used topic modeling on full-text research articles retrieved from DJLIT journal from 1981 to 2018. In the field of library and information science, Lamba & Madhusudhan[6][7][8][9] applied topic modeling on different Indian resources like journal articles and electronic theses and dissertations (ETDs). Nelson[10] has been analyzing change in topics over time in the Richmond Times-Dispatch to understand social and political changes and continuities in Richmond during the American Civil War. Yang, Torget and Mihalcea applied topic modeling methods to newspapers from 1829 to 2008. Mimno used topic modeling with 24 journals on classical philology and archaeology spanning 150 years to look at how topics in the journals change over time and how the journals become more different or similar over time.
Yin et al.[11] introduced a topic model for geographically distributed documents, where document positions are explained by latent regions which are detected during inference.
Chang and Blei[12] included network information between linked documents in the relational topic model, to model the links between websites.
The author-topic model by Rosen-Zvi et al.[13] models the topics associated with authors of documents to improve the topic detection for documents with authorship information.
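As an illustration only (not the implementation of the cited paper), an author-topic model of this kind can be fitted with Gensim's AuthorTopicModel; the documents and the author-to-document mapping below are invented for the example.

```python
# Sketch of an author-topic model with Gensim; documents and the
# author2doc mapping are invented for illustration.
from gensim import corpora
from gensim.models import AuthorTopicModel

documents = [
    ["dog", "bone", "bark"],
    ["cat", "meow", "purr"],
    ["dog", "cat", "pet"],
]
# Map each author to the indices of the documents they wrote.
author2doc = {"alice": [0, 2], "bob": [1]}

dictionary = corpora.Dictionary(documents)
corpus = [dictionary.doc2bow(doc) for doc in documents]

model = AuthorTopicModel(
    corpus,
    author2doc=author2doc,
    id2word=dictionary,
    num_topics=2,
    random_state=0,
)

# Topic distribution associated with a given author.
print(model.get_author_topics("alice"))
```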
HLTA was applied to a collection of recent research papers published at major AI and Machine Learning venues. The resulting model is called The AI Tree. The resulting topics are used to index the papers at aipano.cse.ust.hk to help researchers track research trends and identify papers to read, and help conference organizers and journal editors identify reviewers for submissions.
To improve the qualitative aspects and coherency of generated topics, some researchers have explored the efficacy of "coherence scores", or otherwise how computer-extracted clusters (i.e. topics) align with a human benchmark.[14][15] Coherence scores are metrics for optimising the number of topics to extract from a document corpus.[16]
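As a sketch of how such a score might be used, the example below (assuming Gensim's CoherenceModel and a toy corpus) compares one coherence measure across models fitted with different numbers of topics.

```python
# Sketch: compare topic coherence across candidate numbers of topics;
# the toy corpus and the 'c_v' measure are illustrative assumptions.
from gensim import corpora
from gensim.models import CoherenceModel, LdaModel

documents = [
    ["dog", "bone", "bark", "dog"],
    ["cat", "meow", "purr", "cat"],
    ["dog", "cat", "pet", "bone"],
    ["cat", "purr", "pet", "meow"],
]
dictionary = corpora.Dictionary(documents)
corpus = [dictionary.doc2bow(doc) for doc in documents]

for k in (2, 3, 4):
    lda_k = LdaModel(corpus, id2word=dictionary, num_topics=k,
                     passes=10, random_state=0)
    cm = CoherenceModel(model=lda_k, texts=documents,
                        dictionary=dictionary, coherence="c_v")
    print(k, cm.get_coherence())  # higher usually indicates more coherent topics
```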
Algorithms
In practice, researchers attempt to fit appropriate model parameters to the data corpus using one of several heuristics for maximum likelihood fit. A survey by D. Blei describes this suite of algorithms.[17] Several groups of researchers starting with Papadimitriou et al.[3] have attempted to design algorithms with provable guarantees. Assuming that the data were actually generated by the model in question, they try to design algorithms that probably find the model that was used to create the data. Techniques used here include singular value decomposition (SVD) and the method of moments. In 2012 an algorithm based upon non-negative matrix factorization (NMF) was introduced that also generalizes to topic models with correlations among topics.[18]
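A non-negative matrix factorization of a document-term matrix can itself be read as a simple topic model: one factor holds document-topic weights and the other topic-word weights. The sketch below uses scikit-learn on an invented toy corpus and illustrates the general NMF approach rather than the provably correct algorithm of the cited work.[18]

```python
# Sketch: NMF as a topic model with scikit-learn; the corpus is a toy
# example and this is not the algorithm of the cited paper.
from sklearn.decomposition import NMF
from sklearn.feature_extraction.text import TfidfVectorizer

docs = [
    "the dog chased the bone",
    "the cat heard a meow",
    "dogs bury bones in the yard",
    "cats purr and meow at night",
]

vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(docs)       # documents x terms matrix

nmf = NMF(n_components=2, random_state=0)
W = nmf.fit_transform(X)                 # document-topic weights
H = nmf.components_                      # topic-term weights

terms = vectorizer.get_feature_names_out()
for k, row in enumerate(H):
    top = row.argsort()[::-1][:3]        # three highest-weight terms per topic
    print(f"topic {k}:", [terms[i] for i in top])
```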
In 2017, neural networks were leveraged in topic modeling to make inference faster,[19] an approach that has since been extended to a weakly supervised setting.[20]
In 2018, a new approach to topic models was proposed that is based on the stochastic block model.[21]
With the recent development of large language models (LLMs), topic modeling has leveraged LLMs through contextual embeddings[22] and fine-tuning.[23]
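As a rough sketch of the contextual-embedding route (not the specific methods of the cited papers), documents can be encoded with a pretrained sentence encoder and then clustered, with the resulting clusters read as topics; the encoder name and the use of k-means below are illustrative assumptions.

```python
# Sketch: embed-then-cluster topic discovery; the encoder name and the
# choice of k-means are illustrative assumptions, not the cited methods.
from sentence_transformers import SentenceTransformer
from sklearn.cluster import KMeans

docs = [
    "Dogs love chasing bones in the park.",
    "My cat purrs and meows every morning.",
    "Puppies bark at strangers.",
    "Kittens nap in sunny windows.",
]

encoder = SentenceTransformer("all-MiniLM-L6-v2")  # assumed model name
embeddings = encoder.encode(docs)                  # one vector per document

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(embeddings)
for label, doc in zip(kmeans.labels_, docs):
    print(label, doc)                              # cluster id as a crude topic
```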
Applications of topic models
In quantitative biomedicine
Topic models are also used in other contexts. For example, uses of topic models in biology and bioinformatics research have emerged.[24] Recently, topic models have been used to extract information from datasets of cancer genomic samples.[25] In this case, topics are biological latent variables to be inferred.
In analysis of music and creativity
Topic models can be used to analyze continuous signals like music. For instance, they were used to quantify how musical styles change over time and to identify the influence of specific artists on later music creation.[26]
See also
- Explicit semantic analysis
- Latent semantic analysis
- Latent Dirichlet allocation
- Hierarchical Dirichlet process
- Non-negative matrix factorization
- Statistical classification
- Unsupervised learning
- Mallet (software project)
- Gensim
- Sentence embedding
References
- ↑ Blei, David (April 2012). "Probabilistic Topic Models". Communications of the ACM 55 (4): 77–84. doi:10.1145/2133806.2133826.
- ↑ Cao, Liangliang, and Li Fei-Fei. "Spatially coherent latent topic model for concurrent segmentation and classification of objects and scenes." 2007 IEEE 11th International Conference on Computer Vision. IEEE, 2007.
- ↑ 3.0 3.1 Papadimitriou, Christos; Raghavan, Prabhakar; Tamaki, Hisao; Vempala, Santosh (1998). "Latent semantic indexing". Proceedings of the seventeenth ACM SIGACT-SIGMOD-SIGART symposium on Principles of database systems - PODS '98. pp. 159–168. doi:10.1145/275487.275505. ISBN 978-0897919968. http://www.cs.berkeley.edu/~christos/ir.ps. Retrieved 2012-04-17.
- ↑ Hofmann, Thomas (1999). "Probabilistic Latent Semantic Indexing". Proceedings of the Twenty-Second Annual International SIGIR Conference on Research and Development in Information Retrieval. http://www.cs.brown.edu/~th/papers/Hofmann-SIGIR99.pdf.
- ↑ Blei, David M.; Ng, Andrew Y.; Jordan, Michael I. (January 2003). "Latent Dirichlet allocation". Journal of Machine Learning Research 3: 993–1022. doi:10.1162/jmlr.2003.3.4-5.993. http://jmlr.csail.mit.edu/papers/v3/blei03a.html.
- ↑ 6.0 6.1 Lamba, Manika (June 2019). "Mapping of topics in DESIDOC Journal of Library and Information Technology, India: a study". Scientometrics 120 (2): 477–505. doi:10.1007/s11192-019-03137-5. ISSN 0138-9130.
- ↑ Lamba, Manika (June 2019). "Metadata Tagging and Prediction Modeling: Case Study of DESIDOC Journal of Library and Information Technology (2008-2017)". World Digital Libraries 12: 33–89. doi:10.18329/09757597/2019/12103. ISSN 0975-7597. https://content.iospress.com/articles/world-digital-libraries-an-international-journal/wdl12103.
- ↑ Lamba, Manika (May 2019). "Author-Topic Modeling of DESIDOC Journal of Library and Information Technology (2008-2017), India". Library Philosophy and Practice. https://www.proquest.com/openview/4416f54af3fe77e1c49c811af86990eb/1?pq-origsite=gscholar&cbl=54903.
- ↑ Lamba, Manika (September 2018). "Metadata Tagging of Library and Information Science Theses: Shodhganga (2013-2017)". ETD2018: Beyond the boundaries of Rims and Oceans. Taipei, Taiwan. https://etd2018.ncl.edu.tw/images/phocadownload/3-2_Manika_Lamba_Extended_Abstract_ETD_2018.pdf.
- ↑ Nelson, Rob. "Mining the Dispatch". Digital Scholarship Lab, University of Richmond. https://dsl.richmond.edu/dispatch/.
- ↑ Yin, Zhijun (2011). "Geographical topic discovery and comparison". Proceedings of the 20th international conference on World wide web. pp. 247–256. doi:10.1145/1963405.1963443. ISBN 9781450306324.
- ↑ Chang, Jonathan (2009). "Relational Topic Models for Document Networks". Aistats 9: 81–88. http://www.jmlr.org/proceedings/papers/v5/chang09a/chang09a.pdf.
- ↑ Rosen-Zvi, Michal (2004). "The author-topic model for authors and documents". Proceedings of the 20th Conference on Uncertainty in Artificial Intelligence: 487–494.
- ↑ Nikolenko, Sergey (2017). "Topic modelling for qualitative studies". Journal of Information Science 43: 88–102. doi:10.1177/0165551515617393.
- ↑ Reverter-Rambaldi, Marcel (2022). Topic Modelling in Spontaneous Speech Data (Honours thesis). Australian National University. doi:10.25911/M1YF-ZF55.
- ↑ Newman, David (2010). "Automatic evaluation of topic coherence". Human Language Technologies: The 2010 Annual Conference of the North American Chapter of the Association for Computational Linguistics: 100–108.
- ↑ Blei, David M. (April 2012). "Introduction to Probabilistic Topic Models" (PDF). Comm. ACM 55 (4): 77–84. doi:10.1145/2133806.2133826. https://cacm.acm.org/magazines/2012/4/147361-probabilistic-topic-models/fulltext.
- ↑ Sanjeev Arora; Rong Ge; Ankur Moitra (April 2012). "Learning Topic Models—Going beyond SVD". arXiv:1204.1956 [cs.LG].
- ↑ Miao, Yishu; Grefenstette, Edward; Blunsom, Phil (2017). "Discovering Discrete Latent Topics with Neural Variational Inference" (in en). Proceedings of the 34th International Conference on Machine Learning (PMLR): 2410–2419. https://proceedings.mlr.press/v70/miao17a.html.
- ↑ Xu, Weijie; Jiang, Xiaoyu; Sengamedu Hanumantha Rao, Srinivasan; Iannacci, Francis; Zhao, Jinjin (2023). "vONTSS: vMF based semi-supervised neural topic modeling with optimal transport". Findings of the Association for Computational Linguistics: ACL 2023 (Stroudsburg, PA, USA: Association for Computational Linguistics): 4433–4457. doi:10.18653/v1/2023.findings-acl.271. http://dx.doi.org/10.18653/v1/2023.findings-acl.271.
- ↑ Martin Gerlach; Tiago Peixoto; Eduardo Altmann (2018). "A network approach to topic models". Science Advances 4 (7): eaaq1360. doi:10.1126/sciadv.aaq1360. PMID 30035215. Bibcode: 2018SciA....4.1360G.
- ↑ Bianchi, Federico; Terragni, Silvia; Hovy, Dirk (2021). "Pre-training is a Hot Topic: Contextualized Document Embeddings Improve Topic Coherence". Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers). Stroudsburg, PA, USA: Association for Computational Linguistics. pp. 759–766. doi:10.18653/v1/2021.acl-short.96. http://dx.doi.org/10.18653/v1/2021.acl-short.96.
- ↑ Xu, Weijie; Hu, Wenxiang; Wu, Fanyou; Sengamedu, Srinivasan (2023). "DeTiME: Diffusion-Enhanced Topic Modeling using Encoder-decoder based LLM". Findings of the Association for Computational Linguistics: EMNLP 2023 (Stroudsburg, PA, USA: Association for Computational Linguistics): 9040–9057. doi:10.18653/v1/2023.findings-emnlp.606. http://dx.doi.org/10.18653/v1/2023.findings-emnlp.606.
- ↑ Liu, L. et al. (2016). "An overview of topic modeling and its current applications in bioinformatics". SpringerPlus 5 (1): 1608. doi:10.1186/s40064-016-3252-8. PMID 27652181.
- ↑ Valle, F.; Osella, M.; Caselle, M. (2020). "A Topic Modeling Analysis of TCGA Breast and Lung Cancer Transcriptomic Data". Cancers 12 (12): 3799. doi:10.3390/cancers12123799. PMID 33339347.
- ↑ Shalit, Uri; Weinshall, Daphna; Chechik, Gal (2013-05-13). "Modeling Musical Influence with Topic Models" (in en). Proceedings of the 30th International Conference on Machine Learning (PMLR): 244–252. https://proceedings.mlr.press/v28/shalit13.html.
Further reading
- Steyvers, Mark; Griffiths, Tom (2007). "Probabilistic Topic Models" (PDF). In Landauer, T.; McNamara, D.; Dennis, S. et al. Handbook of Latent Semantic Analysis. Psychology Press. ISBN 978-0-8058-5418-3. http://psiexp.ss.uci.edu/research/papers/SteyversGriffithsLSABookFormatted.pdf.
- Blei, D.M.; Lafferty, J.D. (2009). "Topic Models". https://www.cs.columbia.edu/~blei/papers/BleiLafferty2009.pdf.
- Blei, D.; Lafferty, J. (2007). "A correlated topic model of Science". Annals of Applied Statistics 1 (1): 17–35. doi:10.1214/07-AOAS114.
- Mimno, D. (April 2012). "Computational Historiography: Data Mining in a Century of Classics Journals". Journal on Computing and Cultural Heritage 5 (1): 1–19. doi:10.1145/2160165.2160168. https://www.perseus.tufts.edu/~amahoney/02-jocch-mimno.pdf.
- Marwick, Ben (2013). "Discovery of Emergent Issues and Controversies in Anthropology Using Text Mining, Topic Modeling, and Social Network Analysis of Microblog Content". in Yanchang, Zhao; Yonghua, Cen. Data Mining Applications with R. Elsevier. pp. 63–93. https://www.academia.edu/5508141.
- Jockers, M. 2010 Who's your DH Blog Mate: Match-Making the Day of DH Bloggers with Topic Modeling Matthew L. Jockers, posted 19 March 2010
- Drouin, J. 2011 Foray Into Topic Modeling Ecclesiastical Proust Archive. posted 17 March 2011
- Templeton, C. 2011 Topic Modeling in the Humanities: An Overview Maryland Institute for Technology in the Humanities Blog. posted 1 August 2011
- Griffiths, T.; Steyvers, M. (2004). "Finding scientific topics". Proceedings of the National Academy of Sciences 101 (Suppl 1): 5228–35. doi:10.1073/pnas.0307752101. PMID 14872004. Bibcode: 2004PNAS..101.5228G.
- Yang, T., A Torget and R. Mihalcea (2011) Topic Modeling on Historical Newspapers. Proceedings of the 5th ACL-HLT Workshop on Language Technology for Cultural Heritage, Social Sciences, and Humanities. The Association for Computational Linguistics, Madison, WI. pages 96–104.
- Block, S. (January 2006). "Doing More with Digitization: An introduction to topic modeling of early American sources". Common-place the Interactive Journal of Early American Life 6 (2). http://www.common-place.org/vol-06/no-02/tales/.
- Newman, D.; Block, S. (March 2006). "Probabilistic Topic Decomposition of an Eighteenth-Century Newspaper". Journal of the American Society for Information Science and Technology 57 (5): 753–767. doi:10.1002/asi.20342. http://www.ics.uci.edu/~newman/pubs/JASIST_Newman.pdf.
External links
- Mimno, David. "Topic modeling bibliography". http://mimno.infosci.cornell.edu/topics.html.
- Brett, Megan R.. "Topic Modeling: A Basic Introduction". Journal of Digital Humanities. http://journalofdigitalhumanities.org/2-1/topic-modeling-a-basic-introduction-by-megan-r-brett/.
- Topic Models Applied to Online News and Reviews Video of a Google Tech Talk presentation by Alice Oh on topic modeling with LDA
- Modeling Science: Dynamic Topic Models of Scholarly Research Video of a Google Tech Talk presentation by David M. Blei
- Automated Topic Models in Political Science Video of a presentation by Brandon Stewart at the Tools for Text Workshop, 14 June 2010
- Shawn Graham, Ian Milligan, and Scott Weingart "Getting Started with Topic Modeling and MALLET". The Programming Historian. http://programminghistorian.org/lessons/topic-modeling-and-mallet/.
- Blei, David M. "Introductory material and software"
- code, demo - example of using LDA for topic modelling