Publication | Closed Access
Peacock
Citations: 45
References: 45
Year: 2015
Keywords: Latent Dirichlet Allocation, Engineering, Semantic Web, Language Processing, Text Mining, Natural Language Processing, Popular Topic, Information Retrieval, Data Science, Big LDA Models, Computational Linguistics, News Recommendation, Search Technology, Knowledge Discovery, Computer Science, Search Engine Design, Vector Space Model, Topic Model, Arts
Latent Dirichlet allocation (LDA) is a popular topic modeling technique in academia but less so in industry, especially in large-scale applications involving search engine and online advertising systems. A main underlying reason is that the topic models used have been too small in scale to be useful; for example, some of the largest LDA models reported in the literature have up to 10^3 topics, which can hardly cover the long-tail semantic word sets. In this article, we show that the number of topics is a key factor that can significantly boost the utility of topic-modeling systems. In particular, we show that a "big" LDA model with at least 10^5 topics inferred from 10^9 search queries can achieve a significant improvement on industrial search engine and online advertising systems, both of which serve hundreds of millions of users. We develop a novel distributed system called Peacock to learn big LDA models from big data. The main features of Peacock include hierarchical distributed architecture, real-time prediction, and topic de-duplication. We empirically demonstrate that the Peacock system is capable of providing significant benefits via highly scalable LDA topic models for several industrial applications.
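For readers unfamiliar with LDA inference, the following is a minimal single-machine sketch of a collapsed Gibbs sampler for standard LDA. It is purely illustrative: it is not Peacock's hierarchical distributed implementation, and the function name, hyperparameter defaults, and toy corpus below are our own assumptions, not part of the paper.

```python
import random
from collections import defaultdict

def lda_gibbs(docs, num_topics, alpha=0.1, beta=0.01, iters=100, seed=0):
    """Collapsed Gibbs sampling for LDA over tokenized documents.

    Returns (doc_topic, topic_word): per-document topic counts and
    per-topic word counts, from which the usual theta/phi estimates
    can be derived. A minimal illustration, not a production sampler.
    """
    rng = random.Random(seed)
    vocab_size = len({w for doc in docs for w in doc})
    doc_topic = [[0] * num_topics for _ in docs]          # n(d, k)
    topic_word = [defaultdict(int) for _ in range(num_topics)]  # n(k, w)
    topic_total = [0] * num_topics                        # n(k)
    assignments = []                                      # z for each token

    # Random initialization of topic assignments
    for d, doc in enumerate(docs):
        zs = []
        for w in doc:
            k = rng.randrange(num_topics)
            zs.append(k)
            doc_topic[d][k] += 1
            topic_word[k][w] += 1
            topic_total[k] += 1
        assignments.append(zs)

    # Gibbs sweeps: resample each token's topic from its full conditional
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = assignments[d][i]
                doc_topic[d][k] -= 1
                topic_word[k][w] -= 1
                topic_total[k] -= 1
                # p(z = t | rest) ∝ (n(d,t)+alpha) * (n(t,w)+beta) / (n(t)+V*beta)
                weights = [
                    (doc_topic[d][t] + alpha)
                    * (topic_word[t][w] + beta)
                    / (topic_total[t] + vocab_size * beta)
                    for t in range(num_topics)
                ]
                r = rng.random() * sum(weights)
                k = num_topics - 1
                for t, wt in enumerate(weights):
                    r -= wt
                    if r <= 0:
                        k = t
                        break
                assignments[d][i] = k
                doc_topic[d][k] += 1
                topic_word[k][w] += 1
                topic_total[k] += 1
    return doc_topic, topic_word

# Toy corpus (hypothetical): two documents with disjoint vocabularies
docs = [["apple", "banana", "apple", "fruit"],
        ["car", "bus", "car", "road"]]
doc_topic, topic_word = lda_gibbs(docs, num_topics=2)
```

Peacock's contribution is precisely that this sequential sampler does not scale to 10^5 topics and 10^9 queries; the paper's hierarchical distributed architecture partitions the model and data so that this kind of inference can run across many machines.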