Since this prior structure can be implemented using efficient algorithms that add negligible cost beyond standard inference techniques, we recommend it as a new standard for topic modeling.
The prior is nothing more than an assumption about how topics are actually distributed in your data, and in most cases it makes sense to assume that some documents cover many topics while others cover only a few.
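A small sketch may make this concrete. Under a symmetric Dirichlet prior, every topic gets the same prior share of each document; under an asymmetric prior, a few "common" topics get more prior mass. The topic count and the specific alpha values below are illustrative assumptions, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
K = 10  # number of topics (illustrative)

# Symmetric prior: every topic is equally likely a priori.
alpha_sym = np.full(K, 0.1)

# Asymmetric prior (hypothetical values): a few topics receive more
# prior mass, so documents can favor them without the data having to
# overcome a uniform assumption.
alpha_asym = np.array([1.0, 0.5, 0.3] + [0.05] * (K - 3))

# Draw many document-topic distributions from each prior.
theta_sym = rng.dirichlet(alpha_sym, size=1000)
theta_asym = rng.dirichlet(alpha_asym, size=1000)

# The mean of a Dirichlet(alpha) is alpha / alpha.sum(): uniform for
# the symmetric prior, skewed toward the common topics otherwise.
print(theta_sym.mean(axis=0).round(2))
print(theta_asym.mean(axis=0).round(2))
```

The point of the asymmetric document-topic prior is exactly this skew: it lets the model expect a handful of broadly shared topics while still allowing rare ones.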
For the topic-word distribution, by contrast, they explain in the Discussion (Section 7) why it is a bad idea to assume an asymmetric prior.

What is the benefit of using asymmetric LDA prior?
Asked 3 years, 7 months ago. Active 2 years, 4 months ago. Viewed 1k times.